# Compare commits

382 commits — base `3.8.1` ... compare `json-ocsf-`
Commits (SHA1):
9c9d270053, f7fab165ba, 93bdf43c95, b3866b5b71, 2308084dee, 6eb5496c27, c5514fdb63, c78c3058fd, 10d9ef9906, 43426041ef,
125eb9ac53, 681407e0a2, 082f3a8fe8, 397cc26b2a, 331ae92843, 06843cd41a, 28b5ef9ee9, 63dcc057d3, 0bc16ee5ff, abcc9c2c80,
daf2ad38bd, 3dc418df39, 00aaafbc12, bd49a55f3d, 013975b7a6, 392026286a, 29ef974565, 06c8216092, 03f04d24a5, 7b45ed63cc,
6e4dd1d69c, 185b4cba0c, 8198ea4a2c, aaf3e8a5cf, ecef56fa8f, 349ce3f2d0, e3d4741213, 9d6d5f1d76, 3152d67f58, cb41c8d15b,
06590842d6, d4c22a0ca5, c6f9936292, eaa8900758, e1e95d8879, ef3a0f4878, 64cc36e7e2, 1e001bb0fd, 6ba123a003, 36d0f2c23f,
63412e3645, 191cf276c3, 45978bd0bb, 9666652d18, ad2716d7c9, 0a7939bea3, b8c50a7b45, 175e8d2b05, 046069a656, f9522da48f,
c03f959005, 522aeebe5e, 5312f487f9, d9b6624d65, 1506da54fc, 245512d320, 487190b379, 74aaeaa95c, 28e8f0de2b, f60b5017e2,
fe80821596, 628a3c4e7b, 3d59c34ec9, 35043c2dd6, ab815123c9, 69ab84efe1, 77823afa54, 63cd6c1290, cab32d2f94, 1f4316e9dd,
ade762a85e, bda5d62c72, 2176fff8c3, 87893bd54b, b539a888b1, d6b2b0ca13, 58ee45b702, c62d97f23a, d618c5ea12, d8e27f0d33,
38496ff646, da1084907e, 3385b630e7, fc59183045, 33242079f7, 086148819c, 5df9fd881c, bd17d36e7f, be55fa22fd, b48b3a5e2e,
fc03dd37f1, d8bb384689, 0b32a10bb8, f0c027f54e, b0f2f34d3b, 3e6b76df76, 6197cf792d, 3c4e5a14f7, effc743b6e, 364a945d28,
07b9354d18, 8b1e537ca5, 6a20e850bc, 636892bc9a, b40f32ab57, 14bab496b5, 3cc367e0a3, 36fc575e40, 24efb34d91, c08e244c95,
c2f8980f1f, 0ef85b3dee, 93a2431211, 1fe74937c1, 6ee016e577, f7248dfb1c, 856afb3966, bf315261af, 6e83afb580, 1a5742d4f5,
0e22458e86, cd8d1b8a8f, 141a142742, a59b344d20, f666711a2a, 1014d64828, a126a99853, 082390a7f0, a994553c16, 3fd2ae954d,
e17c5642ca, fa7968cb1b, 57c3183b15, 1fd6471cb1, 1827230514, 06dc3d3361, a7a2e24d42, bb543cb5db, 373ce0ad04, fcb979aae1,
fcc56ad6f7, 5be8570c8c, d471442422, 4070c923fc, 3ca38fe92d, 55ebadfe28, 9bd2519c83, 4bfe145be3, 41085049e2, f7312db0c7,
008534d839, 8533714cb2, b822c19d2c, 2aa3126eb0, 4c5e85f7ba, 2b41da4543, f8dc88df6e, 534033874e, 0851b923fd, fd4bed65a0,
4746b8b835, d24eafe6a6, f3b81edf67, 976d0da26e, 5113b83bc4, a88877bf7c, a46d7b2ed9, 170241649d, 1ac22bddd6, 54fe10ae86,
33647786e6, eb3cb97115, 236f57ab0e, c88054107e, c03c7c35d8, b5455215a5, 85e12e9479, f3b7f841fb, 92547bfdb6, 3739801ed4,
a6778a6e27, f1fc3c63ea, b2a80775a8, 1f7f68f6af, 388678f822, 1230a3323d, 02a3c750f8, cbdb9ce614, be98ea52d7, b6cf63bb0c,
04410033e7, e6c6df1334, 91b06a4297, 640ad7bd60, 08b2ea01ab, 236dea9d26, f281f3791b, aff2b80d55, e69949c336, 5f7f36ecd4,
9212478148, dec0ee1001, e610c2514d, 3955450245, 49a437dc0d, bf37be5013, 9793de1e96, 4c15318f28, a4d3e78eb1, 436166c255,
bbce2c5e35, 0745a57f52, 9974c84440, 3c396e76f6, e701aca64b, 26ad482b90, d8fd3ef506, 43016d75e8, 39b6ce3352, 1e3ec10a1a,
c4e13eef3f, 6558aedee3, a2dfb60466, c158dcf2ef, 40318b87bf, 64f06b11b8, 583194085c, 2d89f57644, f4ed01444a, a7980a202d,
3a6c93dd37, 6cd272da37, a7056b66c7, 4d6d58ef91, 93a88ec2c7, b679df4fbe, ba2c7347f9, f8b4e6e8f0, 7ecb4d7b00, 1697e6ad62,
6687f76736, 35e5bbdaf1, 5c5e7d9509, b0c0a9d98c, 7c246f7be4, bfc2a41699, 081a7ead4c, 70fbf1676a, 87ddb6b171, c0d45d730f,
6b97a04643, 2a5a07bae0, 18e34c670e, d6a35485d2, 6204f6cdc8, 50bc5309f5, 725e2e92ab, 0b07326e36, e86d194f11, 6949656d0e,
a2c62bab47, 3dd8aeac7c, 2c342a5c5f, adef1afdfa, a980b2606b, ed83927486, e745885b09, 16ddbfde9f, bc11537350, ab4de79168,
8134897e91, 693d22ed25, b1dab2466f, d2b09f39e7, 4475801a96, 126ff8cf0d, a536a785de, ed89ef74eb, f1bea27e44, 7305e53439,
b08c0e8150, 8606a4579a, 1dfb72a1d1, f09b55b893, 30ba6029f5, 9f0c830511, 973e3138fe, c996a562e6, f2bba4d1ee, 8017a95413,
26d209daff, 44b979b4a4, 03ad61abc6, fe425f89a4, 11ad66fb79, ca5734a2c6, e5414e87c7, 8142f8f62f, 74cf4076fa, dbd29c0ce1,
38a7dc1a93, 2891bc0b96, 8846ae6664, 2e3c3a55aa, 7e44116d51, 46f85e6395, 94a384fd81, af6acefb53, 94fd7d252f, 4767e38f5b,
276f6f9fb1, 2386c71c4f, 21c52db66b, 13cfa02f80, eedfbe3e7a, fe03eb4436, d8e45d5c3f, 12e9fb5eeb, 957ffaabae, cb76e5a23c,
b17cc563ff, 06a0b12efb, d5bd5ebb7d, 0a9a1c26db, 83bfd8a2d4, e5d2c0c700, 590a5669d6, e042740f67, dab2ecaa6b, f9f4133b48,
33dd21897d, cb2ef23a29, e70e01196f, f70b9e6eb4, d186c69473, 4d817c48a8, c13cab792b, 80aa463aa2, bd28b17ad9, 223119e303,
7c45cb45ae, ac11c6729b, 1677654dea, bc5a7a961b, c10462223d, 54a9f412e8, 5a107c58bb, 8f091e7548, 8cdc7b18c7, 9f2e87e9fb,
e119458048, c2983faf1d, a09855207e, 1e1859ba6f, a3937e48a8, d2aa53a2ec, b0bdeea60f, 465e64b9ac, fc53b28997, 72e701a4b5,
2298d5356d, 54137be92b, 7ffb12268d, 790fff460a, 9055dbafe3, 4454d9115e, 0d74dec446, 0313dba7b4, 3fafac75ef, 6b24b46f3d,
474e39a4c9, e652298b6a
**.github/ISSUE_TEMPLATE/feature-request.yml** (2 changes, vendored)

```diff
@@ -1,6 +1,6 @@
 name: 💡 Feature Request
 description: Suggest an idea for this project
-labels: ["enhancement", "status/needs-triage"]
+labels: ["feature-request", "status/needs-triage"]

 body:
```
**.github/dependabot.yml** (28 changes, vendored)

```diff
@@ -5,11 +5,35 @@
 version: 2
 updates:
-  - package-ecosystem: "pip" # See documentation for possible values
-    directory: "/" # Location of package manifests
+  - package-ecosystem: "pip"
+    directory: "/"
     schedule:
       interval: "weekly"
+    target-branch: master
+    labels:
+      - "dependencies"
+      - "pip"
   - package-ecosystem: "github-actions"
     directory: "/"
     schedule:
       interval: "weekly"
+    target-branch: master
+
+  - package-ecosystem: "pip"
+    directory: "/"
+    schedule:
+      interval: "weekly"
+    target-branch: v3
+    labels:
+      - "dependencies"
+      - "pip"
+      - "v3"
+  - package-ecosystem: "github-actions"
+    directory: "/"
+    schedule:
+      interval: "weekly"
+    target-branch: v3
+    labels:
+      - "github_actions"
+      - "v3"
```
**.github/labeler.yml** (27 changes, vendored, new file)

```diff
@@ -0,0 +1,27 @@
+documentation:
+  - changed-files:
+      - any-glob-to-any-file: "docs/**"
+
+provider/aws:
+  - changed-files:
+      - any-glob-to-any-file: "prowler/providers/aws/**"
+      - any-glob-to-any-file: "tests/providers/aws/**"
+
+provider/azure:
+  - changed-files:
+      - any-glob-to-any-file: "prowler/providers/azure/**"
+      - any-glob-to-any-file: "tests/providers/azure/**"
+
+provider/gcp:
+  - changed-files:
+      - any-glob-to-any-file: "prowler/providers/gcp/**"
+      - any-glob-to-any-file: "tests/providers/gcp/**"
+
+provider/kubernetes:
+  - changed-files:
+      - any-glob-to-any-file: "prowler/providers/kubernetes/**"
+      - any-glob-to-any-file: "tests/providers/kubernetes/**"
+
+github_actions:
+  - changed-files:
+      - any-glob-to-any-file: ".github/workflows/*"
```
**.github/workflows/build-documentation-on-pr.yml** (24 changes, vendored, new file)

```diff
@@ -0,0 +1,24 @@
+name: Pull Request Documentation Link
+
+on:
+  pull_request:
+    branches:
+      - 'master'
+      - 'v3'
+    paths:
+      - 'docs/**'
+
+env:
+  PR_NUMBER: ${{ github.event.pull_request.number }}
+
+jobs:
+  documentation-link:
+    name: Documentation Link
+    runs-on: ubuntu-latest
+    steps:
+      - name: Leave PR comment with the SaaS Documentation URI
+        uses: peter-evans/create-or-update-comment@v4
+        with:
+          issue-number: ${{ env.PR_NUMBER }}
+          body: |
+            You can check the documentation for this PR here -> [SaaS Documentation](https://prowler-prowler-docs--${{ env.PR_NUMBER }}.com.readthedocs.build/projects/prowler-open-source/en/${{ env.PR_NUMBER }}/)
```
**.github/workflows/build-lint-push-containers.yml** (107 changes, vendored)

```diff
@@ -3,6 +3,7 @@ name: build-lint-push-containers
 on:
   push:
     branches:
+      - "v3"
      - "master"
    paths-ignore:
      - ".github/**"
@@ -13,52 +14,98 @@ on:
     types: [published]

 env:
+  # AWS Configuration
   AWS_REGION_STG: eu-west-1
   AWS_REGION_PLATFORM: eu-west-1
   AWS_REGION: us-east-1
+
+  # Container's configuration
   IMAGE_NAME: prowler
+  DOCKERFILE_PATH: ./Dockerfile
+
+  # Tags
   LATEST_TAG: latest
   STABLE_TAG: stable
-  TEMPORARY_TAG: temporary
-  DOCKERFILE_PATH: ./Dockerfile
-  PYTHON_VERSION: 3.9
+  # The RELEASE_TAG is set during runtime in releases
+  RELEASE_TAG: ""
+  # The PROWLER_VERSION and PROWLER_VERSION_MAJOR are set during runtime in releases
+  PROWLER_VERSION: ""
+  PROWLER_VERSION_MAJOR: ""
+  # TEMPORARY_TAG: temporary
+
+  # Python configuration
+  PYTHON_VERSION: 3.12

 jobs:
+  # Build Prowler OSS container
   container-build-push:
+    # needs: dockerfile-linter
     runs-on: ubuntu-latest
+    outputs:
+      prowler_version_major: ${{ steps.get-prowler-version.outputs.PROWLER_VERSION_MAJOR }}
+      prowler_version: ${{ steps.update-prowler-version.outputs.PROWLER_VERSION }}
+    env:
+      POETRY_VIRTUALENVS_CREATE: "false"
+
     steps:
       - name: Checkout
-        uses: actions/checkout@v3
+        uses: actions/checkout@v4

-      - name: Setup python (release)
-        if: github.event_name == 'release'
-        uses: actions/setup-python@v2
+      - name: Setup Python
+        uses: actions/setup-python@v5
         with:
           python-version: ${{ env.PYTHON_VERSION }}

-      - name: Install dependencies (release)
-        if: github.event_name == 'release'
+      - name: Install Poetry
         run: |
           pipx install poetry
           pipx inject poetry poetry-bumpversion

+      - name: Get Prowler version
+        id: get-prowler-version
+        run: |
+          PROWLER_VERSION="$(poetry version -s 2>/dev/null)"
+
+          # Store prowler version major just for the release
+          PROWLER_VERSION_MAJOR="${PROWLER_VERSION%%.*}"
+          echo "PROWLER_VERSION_MAJOR=${PROWLER_VERSION_MAJOR}" >> "${GITHUB_ENV}"
+          echo "PROWLER_VERSION_MAJOR=${PROWLER_VERSION_MAJOR}" >> "${GITHUB_OUTPUT}"
+
+          case ${PROWLER_VERSION_MAJOR} in
+          3)
+              echo "LATEST_TAG=v3-latest" >> "${GITHUB_ENV}"
+              echo "STABLE_TAG=v3-stable" >> "${GITHUB_ENV}"
+              ;;
+
+          4)
+              echo "LATEST_TAG=latest" >> "${GITHUB_ENV}"
+              echo "STABLE_TAG=stable" >> "${GITHUB_ENV}"
+              ;;
+
+          *)
+              # Fallback if any other version is present
+              echo "Releasing another Prowler major version, aborting..."
+              exit 1
+              ;;
+          esac
+
       - name: Update Prowler version (release)
+        id: update-prowler-version
         if: github.event_name == 'release'
         run: |
-          poetry version ${{ github.event.release.tag_name }}
+          PROWLER_VERSION="${{ github.event.release.tag_name }}"
+          poetry version "${PROWLER_VERSION}"
+          echo "PROWLER_VERSION=${PROWLER_VERSION}" >> "${GITHUB_ENV}"
+          echo "PROWLER_VERSION=${PROWLER_VERSION}" >> "${GITHUB_OUTPUT}"

       - name: Login to DockerHub
-        uses: docker/login-action@v2
+        uses: docker/login-action@v3
         with:
           username: ${{ secrets.DOCKERHUB_USERNAME }}
           password: ${{ secrets.DOCKERHUB_TOKEN }}

       - name: Login to Public ECR
-        uses: docker/login-action@v2
+        uses: docker/login-action@v3
         with:
           registry: public.ecr.aws
           username: ${{ secrets.PUBLIC_ECR_AWS_ACCESS_KEY_ID }}
@@ -67,11 +114,11 @@ jobs:
           AWS_REGION: ${{ env.AWS_REGION }}

       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v2
+        uses: docker/setup-buildx-action@v3

       - name: Build and push container image (latest)
         if: github.event_name == 'push'
-        uses: docker/build-push-action@v2
+        uses: docker/build-push-action@v5
         with:
           push: true
           tags: |
@@ -83,16 +130,16 @@ jobs:

       - name: Build and push container image (release)
         if: github.event_name == 'release'
-        uses: docker/build-push-action@v2
+        uses: docker/build-push-action@v5
         with:
           # Use local context to get changes
           # https://github.com/docker/build-push-action#path-context
           context: .
           push: true
           tags: |
-            ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
+            ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.PROWLER_VERSION }}
             ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
-            ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
+            ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.PROWLER_VERSION }}
             ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
           file: ${{ env.DOCKERFILE_PATH }}
           cache-from: type=gha
@@ -102,16 +149,26 @@ jobs:
     needs: container-build-push
     runs-on: ubuntu-latest
     steps:
-      - name: Get latest commit info
+      - name: Get latest commit info (latest)
         if: github.event_name == 'push'
         run: |
           LATEST_COMMIT_HASH=$(echo ${{ github.event.after }} | cut -b -7)
           echo "LATEST_COMMIT_HASH=${LATEST_COMMIT_HASH}" >> $GITHUB_ENV
-      - name: Dispatch event for latest
-        if: github.event_name == 'push'
+
+      - name: Dispatch event (latest)
+        if: github.event_name == 'push' && needs.container-build-push.outputs.prowler_version_major == '3'
         run: |
-          curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches -H "Accept: application/vnd.github+json" -H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" -H "X-GitHub-Api-Version: 2022-11-28" --data '{"event_type":"dispatch","client_payload":{"version":"latest", "tag": "${{ env.LATEST_COMMIT_HASH }}"}}'
+          curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches \
+            -H "Accept: application/vnd.github+json" \
+            -H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" \
+            -H "X-GitHub-Api-Version: 2022-11-28" \
+            --data '{"event_type":"dispatch","client_payload":{"version":"v3-latest", "tag": "${{ env.LATEST_COMMIT_HASH }}"}}'
-      - name: Dispatch event for release
-        if: github.event_name == 'release'
+
+      - name: Dispatch event (release)
+        if: github.event_name == 'release' && needs.container-build-push.outputs.prowler_version_major == '3'
         run: |
-          curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches -H "Accept: application/vnd.github+json" -H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" -H "X-GitHub-Api-Version: 2022-11-28" --data '{"event_type":"dispatch","client_payload":{"version":"release", "tag":"${{ github.event.release.tag_name }}"}}'
+          curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches \
+            -H "Accept: application/vnd.github+json" \
+            -H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" \
+            -H "X-GitHub-Api-Version: 2022-11-28" \
+            --data '{"event_type":"dispatch","client_payload":{"version":"release", "tag":"${{ needs.container-build-push.outputs.prowler_version }}"}}'
```
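The `Get latest commit info (latest)` step in the workflow above shortens the pushed commit SHA with `cut -b -7` before passing it as a dispatch tag. A standalone sketch of that truncation (the full SHA value here is made up):

```shell
# Hypothetical full commit SHA, standing in for ${{ github.event.after }}
FULL_SHA="e652298b6a0123456789abcdef0123456789abcd"

# Same truncation the workflow uses: keep only the first 7 bytes
LATEST_COMMIT_HASH=$(echo "$FULL_SHA" | cut -b -7)
echo "$LATEST_COMMIT_HASH"   # e652298
```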
**.github/workflows/codeql.yml** (10 changes, vendored)

```diff
@@ -13,10 +13,10 @@ name: "CodeQL"

 on:
   push:
-    branches: [ "master", prowler-2, prowler-3.0-dev ]
+    branches: [ "master", "v3" ]
   pull_request:
     # The branches below must be a subset of the branches above
-    branches: [ "master" ]
+    branches: [ "master", "v3" ]
   schedule:
     - cron: '00 12 * * *'
@@ -37,11 +37,11 @@ jobs:

     steps:
       - name: Checkout repository
-        uses: actions/checkout@v3
+        uses: actions/checkout@v4

       # Initializes the CodeQL tools for scanning.
       - name: Initialize CodeQL
-        uses: github/codeql-action/init@v2
+        uses: github/codeql-action/init@v3
         with:
           languages: ${{ matrix.language }}
           # If you wish to specify custom queries, you can do so here or in a config file.
@@ -52,6 +52,6 @@ jobs:
           # queries: security-extended,security-and-quality

       - name: Perform CodeQL Analysis
-        uses: github/codeql-action/analyze@v2
+        uses: github/codeql-action/analyze@v3
         with:
           category: "/language:${{matrix.language}}"
```
**.github/workflows/find-secrets.yml** (4 changes, vendored)

```diff
@@ -7,11 +7,11 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Checkout
-        uses: actions/checkout@v3
+        uses: actions/checkout@v4
         with:
           fetch-depth: 0
       - name: TruffleHog OSS
-        uses: trufflesecurity/trufflehog@v3.4.4
+        uses: trufflesecurity/trufflehog@v3.72.0
         with:
           path: ./
           base: ${{ github.event.repository.default_branch }}
```
**.github/workflows/labeler.yml** (16 changes, vendored, new file)

```diff
@@ -0,0 +1,16 @@
+name: "Pull Request Labeler"
+
+on:
+  pull_request_target:
+    branches:
+      - "master"
+      - "v3"
+
+jobs:
+  labeler:
+    permissions:
+      contents: read
+      pull-requests: write
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/labeler@v5
```
**.github/workflows/pull-request.yml** (45 changes, vendored)

```diff
@@ -4,29 +4,44 @@ on:
   push:
     branches:
       - "master"
+      - "v3"
   pull_request:
     branches:
       - "master"
+      - "v3"

 jobs:
   build:
     runs-on: ubuntu-latest
     strategy:
       matrix:
-        python-version: ["3.9"]
+        python-version: ["3.9", "3.10", "3.11", "3.12"]

     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
+      - name: Test if changes are in not ignored paths
+        id: are-non-ignored-files-changed
+        uses: tj-actions/changed-files@v44
+        with:
+          files: ./**
+          files_ignore: |
+            .github/**
+            README.md
+            docs/**
+            permissions/**
+            mkdocs.yml
       - name: Install poetry
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
           python -m pip install --upgrade pip
           pipx install poetry
       - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v4
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
+        uses: actions/setup-python@v5
         with:
           python-version: ${{ matrix.python-version }}
-          cache: 'poetry'
+          cache: "poetry"
       - name: Install dependencies
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
           poetry install
           poetry run pip list
@@ -36,29 +51,43 @@ jobs:
           ) && curl -L -o /tmp/hadolint "https://github.com/hadolint/hadolint/releases/download/v${VERSION}/hadolint-Linux-x86_64" \
           && chmod +x /tmp/hadolint
       - name: Poetry check
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
           poetry lock --check
       - name: Lint with flake8
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
           poetry run flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude contrib
       - name: Checking format with black
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
           poetry run black --check .
       - name: Lint with pylint
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
           poetry run pylint --disable=W,C,R,E -j 0 -rn -sn prowler/
       - name: Bandit
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
           poetry run bandit -q -lll -x '*_test.py,./contrib/' -r .
       - name: Safety
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
           poetry run safety check
       - name: Vulture
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
           poetry run vulture --exclude "contrib" --min-confidence 100 .
       - name: Hadolint
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
           /tmp/hadolint Dockerfile --ignore=DL3013
       - name: Test with pytest
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
-          poetry run pytest tests -n auto
+          poetry run pytest -n auto --cov=./prowler --cov-report=xml tests
+      - name: Upload coverage reports to Codecov
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
+        uses: codecov/codecov-action@v4
+        env:
+          CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
```
**.github/workflows/pypi-release.yml** (97 changes, vendored)

```diff
@@ -6,7 +6,10 @@ on:

 env:
   RELEASE_TAG: ${{ github.event.release.tag_name }}
-  GITHUB_BRANCH: master
+  PYTHON_VERSION: 3.11
+  CACHE: "poetry"
+  # TODO: create a bot user for this kind of tasks, like prowler-bot
+  GIT_COMMITTER_EMAIL: "sergio@prowler.com"

 jobs:
   release-prowler-job:
@@ -15,65 +18,81 @@ jobs:
       POETRY_VIRTUALENVS_CREATE: "false"
     name: Release Prowler to PyPI
     steps:
-      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
-      - uses: actions/checkout@v3
-        with:
-          ref: ${{ env.GITHUB_BRANCH }}
+      - name: Get Prowler version
+        run: |
+          PROWLER_VERSION="${{ env.RELEASE_TAG }}"
+
+          case ${PROWLER_VERSION%%.*} in
+          3)
+              echo "Releasing Prowler v3 with tag ${PROWLER_VERSION}"
+              ;;
+          4)
+              echo "Releasing Prowler v4 with tag ${PROWLER_VERSION}"
+              ;;
+          *)
+              echo "Releasing another Prowler major version, aborting..."
+              exit 1
+              ;;
+          esac
+
+      - uses: actions/checkout@v4
+
       - name: Install dependencies
         run: |
           pipx install poetry
           pipx inject poetry poetry-bumpversion
-      - name: setup python
-        uses: actions/setup-python@v4
+
+      - name: Setup Python
+        uses: actions/setup-python@v5
         with:
-          python-version: 3.9
-          cache: 'poetry'
-      - name: Change version and Build package
+          python-version: ${{ env.PYTHON_VERSION }}
+          cache: ${{ env.CACHE }}
+
+      - name: Update Poetry and config version
         run: |
           poetry version ${{ env.RELEASE_TAG }}
+
+      - name: Import GPG key
+        uses: crazy-max/ghaction-import-gpg@v6
+        with:
+          gpg_private_key: ${{ secrets.GPG_PRIVATE_KEY }}
+          passphrase: ${{ secrets.GPG_PASSPHRASE }}
+          git_user_signingkey: true
+          git_commit_gpgsign: true
+
+      - name: Push updated version to the release tag
+        run: |
           # Configure Git
           git config user.name "github-actions"
-          git config user.email "<noreply@github.com>"
+          git config user.email "${{ env.GIT_COMMITTER_EMAIL }}"

           # Add the files with the version changed
           git add prowler/config/config.py pyproject.toml
-          git commit -m "chore(release): ${{ env.RELEASE_TAG }}" --no-verify
-          git tag -fa ${{ env.RELEASE_TAG }} -m "chore(release): ${{ env.RELEASE_TAG }}"
+          git commit -m "chore(release): ${{ env.RELEASE_TAG }}" --no-verify -S
+
+          # Replace the tag with the version updated
+          git tag -fa ${{ env.RELEASE_TAG }} -m "chore(release): ${{ env.RELEASE_TAG }}" --sign
+
+          # Push the tag
           git push -f origin ${{ env.RELEASE_TAG }}
+
+      - name: Build Prowler package
+        run: |
           poetry build
-      - name: Publish prowler package to PyPI
+
+      - name: Publish Prowler package to PyPI
         run: |
           poetry config pypi-token.pypi ${{ secrets.PYPI_API_TOKEN }}
           poetry publish
       # Create pull request with new version
       - name: Create Pull Request
         uses: peter-evans/create-pull-request@v4
         with:
           token: ${{ secrets.PROWLER_ACCESS_TOKEN }}
           commit-message: "chore(release): update Prowler Version to ${{ env.RELEASE_TAG }}."
           branch: release-${{ env.RELEASE_TAG }}
           labels: "status/waiting-for-revision, severity/low"
           title: "chore(release): update Prowler Version to ${{ env.RELEASE_TAG }}"
           body: |
             ### Description

             This PR updates Prowler Version to ${{ env.RELEASE_TAG }}.

             ### License

             By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
-      - name: Replicate PyPi Package
+      - name: Replicate PyPI package
         run: |
           rm -rf ./dist && rm -rf ./build && rm -rf prowler.egg-info
           pip install toml
           python util/replicate_pypi_package.py
           poetry build
+
       - name: Publish prowler-cloud package to PyPI
         run: |
           poetry config pypi-token.pypi ${{ secrets.PYPI_API_TOKEN }}
           poetry publish
       # Create pull request to github.com/Homebrew/homebrew-core to update prowler formula
       - name: Bump Homebrew formula
         uses: mislav/bump-homebrew-formula-action@v2
         with:
           formula-name: prowler
           base-branch: release-${{ env.RELEASE_TAG }}
         env:
           COMMITTER_TOKEN: ${{ secrets.PROWLER_ACCESS_TOKEN }}
```
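The release workflows above derive the major version with the shell prefix-removal expansion `${PROWLER_VERSION%%.*}` and branch on it with a `case` statement. A standalone sketch of that check (the tag value here is made up):

```shell
# Hypothetical release tag, standing in for ${{ env.RELEASE_TAG }}
PROWLER_VERSION="4.2.1"

# %%.* removes the longest suffix matching ".*", i.e. everything
# from the first dot onward, leaving the major version
PROWLER_VERSION_MAJOR="${PROWLER_VERSION%%.*}"
echo "$PROWLER_VERSION_MAJOR"   # 4

case ${PROWLER_VERSION_MAJOR} in
  3) echo "Releasing Prowler v3 with tag ${PROWLER_VERSION}" ;;
  4) echo "Releasing Prowler v4 with tag ${PROWLER_VERSION}" ;;
  *) echo "Releasing another Prowler major version, aborting..."; exit 1 ;;
esac
```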
```diff
@@ -23,12 +23,12 @@ jobs:
     # Steps represent a sequence of tasks that will be executed as part of the job
     steps:
       # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
         with:
           ref: ${{ env.GITHUB_BRANCH }}

       - name: setup python
-        uses: actions/setup-python@v2
+        uses: actions/setup-python@v5
         with:
           python-version: 3.9 #install the python needed
@@ -38,7 +38,7 @@ jobs:
           pip install boto3

       - name: Configure AWS Credentials -- DEV
-        uses: aws-actions/configure-aws-credentials@v1
+        uses: aws-actions/configure-aws-credentials@v4
         with:
           aws-region: ${{ env.AWS_REGION_DEV }}
           role-to-assume: ${{ secrets.DEV_IAM_ROLE_ARN }}
@@ -50,12 +50,12 @@ jobs:

       # Create pull request
       - name: Create Pull Request
-        uses: peter-evans/create-pull-request@v4
+        uses: peter-evans/create-pull-request@v6
         with:
           token: ${{ secrets.PROWLER_ACCESS_TOKEN }}
           commit-message: "feat(regions_update): Update regions for AWS services."
           branch: "aws-services-regions-updated-${{ github.sha }}"
-          labels: "status/waiting-for-revision, severity/low"
+          labels: "status/waiting-for-revision, severity/low, provider/aws, backport-v3"
           title: "chore(regions_update): Changes in regions for AWS services."
           body: |
             ### Description
```
**.gitignore** (11 changes, vendored)

```diff
@@ -9,8 +9,9 @@
 __pycache__
 venv/
 build/
-dist/
+/dist/
+*.egg-info/
 */__pycache__/*.pyc

 # Session
 Session.vim
@@ -46,3 +47,11 @@ junit-reports/

 # .env
 .env*
+
+# Coverage
+.coverage*
+.coverage
+coverage*
+
+# Node
+node_modules
```
```diff
@@ -1,7 +1,7 @@
 repos:
   ## GENERAL
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v4.4.0
+    rev: v4.5.0
     hooks:
       - id: check-merge-conflict
       - id: check-yaml
@@ -15,7 +15,7 @@ repos:

   ## TOML
   - repo: https://github.com/macisamuele/language-formatters-pre-commit-hooks
-    rev: v2.7.0
+    rev: v2.12.0
     hooks:
       - id: pretty-format-toml
         args: [--autofix]
@@ -26,9 +26,10 @@ repos:
     rev: v0.9.0
     hooks:
       - id: shellcheck
+        exclude: contrib
   ## PYTHON
   - repo: https://github.com/myint/autoflake
-    rev: v2.0.1
+    rev: v2.2.1
     hooks:
       - id: autoflake
         args:
@@ -39,25 +40,25 @@ repos:
         ]

   - repo: https://github.com/timothycrosley/isort
-    rev: 5.12.0
+    rev: 5.13.2
     hooks:
       - id: isort
         args: ["--profile", "black"]

   - repo: https://github.com/psf/black
-    rev: 23.1.0
+    rev: 24.1.1
     hooks:
       - id: black

   - repo: https://github.com/pycqa/flake8
-    rev: 6.0.0
+    rev: 7.0.0
     hooks:
       - id: flake8
         exclude: contrib
         args: ["--ignore=E266,W503,E203,E501,W605"]

   - repo: https://github.com/python-poetry/poetry
-    rev: 1.5.1 # add version here
+    rev: 1.7.0
     hooks:
       - id: poetry-check
       - id: poetry-lock
@@ -75,26 +76,23 @@ repos:
         name: pylint
         entry: bash -c 'pylint --disable=W,C,R,E -j 0 -rn -sn prowler/'
         language: system
         files: '.*\.py'

       - id: trufflehog
         name: TruffleHog
         description: Detect secrets in your data.
-        # entry: bash -c 'trufflehog git file://. --only-verified --fail'
+        entry: bash -c 'trufflehog --no-update git file://. --only-verified --fail'
         # For running trufflehog in docker, use the following entry instead:
-        entry: bash -c 'docker run -v "$(pwd):/workdir" -i --rm trufflesecurity/trufflehog:latest git file:///workdir --only-verified --fail'
+        # entry: bash -c 'docker run -v "$(pwd):/workdir" -i --rm trufflesecurity/trufflehog:latest git file:///workdir --only-verified --fail'
         language: system
         stages: ["commit", "push"]

-      - id: pytest-check
-        name: pytest-check
-        entry: bash -c 'pytest tests -n auto'
-        language: system

       - id: bandit
         name: bandit
         description: "Bandit is a tool for finding common security issues in Python code"
         entry: bash -c 'bandit -q -lll -x '*_test.py,./contrib/' -r .'
         language: system
         files: '.*\.py'

       - id: safety
         name: safety
@@ -107,3 +105,4 @@ repos:
         description: "Vulture finds unused code in Python programs."
         entry: bash -c 'vulture --exclude "contrib" --min-confidence 100 .'
         language: system
+        files: '.*\.py'
```
@@ -8,16 +8,18 @@ version: 2
 build:
   os: "ubuntu-22.04"
   tools:
-    python: "3.9"
+    python: "3.11"
   jobs:
     post_create_environment:
       # Install poetry
      # https://python-poetry.org/docs/#installing-manually
-      - pip install poetry
-      # Tell poetry to not use a virtual environment
-      - poetry config virtualenvs.create false
+      - python -m pip install poetry
     post_install:
-      - poetry install -E docs
+      # Install dependencies with 'docs' dependency group
+      # https://python-poetry.org/docs/managing-dependencies/#dependency-groups
+      # VIRTUAL_ENV needs to be set manually for now.
+      # See https://github.com/readthedocs/readthedocs.org/pull/11152/
+      - VIRTUAL_ENV=${READTHEDOCS_VIRTUALENV_PATH} python -m poetry install --only=docs

 mkdocs:
   configuration: mkdocs.yml
@@ -55,7 +55,7 @@ further defined and clarified by project maintainers.
 ## Enforcement

 Instances of abusive, harassing, or otherwise unacceptable behavior may be
-reported by contacting the project team at community@prowler.cloud. All
+reported by contacting the project team at [support.prowler.com](https://customer.support.prowler.com/servicedesk/customer/portals). All
 complaints will be reviewed and investigated and will result in a response that
 is deemed necessary and appropriate to the circumstances. The project team is
 obligated to maintain confidentiality with regard to the reporter of an incident.
12  Dockerfile
@@ -1,9 +1,10 @@
-FROM python:3.9-alpine
+FROM python:3.12-alpine

 LABEL maintainer="https://github.com/prowler-cloud/prowler"

 # Update system dependencies
-RUN apk --no-cache upgrade
+#hadolint ignore=DL3018
+RUN apk --no-cache upgrade && apk --no-cache add curl

 # Create nonroot user
 RUN mkdir -p /home/prowler && \
@@ -14,7 +15,8 @@ USER prowler

 # Copy necessary files
 WORKDIR /home/prowler
 COPY prowler/ /home/prowler/prowler/
+COPY dashboard/ /home/prowler/dashboard/
 COPY pyproject.toml /home/prowler
 COPY README.md /home/prowler

@@ -25,6 +27,10 @@ ENV PATH="$HOME/.local/bin:$PATH"
 RUN pip install --no-cache-dir --upgrade pip && \
     pip install --no-cache-dir .

+# Remove deprecated dash dependencies
+RUN pip uninstall dash-html-components -y && \
+    pip uninstall dash-core-components -y
+
 # Remove Prowler directory and build files
 USER 0
 RUN rm -rf /home/prowler/prowler /home/prowler/pyproject.toml /home/prowler/README.md /home/prowler/build /home/prowler/prowler.egg-info
2  LICENSE
@@ -186,7 +186,7 @@
       same "printed page" as the copyright notice for easier
       identification within third-party archives.

-   Copyright 2018 Netflix, Inc.
+   Copyright @ 2024 Toni de la Fuente

    Licensed under the Apache License, Version 2.0 (the "License");
    you may not use this file except in compliance with the License.
13  Makefile
@@ -2,12 +2,19 @@

 ##@ Testing
 test: ## Test with pytest
-	pytest -n auto -vvv -s -x
+	rm -rf .coverage && \
+	pytest -n auto -vvv -s --cov=./prowler --cov-report=xml tests

 coverage: ## Show Test Coverage
 	coverage run --skip-covered -m pytest -v && \
-	coverage report -m && \
-	rm -rf .coverage
+	rm -rf .coverage && \
+	coverage report -m
+
+coverage-html: ## Show Test Coverage
+	rm -rf ./htmlcov && \
+	coverage html && \
+	open htmlcov/index.html

 ##@ Linting
 format: ## Format Code
@@ -20,7 +27,7 @@ lint: ## Lint Code
 	@echo "Running black... "
 	black --check .
 	@echo "Running pylint..."
-	pylint --disable=W,C,R,E -j 0 providers lib util config
+	pylint --disable=W,C,R,E -j 0 prowler util

 ##@ PyPI
 pypi-clean: ## Delete the distribution files
71  README.md
@@ -1,24 +1,31 @@
 <p align="center">
-  <img align="center" src="https://github.com/prowler-cloud/prowler/blob/62c1ce73bbcdd6b9e5ba03dfcae26dfd165defd9/docs/img/prowler-pro-dark.png?raw=True#gh-dark-mode-only" width="150" height="36">
-  <img align="center" src="https://github.com/prowler-cloud/prowler/blob/62c1ce73bbcdd6b9e5ba03dfcae26dfd165defd9/docs/img/prowler-pro-light.png?raw=True#gh-light-mode-only" width="15%" height="15%">
+  <img align="center" src="https://github.com/prowler-cloud/prowler/blob/master/docs/img/prowler-logo-black.png?raw=True#gh-light-mode-only" width="350" height="115">
+  <img align="center" src="https://github.com/prowler-cloud/prowler/blob/master/docs/img/prowler-logo-white.png?raw=True#gh-dark-mode-only" width="350" height="115">
 </p>
 <p align="center">
-  <b><i>See all the things you and your team can do with ProwlerPro at <a href="https://prowler.pro">prowler.pro</a></i></b>
+  <b><i>Prowler SaaS </b> and <b>Prowler Open Source</b> are as dynamic and adaptable as the environment they’re meant to protect. Trusted by the leaders in security.
 </p>
+<p align="center">
+<b>Learn more at <a href="https://prowler.com">prowler.com</i></b>
+</p>
+
+<p align="center">
+<a href="https://join.slack.com/t/prowler-workspace/shared_invite/zt-1hix76xsl-2uq222JIXrC7Q8It~9ZNog"><img width="30" height="30" alt="Prowler community on Slack" src="https://github.com/prowler-cloud/prowler/assets/3985464/3617e470-670c-47c9-9794-ce895ebdb627"></a>
+  <br>
+  <a href="https://join.slack.com/t/prowler-workspace/shared_invite/zt-1hix76xsl-2uq222JIXrC7Q8It~9ZNog">Join our Prowler community!</a>
+</p>

 <hr>
 <p align="center">
 <img src="https://user-images.githubusercontent.com/3985464/113734260-7ba06900-96fb-11eb-82bc-d4f68a1e2710.png" />
 </p>
 <p align="center">
 <a href="https://join.slack.com/t/prowler-workspace/shared_invite/zt-1hix76xsl-2uq222JIXrC7Q8It~9ZNog"><img alt="Slack Shield" src="https://img.shields.io/badge/slack-prowler-brightgreen.svg?logo=slack"></a>
 <a href="https://pypi.org/project/prowler/"><img alt="Python Version" src="https://img.shields.io/pypi/v/prowler.svg"></a>
 <a href="https://pypi.python.org/pypi/prowler/"><img alt="Python Version" src="https://img.shields.io/pypi/pyversions/prowler.svg"></a>
 <a href="https://pypistats.org/packages/prowler"><img alt="PyPI Prowler Downloads" src="https://img.shields.io/pypi/dw/prowler.svg?label=prowler%20downloads"></a>
 <a href="https://pypistats.org/packages/prowler-cloud"><img alt="PyPI Prowler-Cloud Downloads" src="https://img.shields.io/pypi/dw/prowler-cloud.svg?label=prowler-cloud%20downloads"></a>
 <a href="https://hub.docker.com/r/toniblyx/prowler"><img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/toniblyx/prowler"></a>
 <a href="https://hub.docker.com/r/toniblyx/prowler"><img alt="Docker" src="https://img.shields.io/docker/cloud/build/toniblyx/prowler"></a>
 <a href="https://hub.docker.com/r/toniblyx/prowler"><img alt="Docker" src="https://img.shields.io/docker/image-size/toniblyx/prowler"></a>
 <a href="https://gallery.ecr.aws/prowler-cloud/prowler"><img width="120" height="19" alt="AWS ECR Gallery" src="https://user-images.githubusercontent.com/3985464/151531396-b6535a68-c907-44eb-95a1-a09508178616.png"></a>
 <a href="https://codecov.io/gh/prowler-cloud/prowler"><img src="https://codecov.io/gh/prowler-cloud/prowler/graph/badge.svg?token=OflBGsdpDl"/></a>
 </p>
 <p align="center">
 <a href="https://github.com/prowler-cloud/prowler"><img alt="Repo size" src="https://img.shields.io/github/repo-size/prowler-cloud/prowler"></a>
@@ -30,6 +37,7 @@
 <a href="https://twitter.com/ToniBlyx"><img alt="Twitter" src="https://img.shields.io/twitter/follow/toniblyx?style=social"></a>
 <a href="https://twitter.com/prowlercloud"><img alt="Twitter" src="https://img.shields.io/twitter/follow/prowlercloud?style=social"></a>
 </p>
+<hr>

 # Description

@@ -37,16 +45,16 @@
 It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, FedRAMP, PCI-DSS, GDPR, HIPAA, FFIEC, SOC2, GXP, AWS Well-Architected Framework Security Pillar, AWS Foundational Technical Review (FTR), ENS (Spanish National Security Scheme) and your custom security frameworks.

-| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.cloud/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.cloud/en/latest/tutorials/misc/#categories) |
+| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
 |---|---|---|---|---|
-| AWS | 287 | 56 -> `prowler aws --list-services` | 25 -> `prowler aws --list-compliance` | 5 -> `prowler aws --list-categories` |
-| GCP | 73 | 11 -> `prowler gcp --list-services` | 1 -> `prowler gcp --list-compliance` | 2 -> `prowler gcp --list-categories`|
-| Azure | 23 | 4 -> `prowler azure --list-services` | CIS soon | 1 -> `prowler azure --list-categories` |
-| Kubernetes | Planned | - | - | - |
+| AWS | 304 | 61 -> `prowler aws --list-services` | 28 -> `prowler aws --list-compliance` | 6 -> `prowler aws --list-categories` |
+| GCP | 75 | 11 -> `prowler gcp --list-services` | 1 -> `prowler gcp --list-compliance` | 2 -> `prowler gcp --list-categories`|
+| Azure | 127 | 16 -> `prowler azure --list-services` | 2 -> `prowler azure --list-compliance` | 2 -> `prowler azure --list-categories` |
+| Kubernetes | 83 | 7 -> `prowler kubernetes --list-services` | 1 -> `prowler kubernetes --list-compliance` | 7 -> `prowler kubernetes --list-categories` |

 # 📖 Documentation

-The full documentation can now be found at [https://docs.prowler.cloud](https://docs.prowler.cloud)
+The full documentation can now be found at [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/)

 ## Looking for Prowler v2 documentation?
 For Prowler v2 Documentation, please go to https://github.com/prowler-cloud/prowler/tree/2.12.1.
@@ -54,13 +62,13 @@ For Prowler v2 Documentation, please go to https://github.com/prowler-cloud/prow
 # ⚙️ Install

 ## Pip package
-Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/), thus can be installed using pip with Python >= 3.9:
+Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/), thus can be installed using pip with Python >= 3.9, < 3.13:

 ```console
 pip install prowler
 prowler -v
 ```
-More details at https://docs.prowler.cloud
+More details at [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/)

 ## Containers

@@ -77,7 +85,7 @@ The container images are available here:

 ## From Github

-Python >= 3.9 is required with pip and poetry:
+Python >= 3.9, < 3.13 is required with pip and poetry:

 ```
 git clone https://github.com/prowler-cloud/prowler
@@ -91,7 +99,7 @@ python prowler.py -v

 You can run Prowler from your workstation, an EC2 instance, Fargate or any other container, Codebuild, CloudShell and Cloud9.

-
+

 # 📝 Requirements

@@ -115,8 +123,8 @@ Make sure you have properly configured your AWS-CLI with a valid Access Key and

 Those credentials must be associated to a user or role with proper permissions to do all checks. To make sure, add the following AWS managed policies to the user or role being used:

-- arn:aws:iam::aws:policy/SecurityAudit
-- arn:aws:iam::aws:policy/job-function/ViewOnlyAccess
+- `arn:aws:iam::aws:policy/SecurityAudit`
+- `arn:aws:iam::aws:policy/job-function/ViewOnlyAccess`

 > Moreover, some read-only additional permissions are needed for several checks, make sure you attach also the custom policy [prowler-additions-policy.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-additions-policy.json) to the role you are using.

@@ -178,11 +186,7 @@ Prowler will follow the same credentials search as [Google authentication librar
 2. [User credentials set up by using the Google Cloud CLI](https://cloud.google.com/docs/authentication/application-default-credentials#personal)
 3. [The attached service account, returned by the metadata server](https://cloud.google.com/docs/authentication/application-default-credentials#attached-sa)

-Those credentials must be associated to a user or service account with proper permissions to do all checks. To make sure, add the following roles to the member associated with the credentials:
-
-- Viewer
-- Security Reviewer
-- Stackdriver Account Viewer
+Those credentials must be associated to a user or service account with proper permissions to do all checks. To make sure, add the `Viewer` role to the member associated with the credentials.

 > By default, `prowler` will scan all accessible GCP Projects, use flag `--project-ids` to specify the projects to be scanned.

@@ -269,6 +273,25 @@ prowler gcp --credentials-file path
 ```
 > By default, `prowler` will scan all accessible GCP Projects, use flag `--project-ids` to specify the projects to be scanned.

+## Kubernetes
+
+For non in-cluster execution, you can provide the location of the KubeConfig file with the following argument:
+
+```console
+prowler kubernetes --kubeconfig-file path
+```
+
+For in-cluster execution, you can use the supplied yaml to run Prowler as a job:
+```console
+kubectl apply -f job.yaml
+kubectl apply -f prowler-role.yaml
+kubectl apply -f prowler-rolebinding.yaml
+kubectl get pods --> prowler-XXXXX
+kubectl logs prowler-XXXXX
+```
+
+> By default, `prowler` will scan all namespaces in your active Kubernetes context, use flag `--context` to specify the context to be scanned and `--namespaces` to specify the namespaces to be scanned.

 # 📃 License

 Prowler is licensed as Apache License 2.0 as specified in each file. You may obtain a copy of the License at
@@ -14,7 +14,7 @@ As an **AWS Partner** and we have passed the [AWS Foundation Technical Review (F

 If you would like to report a vulnerability or have a security concern regarding Prowler Open Source or ProwlerPro service, please submit the information by contacting to help@prowler.pro.

-The information you share with Verica as part of this process is kept confidential within Verica and the Prowler team. We will only share this information with a third party if the vulnerability you report is found to affect a third-party product, in which case we will share this information with the third-party product's author or manufacturer. Otherwise, we will only share this information as permitted by you.
+The information you share with ProwlerPro as part of this process is kept confidential within ProwlerPro. We will only share this information with a third party if the vulnerability you report is found to affect a third-party product, in which case we will share this information with the third-party product's author or manufacturer. Otherwise, we will only share this information as permitted by you.

 We will review the submitted report, and assign it a tracking number. We will then respond to you, acknowledging receipt of the report, and outline the next steps in the process.
@@ -1,17 +1,8 @@
 #!/bin/bash

-# Install system dependencies
-sudo yum -y install openssl-devel bzip2-devel libffi-devel gcc
-# Upgrade to Python 3.9
-cd /tmp && wget https://www.python.org/ftp/python/3.9.13/Python-3.9.13.tgz
-tar zxf Python-3.9.13.tgz
-cd Python-3.9.13/ || exit
-./configure --enable-optimizations
-sudo make altinstall
-python3.9 --version
-# Install Prowler
-cd ~ || exit
-python3.9 -m pip install prowler-cloud
-prowler -v
-# Run Prowler
-prowler
+sudo bash
+adduser prowler
+su prowler
+pip install prowler
+cd /tmp || exit
+prowler aws
176  dashboard/__main__.py  (new file)
@@ -0,0 +1,176 @@
# Importing Packages
import sys
import warnings

import click
import dash
import dash_bootstrap_components as dbc
from colorama import Fore, Style
from dash import dcc, html
from dash.dependencies import Input, Output

from dashboard.config import folder_path_overview
from prowler.config.config import orange_color
from prowler.lib.banner import print_banner

warnings.filterwarnings("ignore")

cli = sys.modules["flask.cli"]
print_banner(verbose=False)
print(
    f"{Fore.GREEN}Loading all CSV files from the folder {folder_path_overview} ...\n{Style.RESET_ALL}"
)
cli.show_server_banner = lambda *x: click.echo(
    f"{Fore.YELLOW}NOTE:{Style.RESET_ALL} If you are a {Fore.GREEN}{Style.BRIGHT}Prowler SaaS{Style.RESET_ALL} customer and you want to use your data from your S3 bucket,\nrun: `{orange_color}aws s3 cp s3://<your-bucket>/output/csv ./output --recursive{Style.RESET_ALL}`\nand then run `prowler dashboard` again to load the new files."
)

# Initialize the app - incorporate css
dashboard = dash.Dash(
    __name__,
    external_stylesheets=[dbc.themes.DARKLY],
    use_pages=True,
    suppress_callback_exceptions=True,
    title="Prowler Dashboard",
)

# Logo
prowler_logo = html.Img(
    src="https://prowler.com/wp-content/uploads/logo-dashboard.png", alt="Prowler Logo"
)

menu_icons = {
    "overview": "/assets/images/icons/overview.svg",
    "compliance": "/assets/images/icons/compliance.svg",
}


# Function to generate navigation links
def generate_nav_links(current_path):
    nav_links = []
    for page in dash.page_registry.values():
        # Gets the icon URL based on the page name
        icon_url = menu_icons.get(page["name"].lower())
        is_active = (
            " bg-prowler-stone-950 border-r-4 border-solid border-prowler-lime"
            if current_path == page["relative_path"]
            else ""
        )
        link_class = f"block hover:bg-prowler-stone-950 hover:border-r-4 hover:border-solid hover:border-prowler-lime{is_active}"

        link_content = html.Span(
            [
                html.Img(src=icon_url, className="w-5"),
                html.Span(page["name"], className="font-medium text-base leading-6"),
            ],
            className="flex justify-center lg:justify-normal items-center gap-x-3 py-2 px-3",
        )

        nav_link = html.Li(
            dcc.Link(link_content, href=page["relative_path"], className=link_class)
        )
        nav_links.append(nav_link)
    return nav_links


def generate_help_menu():
    help_links = [
        {
            "title": "Help",
            "url": "https://github.com/prowler-cloud/prowler/issues",
            "icon": "/assets/images/icons/help.png",
        },
        {
            "title": "Docs",
            "url": "https://docs.prowler.com",
            "icon": "/assets/images/icons/docs.png",
        },
    ]

    link_class = "block hover:bg-prowler-stone-950 hover:border-r-4 hover:border-solid hover:border-prowler-lime"

    menu_items = []
    for link in help_links:
        menu_item = html.Li(
            html.A(
                html.Span(
                    [
                        html.Img(src=link["icon"], className="w-5"),
                        html.Span(
                            link["title"], className="font-medium text-base leading-6"
                        ),
                    ],
                    className="flex items-center gap-x-3 py-2 px-3",
                ),
                href=link["url"],
                target="_blank",
                className=link_class,
            )
        )
        menu_items.append(menu_item)

    return menu_items


# Layout
dashboard.layout = html.Div(
    [
        dcc.Location(id="url", refresh=False),
        html.Link(rel="icon", href="assets/favicon.ico"),
        # Placeholder for dynamic navigation bar
        html.Div(
            [
                html.Div(
                    id="navigation-bar", className="bg-prowler-stone-900 min-w-36 z-10"
                ),
                html.Div(
                    [
                        dash.page_container,
                    ],
                    id="content_select",
                    className="bg-prowler-white w-full col-span-11 h-screen mx-auto overflow-y-scroll no-scrollbar px-10 py-7",
                ),
            ],
            className="grid custom-grid 2xl:custom-grid-large h-screen",
        ),
    ],
    className="h-screen mx-auto",
)


# Callback to update navigation bar
@dashboard.callback(Output("navigation-bar", "children"), [Input("url", "pathname")])
def update_nav_bar(pathname):
    return html.Div(
        [
            html.Div([prowler_logo], className="mb-8 px-3"),
            html.H6(
                "Dashboards",
                className="px-3 text-prowler-stone-500 text-sm opacity-90 font-regular mb-2",
            ),
            html.Nav(
                [html.Ul(generate_nav_links(pathname), className="")],
                className="flex flex-col gap-y-6",
            ),
            html.Nav(
                [
                    html.A(
                        [
                            html.Span(
                                [
                                    html.Img(src="assets/favicon.ico", className="w-5"),
                                    "Subscribe to prowler SaaS",
                                ],
                                className="flex items-center gap-x-3",
                            ),
                        ],
                        href="https://prowler.com/",
                        target="_blank",
                        className="block p-3 uppercase text-xs hover:bg-prowler-stone-950 hover:border-r-4 hover:border-solid hover:border-prowler-lime",
                    ),
                    html.Ul(generate_help_menu(), className=""),
                ],
                className="flex flex-col gap-y-6 mt-auto",
            ),
        ],
        className="flex flex-col bg-prowler-stone-900 py-7 h-full",
    )
BIN  dashboard/assets/favicon.ico  (new file, 15 KiB)

4  dashboard/assets/images/icons/compliance.svg  (new file)
@@ -0,0 +1,4 @@
<svg xmlns="http://www.w3.org/2000/svg" fill="#FFF" aria-hidden="true" class="h-5 w-5" viewBox="0 0 24 24">
  <path fill-rule="evenodd" d="M9 1.5H5.625c-1.036 0-1.875.84-1.875 1.875v17.25c0 1.035.84 1.875 1.875 1.875h12.75c1.035 0 1.875-.84 1.875-1.875V12.75A3.75 3.75 0 0 0 16.5 9h-1.875a1.875 1.875 0 0 1-1.875-1.875V5.25A3.75 3.75 0 0 0 9 1.5zm6.61 10.936a.75.75 0 1 0-1.22-.872l-3.236 4.53L9.53 14.47a.75.75 0 0 0-1.06 1.06l2.25 2.25a.75.75 0 0 0 1.14-.094l3.75-5.25z" clip-rule="evenodd"/>
  <path d="M12.971 1.816A5.23 5.23 0 0 1 14.25 5.25v1.875c0 .207.168.375.375.375H16.5a5.23 5.23 0 0 1 3.434 1.279 9.768 9.768 0 0 0-6.963-6.963z"/>
</svg>

BIN  dashboard/assets/images/icons/docs.png  (new file, 734 B)
BIN  dashboard/assets/images/icons/help-black.png  (new file, 441 B)
BIN  dashboard/assets/images/icons/help.png  (new file, 934 B)

4  dashboard/assets/images/icons/overview.svg  (new file)
@@ -0,0 +1,4 @@
<svg xmlns="http://www.w3.org/2000/svg" fill="#FFF" aria-hidden="true" class="h-5 w-5" viewBox="0 0 24 24">
  <path fill-rule="evenodd" d="M2.25 13.5a8.25 8.25 0 0 1 8.25-8.25.75.75 0 0 1 .75.75v6.75H18a.75.75 0 0 1 .75.75 8.25 8.25 0 0 1-16.5 0z" clip-rule="evenodd"/>
  <path fill-rule="evenodd" d="M12.75 3a.75.75 0 0 1 .75-.75 8.25 8.25 0 0 1 8.25 8.25.75.75 0 0 1-.75.75h-7.5a.75.75 0 0 1-.75-.75V3z" clip-rule="evenodd"/>
</svg>

BIN  dashboard/assets/images/providers/aws_provider.png  (new file, 10 KiB)
BIN  dashboard/assets/images/providers/azure_provider.png  (new file, 6.0 KiB)
BIN  dashboard/assets/images/providers/gcp_provider.png  (new file, 245 KiB)
BIN  dashboard/assets/images/providers/k8s_provider.png  (new file, 15 KiB)
BIN  dashboard/assets/logo.png  (new file, 11 KiB)

1387  dashboard/assets/styles/dist/output.css  (vendored, new file)
2221  dashboard/common_methods.py  (new file)
23  dashboard/compliance/aws_account_security_onboarding_aws.py  (new file)
@@ -0,0 +1,23 @@
import warnings

from dashboard.common_methods import get_section_containers_format2

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format2(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
@@ -0,0 +1,23 @@
import warnings

from dashboard.common_methods import get_section_containers_format1

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format1(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
@@ -0,0 +1,23 @@
import warnings

from dashboard.common_methods import get_section_containers_format1

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format1(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
@@ -0,0 +1,22 @@
import warnings

from dashboard.common_methods import get_section_containers_format2

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ATTRIBUTES_NAME",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ]
    return get_section_containers_format2(
        aux, "REQUIREMENTS_ATTRIBUTES_NAME", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )
@@ -0,0 +1,23 @@
import warnings

from dashboard.common_methods import get_section_containers_format2

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ATTRIBUTES_NAME",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ]

    return get_section_containers_format2(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ATTRIBUTES_NAME"
    )
24
dashboard/compliance/cis_1_4_aws.py
Normal file
@@ -0,0 +1,24 @@
|
||||
import warnings
|
||||
|
||||
from dashboard.common_methods import get_section_containers_cis
|
||||
|
||||
warnings.filterwarnings("ignore")
|
||||
|
||||
|
||||
def get_table(data):
|
||||
aux = data[
|
||||
[
|
||||
"REQUIREMENTS_ID",
|
||||
"REQUIREMENTS_DESCRIPTION",
|
||||
"REQUIREMENTS_ATTRIBUTES_SECTION",
|
||||
"CHECKID",
|
||||
"STATUS",
|
||||
"REGION",
|
||||
"ACCOUNTID",
|
||||
"RESOURCEID",
|
||||
]
|
||||
].copy()
|
||||
|
||||
return get_section_containers_cis(
|
||||
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
|
||||
)
|
||||
24
dashboard/compliance/cis_1_5_aws.py
Normal file
@@ -0,0 +1,24 @@
|
||||
import warnings
|
||||
|
||||
from dashboard.common_methods import get_section_containers_cis
|
||||
|
||||
warnings.filterwarnings("ignore")
|
||||
|
||||
|
||||
def get_table(data):
|
||||
aux = data[
|
||||
[
|
||||
"REQUIREMENTS_ID",
|
||||
"REQUIREMENTS_DESCRIPTION",
|
||||
"REQUIREMENTS_ATTRIBUTES_SECTION",
|
||||
"CHECKID",
|
||||
"STATUS",
|
||||
"REGION",
|
||||
"ACCOUNTID",
|
||||
"RESOURCEID",
|
||||
]
|
||||
].copy()
|
||||
|
||||
return get_section_containers_cis(
|
||||
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
|
||||
)
|
||||
24
dashboard/compliance/cis_1_8_kubernetes.py
Normal file
@@ -0,0 +1,24 @@
|
||||
import warnings
|
||||
|
||||
from dashboard.common_methods import get_section_containers_cis
|
||||
|
||||
warnings.filterwarnings("ignore")
|
||||
|
||||
|
||||
def get_table(data):
|
||||
aux = data[
|
||||
[
|
||||
"REQUIREMENTS_ID",
|
||||
"REQUIREMENTS_DESCRIPTION",
|
||||
"REQUIREMENTS_ATTRIBUTES_SECTION",
|
||||
"CHECKID",
|
||||
"STATUS",
|
||||
"REGION",
|
||||
"ACCOUNTID",
|
||||
"RESOURCEID",
|
||||
]
|
||||
].copy()
|
||||
|
||||
return get_section_containers_cis(
|
||||
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
|
||||
)
|
||||
24  dashboard/compliance/cis_2_0_aws.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_cis

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_DESCRIPTION",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_cis(
        aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )
24  dashboard/compliance/cis_2_0_azure.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_cis

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_DESCRIPTION",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_cis(
        aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )
24  dashboard/compliance/cis_2_0_gcp.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_cis

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_DESCRIPTION",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_cis(
        aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )
24  dashboard/compliance/cis_2_1_azure.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_cis

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_DESCRIPTION",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_cis(
        aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )
24  dashboard/compliance/cis_3_0_aws.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_cis

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_DESCRIPTION",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_cis(
        aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )
23  dashboard/compliance/cisa_aws.py  Normal file
@@ -0,0 +1,23 @@
import warnings

from dashboard.common_methods import get_section_containers_format1

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format1(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
29  dashboard/compliance/ens_rd2022_aws.py  Normal file
@@ -0,0 +1,29 @@
import warnings

from dashboard.common_methods import get_section_containers_ens

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ATTRIBUTES_MARCO",
            "REQUIREMENTS_ATTRIBUTES_CATEGORIA",
            "REQUIREMENTS_ATTRIBUTES_IDGRUPOCONTROL",
            "REQUIREMENTS_ATTRIBUTES_TIPO",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ]

    return get_section_containers_ens(
        aux,
        "REQUIREMENTS_ATTRIBUTES_MARCO",
        "REQUIREMENTS_ATTRIBUTES_CATEGORIA",
        "REQUIREMENTS_ATTRIBUTES_IDGRUPOCONTROL",
        "REQUIREMENTS_ATTRIBUTES_TIPO",
    )
24  dashboard/compliance/fedramp_low_revision_4_aws.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_format3

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "REQUIREMENTS_DESCRIPTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format3(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
24  dashboard/compliance/fedramp_moderate_revision_4_aws.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_format3

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "REQUIREMENTS_DESCRIPTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format3(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
24  dashboard/compliance/ffiec_aws.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_format3

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "REQUIREMENTS_DESCRIPTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format3(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
23  dashboard/compliance/gdpr_aws.py  Normal file
@@ -0,0 +1,23 @@
import warnings

from dashboard.common_methods import get_section_containers_format1

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format1(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
24  dashboard/compliance/gxp_21_cfr_part_11_aws.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_format3

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "REQUIREMENTS_DESCRIPTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format3(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
23  dashboard/compliance/gxp_eu_annex_11_aws.py  Normal file
@@ -0,0 +1,23 @@
import warnings

from dashboard.common_methods import get_section_containers_format1

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format1(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
24  dashboard/compliance/hipaa_aws.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_format3

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "REQUIREMENTS_DESCRIPTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format3(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
23  dashboard/compliance/iso27001_2013_aws.py  Normal file
@@ -0,0 +1,23 @@
import warnings

from dashboard.common_methods import get_section_container_iso

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ATTRIBUTES_CATEGORY",
            "REQUIREMENTS_ATTRIBUTES_OBJETIVE_ID",
            "REQUIREMENTS_ATTRIBUTES_OBJETIVE_NAME",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ]
    return get_section_container_iso(
        aux, "REQUIREMENTS_ATTRIBUTES_CATEGORY", "REQUIREMENTS_ATTRIBUTES_OBJETIVE_ID"
    )
23  dashboard/compliance/mitre_attack_aws.py  Normal file
@@ -0,0 +1,23 @@
import warnings

from dashboard.common_methods import get_section_containers_format2

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_SUBTECHNIQUES",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format2(
        aux, "REQUIREMENTS_ID", "REQUIREMENTS_SUBTECHNIQUES"
    )
24  dashboard/compliance/nist_800_171_revision_2_aws.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_format3

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "REQUIREMENTS_DESCRIPTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format3(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
24  dashboard/compliance/nist_800_53_revision_4_aws.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_format3

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "REQUIREMENTS_DESCRIPTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format3(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
24  dashboard/compliance/nist_800_53_revision_5_aws.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_format3

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "REQUIREMENTS_DESCRIPTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format3(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
24  dashboard/compliance/nist_csf_1_1_aws.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_format3

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "REQUIREMENTS_DESCRIPTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format3(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
20  dashboard/compliance/pci_3_2_1_aws.py  Normal file
@@ -0,0 +1,20 @@
import warnings

from dashboard.common_methods import get_section_containers_pci

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ]

    return get_section_containers_pci(aux, "REQUIREMENTS_ID")
20  dashboard/compliance/rbi_cyber_security_framework_aws.py  Normal file
@@ -0,0 +1,20 @@
import warnings

from dashboard.common_methods import get_section_containers_rbi

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_DESCRIPTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ]
    return get_section_containers_rbi(aux, "REQUIREMENTS_ID")
24  dashboard/compliance/soc2_aws.py  Normal file
@@ -0,0 +1,24 @@
import warnings

from dashboard.common_methods import get_section_containers_format3

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_DESCRIPTION",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format3(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
29  dashboard/config.py  Normal file
@@ -0,0 +1,29 @@
import os

# Emojis to be used in the compliance table
pass_emoji = "✅"
fail_emoji = "❌"
info_emoji = "ℹ️"
manual_emoji = "✋🏽"

# Main colors
fail_color = "#e67272"
pass_color = "#54d283"
info_color = "#2684FF"
manual_color = "#636c78"

# Muted colors
muted_fail_color = "#fca903"
muted_pass_color = "#03fccf"
muted_manual_color = "#b33696"

# Severity colors
critical_color = "#951649"
high_color = "#e11d48"
medium_color = "#ee6f15"
low_color = "#f9f5e6"
informational_color = "#3274d9"

# Folder output path
folder_path_overview = os.getcwd() + "/output"
folder_path_compliance = os.getcwd() + "/output/compliance"
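Note that the output folders are built from `os.getcwd()`, so the dashboard must be launched from the directory that contains `output/`. The sketch below restates the relevant constants and adds a small `style_for` helper to show how a finding's STATUS might be mapped to the emoji/color pairs; `style_for` is hypothetical and does not exist in `dashboard/config.py`.

```python
import os

# Constants as defined in dashboard/config.py
pass_emoji = "✅"
fail_emoji = "❌"
pass_color = "#54d283"
fail_color = "#e67272"

# Paths are relative to the current working directory, not the module location.
folder_path_overview = os.getcwd() + "/output"
folder_path_compliance = os.getcwd() + "/output/compliance"


def style_for(status: str) -> tuple:
    """Hypothetical helper: map a finding STATUS to an (emoji, color) pair."""
    if status == "PASS":
        return (pass_emoji, pass_color)
    return (fail_emoji, fail_color)
```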
157  dashboard/lib/cards.py  Normal file
@@ -0,0 +1,157 @@
from typing import List

from dash import html


def create_provider_card(
    provider: str, provider_logo: str, account_type: str, filtered_data
) -> List[html.Div]:
    """
    Card to display the provider's name and icon.
    Args:
        provider (str): Name of the provider.
        provider_logo (str): Logo of the provider.
        account_type (str): Label for the provider's account type.
        filtered_data: DataFrame with the findings selected by the current filters.
    Returns:
        List[html.Div]: Card to display the provider's name and icon.
    """
    accounts = len(
        filtered_data[filtered_data["PROVIDER"] == provider]["ACCOUNT_UID"].unique()
    )
    checks_executed = len(
        filtered_data[filtered_data["PROVIDER"] == provider]["CHECK_ID"].unique()
    )
    fails = len(
        filtered_data[
            (filtered_data["PROVIDER"] == provider)
            & (filtered_data["STATUS"] == "FAIL")
        ]
    )
    passes = len(
        filtered_data[
            (filtered_data["PROVIDER"] == provider)
            & (filtered_data["STATUS"] == "PASS")
        ]
    )
    # Take the values in the MUTED column that are true for the provider
    if "MUTED" in filtered_data.columns:
        muted = len(
            filtered_data[
                (filtered_data["PROVIDER"] == provider)
                & (filtered_data["MUTED"] == "True")
            ]
        )
    else:
        muted = 0

    return [
        html.Div(
            [
                html.Div(
                    [
                        html.Div(
                            [
                                html.Div(
                                    [
                                        html.Div([provider_logo], className="w-8"),
                                    ],
                                    className="p-2 shadow-box-up rounded-full",
                                ),
                                html.H5(
                                    f"{provider.upper()} {account_type}",
                                    className="text-base font-semibold leading-snug tracking-normal text-gray-900",
                                ),
                            ],
                            className="flex justify-between items-center mb-3",
                        ),
                        html.Div(
                            [
                                html.Div(
                                    [
                                        html.Span(
                                            account_type,
                                            className="text-prowler-stone-900 inline-block text-3xs font-bold uppercase transition-all rounded-lg text-prowler-stone-900 shadow-box-up px-4 py-1 text-center col-span-6 flex justify-center items-center",
                                        ),
                                        html.Div(
                                            accounts,
                                            className="inline-block text-xs text-prowler-stone-900 font-bold shadow-box-down px-4 py-1 rounded-lg text-center col-span-5 col-end-13",
                                        ),
                                    ],
                                    className="grid grid-cols-12",
                                ),
                                html.Div(
                                    [
                                        html.Span(
                                            "Checks",
                                            className="text-prowler-stone-900 inline-block text-3xs font-bold uppercase transition-all rounded-lg text-prowler-stone-900 shadow-box-up px-4 py-1 text-center col-span-6 flex justify-center items-center",
                                        ),
                                        html.Div(
                                            checks_executed,
                                            className="inline-block text-xs text-prowler-stone-900 font-bold shadow-box-down px-4 py-1 rounded-lg text-center col-span-5 col-end-13",
                                        ),
                                    ],
                                    className="grid grid-cols-12",
                                ),
                                html.Div(
                                    [
                                        html.Span(
                                            "FAILED",
                                            className="text-prowler-stone-900 inline-block text-3xs font-bold uppercase transition-all rounded-lg text-prowler-stone-900 shadow-box-up px-4 py-1 text-center col-span-6 flex justify-center items-center",
                                        ),
                                        html.Div(
                                            [
                                                html.Div(
                                                    fails,
                                                    className="m-[2px] px-4 py-1 rounded-lg bg-gradient-failed",
                                                ),
                                            ],
                                            className="inline-block text-xs font-bold shadow-box-down rounded-lg text-center col-span-5 col-end-13",
                                        ),
                                    ],
                                    className="grid grid-cols-12",
                                ),
                                html.Div(
                                    [
                                        html.Span(
                                            "PASSED",
                                            className="text-prowler-stone-900 inline-block text-3xs font-bold uppercase transition-all rounded-lg text-prowler-stone-900 shadow-box-up px-4 py-1 text-center col-span-6 flex justify-center items-center",
                                        ),
                                        html.Div(
                                            [
                                                html.Div(
                                                    passes,
                                                    className="m-[2px] px-4 py-1 rounded-lg bg-gradient-passed",
                                                ),
                                            ],
                                            className="inline-block text-xs font-bold shadow-box-down rounded-lg text-center col-span-5 col-end-13",
                                        ),
                                    ],
                                    className="grid grid-cols-12",
                                ),
                                html.Div(
                                    [
                                        html.Span(
                                            "MUTED",
                                            className="text-prowler-stone-900 inline-block text-3xs font-bold uppercase transition-all rounded-lg text-prowler-stone-900 shadow-box-up px-4 py-1 text-center col-span-6 flex justify-center items-center",
                                        ),
                                        html.Div(
                                            [
                                                html.Div(
                                                    muted,
                                                    className="m-[2px] px-4 py-1 rounded-lg bg-gradient-muted",
                                                ),
                                            ],
                                            className="inline-block text-xs font-bold shadow-box-down rounded-lg text-center col-span-5 col-end-13",
                                        ),
                                    ],
                                    className="grid grid-cols-12",
                                ),
                            ],
                            className="grid gap-x-8 gap-y-4",
                        ),
                    ],
                    className="px-4 py-3",
                ),
            ],
            className="relative flex flex-col bg-white shadow-provider rounded-xl w-full transition ease-in-out delay-100 hover:-translate-y-1 hover:scale-110 hover:z-50 hover:cursor-pointer",
        )
    ]
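The counting logic in `create_provider_card` is plain pandas: filter the findings to one provider, then take distinct counts for accounts and checks and row counts per STATUS. Note that MUTED is compared against the string `"True"`, not a boolean. The sketch below reproduces that logic on an invented four-row frame, independent of Dash.

```python
import pandas as pd

# Invented sample findings; real data comes from the Prowler CSV output.
findings = pd.DataFrame(
    {
        "PROVIDER": ["aws", "aws", "aws", "gcp"],
        "ACCOUNT_UID": ["a1", "a1", "a2", "g1"],
        "CHECK_ID": ["c1", "c2", "c1", "c9"],
        "STATUS": ["PASS", "FAIL", "PASS", "FAIL"],
        "MUTED": ["False", "True", "False", "False"],
    }
)

provider = "aws"
subset = findings[findings["PROVIDER"] == provider]

accounts = subset["ACCOUNT_UID"].nunique()      # distinct accounts scanned
checks_executed = subset["CHECK_ID"].nunique()  # distinct checks run
fails = len(subset[subset["STATUS"] == "FAIL"])
passes = len(subset[subset["STATUS"] == "PASS"])
# MUTED holds the strings "True"/"False", hence the string comparison;
# the column may be absent entirely, in which case the card shows 0.
muted = len(subset[subset["MUTED"] == "True"]) if "MUTED" in findings.columns else 0
```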
289
dashboard/lib/dropdowns.py
Normal file
@@ -0,0 +1,289 @@
|
||||
from dash import dcc, html
|
||||
|
||||
|
||||
def create_date_dropdown(assesment_times: list) -> html.Div:
|
||||
"""
|
||||
Dropdown to select the date of the last available scan for each account.
|
||||
Args:
|
||||
assesment_times (list): List of dates of the last available scan for each account.
|
||||
Returns:
|
||||
html.Div: Dropdown to select the date of the last available scan for each account.
|
||||
"""
|
||||
return html.Div(
|
||||
[
|
||||
html.Div(
|
||||
[
|
||||
html.Label(
|
||||
"Assessment date (last available scan) ",
|
||||
className="text-prowler-stone-900 font-bold text-sm",
|
||||
),
|
||||
html.Img(
|
||||
id="info-file-over",
|
||||
src="/assets/images/icons/help-black.png",
|
||||
className="w-5",
|
||||
title="The date of the last available scan for each account is displayed here. If you have not run prowler yet, the date will be empty.",
|
||||
),
|
||||
],
|
||||
style={"display": "inline-flex"},
|
||||
),
|
||||
dcc.Dropdown(
|
||||
id="report-date-filter",
|
||||
options=[
|
||||
{"label": account, "value": account} for account in assesment_times
|
||||
],
|
||||
value=assesment_times[0],
|
||||
clearable=False,
|
||||
multi=False,
|
||||
style={"color": "#000000", "width": "100%"},
|
||||
),
|
||||
],
|
||||
)
|
||||
|
||||
|
||||
def create_date_dropdown_compliance(assesment_times: list) -> html.Div:
|
||||
"""
|
||||
Dropdown to select the date of the last available scan for each account.
|
||||
Args:
|
||||
assesment_times (list): List of dates of the last available scan for each account.
|
||||
Returns:
|
||||
html.Div: Dropdown to select the date of the last available scan for each account.
|
||||
"""
|
||||
return html.Div(
|
||||
[
|
||||
html.Label(
|
||||
"Assesment Date:", className="text-prowler-stone-900 font-bold text-sm"
|
||||
),
|
||||
dcc.Dropdown(
|
||||
id="date-filter-analytics",
|
||||
options=[
|
||||
{"label": account, "value": account} for account in assesment_times
|
||||
],
|
||||
value=assesment_times[0],
|
||||
clearable=False,
|
||||
multi=False,
|
||||
style={"color": "#000000", "width": "100%"},
|
||||
),
|
||||
],
|
||||
)
|
||||
|
||||
|
||||
def create_region_dropdown(regions: list) -> html.Div:
|
||||
"""
|
||||
Dropdown to select the region of the account.
|
||||
Args:
|
||||
regions (list): List of regions of the account.
|
||||
Returns:
|
||||
html.Div: Dropdown to select the region of the account.
|
||||
"""
|
||||
return html.Div(
|
||||
[
|
||||
html.Label(
|
||||
"Region / Location / Namespace :",
|
||||
className="text-prowler-stone-900 font-bold text-sm",
|
||||
),
|
||||
dcc.Dropdown(
|
||||
id="region-filter",
|
||||
options=[{"label": region, "value": region} for region in regions],
|
||||
value=["All"], # Initial selection is ALL
|
||||
clearable=False,
|
||||
multi=True,
|
||||
style={"color": "#000000", "width": "100%"},
|
||||
),
|
||||
],
|
||||
)
|
||||
|
||||
|
||||
def create_region_dropdown_compliance(regions: list) -> html.Div:
|
||||
"""
|
||||
Dropdown to select the region of the account.
|
||||
Args:
|
||||
regions (list): List of regions of the account.
|
||||
Returns:
|
||||
html.Div: Dropdown to select the region of the account.
|
||||
"""
|
||||
return html.Div(
|
||||
[
|
||||
html.Label(
|
||||
"Region / Location / Namespace :",
|
||||
className="text-prowler-stone-900 font-bold text-sm",
|
||||
),
|
||||
dcc.Dropdown(
|
||||
id="region-filter-compliance",
|
||||
options=[{"label": region, "value": region} for region in regions],
|
||||
value=["All"], # Initial selection is ALL
|
||||
clearable=False,
|
||||
multi=True,
|
||||
style={"color": "#000000", "width": "100%"},
|
||||
),
|
||||
],
|
||||
)
|
||||
|
||||
|
||||
def create_account_dropdown(accounts: list) -> html.Div:
|
||||
"""
|
||||
Dropdown to select the account.
|
||||
Args:
|
||||
accounts (list): List of accounts.
|
||||
Returns:
|
||||
html.Div: Dropdown to select the account.
|
||||
"""
|
||||
return html.Div(
|
||||
[
|
||||
html.Label(
|
||||
"Account / Subscription / Project / Cluster :",
|
||||
className="text-prowler-stone-900 font-bold text-sm",
|
||||
),
|
||||
dcc.Dropdown(
|
||||
id="cloud-account-filter",
|
||||
options=[{"label": account, "value": account} for account in accounts],
|
||||
value=["All"], # Initial selection is ALL
|
||||
clearable=False,
|
||||
multi=True,
|
||||
style={"color": "#000000", "width": "100%"},
|
||||
),
|
||||
],
|
||||
)
|
||||
|
||||
|
||||
def create_account_dropdown_compliance(accounts: list) -> html.Div:
|
||||
"""
|
||||
Dropdown to select the account.
|
||||
Args:
|
||||
accounts (list): List of accounts.
|
||||
Returns:
|
||||
html.Div: Dropdown to select the account.
|
||||
"""
|
||||
return html.Div(
|
||||
[
|
||||
html.Label(
|
||||
"Account / Subscription / Project / Cluster :",
|
||||
className="text-prowler-stone-900 font-bold text-sm",
|
||||
),
|
||||
dcc.Dropdown(
|
||||
id="cloud-account-filter-compliance",
|
||||
options=[{"label": account, "value": account} for account in accounts],
|
||||
value=["All"], # Initial selection is ALL
|
||||
clearable=False,
|
||||
multi=True,
|
||||
style={"color": "#000000", "width": "100%"},
|
||||
),
|
||||
],
|
||||
)
|
||||
|
||||
|
||||
def create_compliance_dropdown(compliance: list) -> html.Div:
|
||||
"""
|
||||
Dropdown to select the compliance.
|
||||
Args:
|
||||
compliance (list): List of compliance.
|
||||
Returns:
|
||||
html.Div: Dropdown to select the compliance.
|
||||
"""
|
||||
return html.Div(
|
||||
[
|
||||
html.Label(
|
||||
"Compliance:", className="text-prowler-stone-900 font-bold text-sm"
|
||||
),
|
||||
dcc.Dropdown(
|
||||
id="report-compliance-filter",
|
||||
options=[{"label": i, "value": i} for i in compliance],
|
||||
value=compliance[0],
|
||||
clearable=False,
|
||||
style={"color": "#000000"},
|
||||
),
|
||||
],
|
||||
)
|
||||
|
||||
|
||||
def create_severity_dropdown(severity: list) -> html.Div:
|
||||
"""
|
||||
Dropdown to select the severity.
|
||||
Args:
|
||||
severity (list): List of severity.
|
||||
Returns:
|
||||
html.Div: Dropdown to select the severity.
|
||||
"""
|
||||
return html.Div(
|
||||
[
|
||||
html.Label(
|
||||
"Severity:", className="text-prowler-stone-900 font-bold text-sm"
|
||||
),
|
||||
dcc.Dropdown(
|
||||
id="severity-filter",
|
||||
options=[{"label": i, "value": i} for i in severity],
|
||||
value=["All"],
|
||||
clearable=False,
|
||||
multi=True,
|
||||
style={"color": "#000000"},
|
||||
),
|
||||
],
|
||||
)
|
||||
|
||||
|
||||
def create_service_dropdown(services: list) -> html.Div:
|
||||
"""
|
||||
Dropdown to select the service.
|
||||
Args:
|
||||
services (list): List of services.
|
||||
Returns:
|
||||
html.Div: Dropdown to select the service.
|
||||
"""
|
||||
return html.Div(
|
||||
[
|
||||
html.Label(
|
||||
"Service:", className="text-prowler-stone-900 font-bold text-sm"
|
||||
),
|
||||
dcc.Dropdown(
|
||||
id="service-filter",
|
||||
options=[{"label": i, "value": i} for i in services],
|
||||
value=["All"],
|
||||
clearable=False,
|
||||
multi=True,
|
||||
style={"color": "#000000"},
|
||||
),
|
||||
],
|
||||
)
|
||||
|
||||
|
||||
def create_status_dropdown(status: list) -> html.Div:
|
||||
"""
|
||||
Dropdown to select the status.
|
||||
Args:
|
||||
status (list): List of status.
|
||||
Returns:
|
||||
html.Div: Dropdown to select the status.
|
||||
"""
|
||||
return html.Div(
|
||||
[
|
||||
html.Label("Status:", className="text-prowler-stone-900 font-bold text-sm"),
|
||||
dcc.Dropdown(
|
||||
id="status-filter",
|
||||
options=[{"label": i, "value": i} for i in status],
|
||||
value=["All"],
|
||||
clearable=False,
|
||||
multi=True,
|
||||
style={"color": "#000000"},
|
||||
),
|
||||
],
|
||||
)
|
||||
|
||||
|
||||
def create_table_row_dropdown(table_rows: list) -> html.Div:
|
||||
"""
|
||||
Dropdown to select the number of rows in the table.
|
||||
Args:
|
||||
table_rows (list): List of number of rows.
|
||||
Returns:
|
||||
html.Div: Dropdown to select the number of rows in the table.
|
||||
"""
|
||||
return html.Div(
|
||||
[
|
||||
dcc.Dropdown(
|
||||
id="table-rows",
|
||||
options=[{"label": i, "value": i} for i in table_rows],
|
||||
value=table_rows[0],
|
||||
clearable=False,
|
||||
style={"color": "#000000", "margin-right": "10px"},
|
||||
),
|
||||
],
|
||||
)
|
||||
172
dashboard/lib/layouts.py
Normal file
@@ -0,0 +1,172 @@
|
||||
from dash import dcc, html
|
||||
|
||||
|
||||
def create_layout_overview(
|
||||
account_dropdown: html.Div,
|
||||
date_dropdown: html.Div,
|
||||
region_dropdown: html.Div,
|
||||
download_button: html.Button,
|
||||
severity_dropdown: html.Div,
|
||||
service_dropdown: html.Div,
|
||||
table_row_dropdown: html.Div,
|
||||
status_dropdown: html.Div,
|
||||
) -> html.Div:
|
||||
"""
|
||||
Create the layout of the dashboard.
|
||||
Args:
|
||||
account_dropdown (html.Div): Dropdown to select the account.
|
||||
date_dropdown (html.Div): Dropdown to select the date of the last available scan for each account.
|
||||
region_dropdown (html.Div): Dropdown to select the region of the account.
|
||||
Returns:
|
||||
html.Div: Layout of the dashboard.
|
||||
"""
|
||||
return html.Div(
|
||||
[
|
||||
dcc.Location(id="url", refresh=False),
|
||||
html.Div(
|
||||
[
|
||||
html.H1(
|
||||
"Scan Overview",
|
||||
className="text-prowler-stone-900 text-2xxl font-bold",
|
||||
),
|
||||
html.Div(className="d-flex flex-wrap", id="subscribe_card"),
|
||||
],
|
||||
className="flex justify-between border-b border-prowler-500 pb-3",
|
||||
),
|
||||
html.Div(
|
||||
[
|
||||
                    html.Div([date_dropdown], className=""),
                    html.Div([account_dropdown], className=""),
                    html.Div([region_dropdown], className=""),
                ],
                className="grid gap-x-4 gap-y-4 sm:grid-cols-2 lg:grid-cols-3 lg:gap-y-0",
            ),
            html.Div(
                [
                    html.Div([severity_dropdown], className=""),
                    html.Div([service_dropdown], className=""),
                    html.Div([status_dropdown], className=""),
                ],
                className="grid gap-x-4 gap-y-4 sm:grid-cols-2 lg:grid-cols-3 lg:gap-y-0",
            ),
            html.Div(
                [
                    html.Div(className="flex", id="aws_card", n_clicks=0),
                    html.Div(className="flex", id="azure_card", n_clicks=0),
                    html.Div(className="flex", id="gcp_card", n_clicks=0),
                    html.Div(className="flex", id="k8s_card", n_clicks=0),
                ],
                className="grid gap-x-4 gap-y-4 sm:grid-cols-2 lg:grid-cols-4 lg:gap-y-0",
            ),
            html.H4(
                "Count of Findings by severity",
                className="text-prowler-stone-900 text-lg font-bold",
            ),
            html.Div(
                [
                    html.Div(
                        className="flex flex-col col-span-12 sm:col-span-6 lg:col-span-3 gap-y-4",
                        id="status_graph",
                    ),
                    html.Div(
                        className="flex flex-col col-span-12 sm:col-span-6 lg:col-span-3 gap-y-4",
                        id="two_pie_chart",
                    ),
                    html.Div(
                        className="flex flex-col col-span-12 sm:col-span-6 lg:col-span-6 col-end-13 gap-y-4",
                        id="line_plot",
                    ),
                ],
                className="grid gap-x-4 gap-y-4 grid-cols-12 lg:gap-y-0",
            ),
            html.Div(
                [
                    html.H4(
                        "Top Findings by Severity",
                        className="text-prowler-stone-900 text-lg font-bold",
                    ),
                    html.Div(
                        [
                            (
                                html.Label(
                                    "Table Rows:",
                                    className="text-prowler-stone-900 font-bold text-sm",
                                    style={"margin-right": "10px"},
                                )
                            ),
                            table_row_dropdown,
                            download_button,
                        ],
                        className="flex justify-between items-center",
                    ),
                    dcc.Download(id="download-data"),
                ],
                className="flex justify-between items-center",
            ),
            html.Div(id="table", className="grid"),
        ],
        className="grid gap-x-8 gap-y-8 2xl:container mx-auto",
    )


def create_layout_compliance(
    account_dropdown: html.Div,
    date_dropdown: html.Div,
    region_dropdown: html.Div,
    compliance_dropdown: html.Div,
) -> html.Div:
    return html.Div(
        [
            dcc.Location(id="url", refresh=False),
            html.Div(
                [
                    html.H1(
                        "Compliance",
                        className="text-prowler-stone-900 text-2xxl font-bold",
                    ),
                    html.A(
                        [
                            html.Img(src="assets/favicon.ico", className="w-5 mr-3"),
                            html.Span("Subscribe to prowler SaaS"),
                        ],
                        href="https://prowler.pro/",
                        target="_blank",
                        className="text-prowler-stone-900 inline-flex px-4 py-2 text-xs font-bold uppercase transition-all rounded-lg text-gray-900 hover:bg-prowler-stone-900/10 border-solid border-1 hover:border-prowler-stone-900/10 hover:border-solid hover:border-1 border-prowler-stone-900/10",
                    ),
                ],
                className="flex justify-between border-b border-prowler-500 pb-3",
            ),
            html.Div(
                [
                    html.Div([date_dropdown], className=""),
                    html.Div([account_dropdown], className=""),
                    html.Div([region_dropdown], className=""),
                    html.Div([compliance_dropdown], className=""),
                ],
                className="grid gap-x-4 gap-y-4 sm:grid-cols-2 lg:grid-cols-4 lg:gap-y-0",
            ),
            html.Div(
                [
                    html.Div(
                        className="flex flex-col col-span-12 md:col-span-4 gap-y-4",
                        id="overall_status_result_graph",
                    ),
                    html.Div(
                        className="flex flex-col col-span-12 md:col-span-7 md:col-end-13 gap-y-4",
                        id="security_level_graph",
                    ),
                    html.Div(
                        className="flex flex-col col-span-12 md:col-span-2 gap-y-4",
                        id="",
                    ),
                ],
                className="grid gap-x-4 gap-y-4 grid-cols-12 lg:gap-y-0",
            ),
            html.H4(
                "Details compliance:",
                className="text-prowler-stone-900 text-lg font-bold",
            ),
            html.Div(className="flex flex-wrap", id="output"),
        ],
        className="grid gap-x-8 gap-y-8 2xl:container mx-auto",
    )
592 dashboard/pages/compliance.py Normal file
@@ -0,0 +1,592 @@
# Standard library imports
import csv
import glob
import importlib
import os
import re
import warnings

# Third-party imports
import dash
import pandas as pd
import plotly.express as px
from dash import callback, dcc, html
from dash.dependencies import Input, Output

# Config import
from dashboard.config import (
    fail_color,
    folder_path_compliance,
    info_color,
    manual_color,
    pass_color,
)
from dashboard.lib.dropdowns import (
    create_account_dropdown_compliance,
    create_compliance_dropdown,
    create_date_dropdown_compliance,
    create_region_dropdown_compliance,
)
from dashboard.lib.layouts import create_layout_compliance

# Suppress warnings
warnings.filterwarnings("ignore")

# Global variables
# TODO: Create a flag to let the user put a custom path

# Keep only the CSV files that contain at least one data row besides the header
csv_files = []
for file in glob.glob(os.path.join(folder_path_compliance, "*.csv")):
    with open(file, "r", newline="") as csvfile:
        reader = csv.reader(csvfile)
        num_rows = sum(1 for row in reader)
        if num_rows > 1:
            csv_files.append(file)


def load_csv_files(csv_files):
    # Load CSV files into a single pandas DataFrame.
    dfs = []
    results = []
    for file in csv_files:
        df = pd.read_csv(file, sep=";", on_bad_lines="skip")
        if "CHECKID" in df.columns:
            dfs.append(df)
            # Derive the compliance name from the file name
            result = file
            result = result.split("/")[-1]
            result = re.sub(r"^.*?_", "", result)
            result = result.replace(".csv", "")
            result = result.upper()
            if "AWS" in result:
                if "AWS_" in result:
                    result = result.replace("_AWS", "")
                else:
                    result = result.replace("_AWS", " - AWS")
            if "GCP" in result:
                result = result.replace("_GCP", " - GCP")
            if "AZURE" in result:
                result = result.replace("_AZURE", " - AZURE")
            if "KUBERNETES" in result:
                result = result.replace("_KUBERNETES", " - KUBERNETES")
            result = result[result.find("CIS_") :]
            results.append(result)

    unique_results = set(results)
    results = list(unique_results)
    # Check if there is any CIS report in the list and divide it in level 1 and level 2
    new_results = []
    old_results = results.copy()
    for compliance_name in results:
        if "CIS_" in compliance_name:
            old_results.remove(compliance_name)
            new_results.append(compliance_name + " - Level_1")
            new_results.append(compliance_name + " - Level_2")

    results = old_results + new_results
    results.sort()
    # Handle the case where there are no CSV files
    try:
        data = pd.concat(dfs, ignore_index=True)
    except ValueError:
        data = None
    return data, results


data, results = load_csv_files(csv_files)

if data is None:
    dash.register_page(__name__)
    layout = html.Div(
        [
            html.Div(
                [
                    html.H5(
                        "No data found, check if the CSV files are in the correct folder.",
                        className="card-title",
                        style={"text-align": "left"},
                    )
                ],
                style={
                    "width": "99%",
                    "margin-right": "0.8%",
                    "margin-bottom": "10px",
                },
            )
        ]
    )
else:
    data["ASSESSMENTDATE"] = pd.to_datetime(data["ASSESSMENTDATE"])
    data["ASSESSMENT_TIME"] = data["ASSESSMENTDATE"].dt.strftime("%Y-%m-%d %H:%M:%S")

    # Keep only the most recent assessment per day
    data_values = data["ASSESSMENT_TIME"].unique()
    data_values.sort()
    data_values = data_values[::-1]
    aux = []
    for value in data_values:
        if value.split(" ")[0] not in [aux[i].split(" ")[0] for i in range(len(aux))]:
            aux.append(value)
    data_values = aux

    data = data[data["ASSESSMENT_TIME"].isin(data_values)]
    data["ASSESSMENT_TIME"] = data["ASSESSMENT_TIME"].apply(lambda x: x.split(" ")[0])

    # Select Compliance - Dropdown
    compliance_dropdown = create_compliance_dropdown(results)

    # Select Account - Dropdown
    select_account_dropdown_list = ["All"]
    # Append the unique values of the columns ACCOUNTID, PROJECTID and SUBSCRIPTIONID if they exist
    if "ACCOUNTID" in data.columns:
        select_account_dropdown_list = select_account_dropdown_list + list(
            data["ACCOUNTID"].unique()
        )
    if "PROJECTID" in data.columns:
        select_account_dropdown_list = select_account_dropdown_list + list(
            data["PROJECTID"].unique()
        )
    if "SUBSCRIPTIONID" in data.columns:
        select_account_dropdown_list = select_account_dropdown_list + list(
            data["SUBSCRIPTIONID"].unique()
        )
    if "SUBSCRIPTION" in data.columns:
        select_account_dropdown_list = select_account_dropdown_list + list(
            data["SUBSCRIPTION"].unique()
        )

    # Clear the list from None and NaN values
    list_items = []
    for item in select_account_dropdown_list:
        if item.__class__.__name__ == "str" and "nan" not in item:
            list_items.append(item)

    account_dropdown = create_account_dropdown_compliance(list_items)

    # Select Region - Dropdown
    select_region_dropdown_list = ["All"]
    # Append the unique values of the column REGION or LOCATION if it exists
    if "REGION" in data.columns:
        # Handle the case where the column REGION is empty
        data["REGION"] = data["REGION"].fillna("-")
        select_region_dropdown_list = select_region_dropdown_list + list(
            data["REGION"].unique()
        )
    if "LOCATION" in data.columns:
        # Handle the case where the column LOCATION is empty
        data["LOCATION"] = data["LOCATION"].fillna("-")
        select_region_dropdown_list = select_region_dropdown_list + list(
            data["LOCATION"].unique()
        )

    # Clear the list from None and NaN values
    list_items = []
    for item in select_region_dropdown_list:
        if item.__class__.__name__ == "str":
            list_items.append(item)

    region_dropdown = create_region_dropdown_compliance(list_items)

    # Select Date - Dropdown
    date_dropdown = create_date_dropdown_compliance(
        list(data["ASSESSMENT_TIME"].unique())
    )

    dash.register_page(__name__)

    layout = create_layout_compliance(
        account_dropdown, date_dropdown, region_dropdown, compliance_dropdown
    )


@callback(
    [
        Output("output", "children"),
        Output("overall_status_result_graph", "children"),
        Output("security_level_graph", "children"),
        Output("cloud-account-filter-compliance", "value"),
        Output("cloud-account-filter-compliance", "options"),
        Output("region-filter-compliance", "value"),
        Output("region-filter-compliance", "options"),
        Output("date-filter-analytics", "value"),
        Output("date-filter-analytics", "options"),
    ],
    Input("report-compliance-filter", "value"),
    Input("cloud-account-filter-compliance", "value"),
    Input("region-filter-compliance", "value"),
    Input("date-filter-analytics", "value"),
)
def display_data(
    analytics_input, account_filter, region_filter_analytics, date_filter_analytics
):
    current_compliance = analytics_input
    analytics_input = analytics_input.replace(" - ", "_")
    analytics_input = analytics_input.lower()

    # Check if the compliance selected is the level 1 or level 2 of the CIS
    is_level_1 = "level_1" in analytics_input
    analytics_input = analytics_input.replace("_level_1", "").replace("_level_2", "")

    # Filter the data based on the compliance selected
    files = [file for file in csv_files if analytics_input in file]

    def load_csv_files(files):
        """Load CSV files into a single pandas DataFrame."""
        dfs = []
        for file in files:
            df = pd.read_csv(file, sep=";", on_bad_lines="skip")
            dfs.append(df.astype(str))
        return pd.concat(dfs, ignore_index=True)

    data = load_csv_files(files)

    # Rename the column LOCATION to REGION for GCP or Azure
    if "gcp" in analytics_input or "azure" in analytics_input:
        data = data.rename(columns={"LOCATION": "REGION"})

    # Add the column ACCOUNTID to the data if the provider is kubernetes
    if "kubernetes" in analytics_input:
        data.rename(columns={"CONTEXT": "ACCOUNTID"}, inplace=True)
        data.rename(columns={"NAMESPACE": "REGION"}, inplace=True)

    if "REQUIREMENTS_ATTRIBUTES_PROFILE" in data.columns:
        data["REQUIREMENTS_ATTRIBUTES_PROFILE"] = data[
            "REQUIREMENTS_ATTRIBUTES_PROFILE"
        ].apply(lambda x: x.split(" - ")[0])
        # Filter the chosen level of the CIS
        if is_level_1:
            data = data[data["REQUIREMENTS_ATTRIBUTES_PROFILE"] == "Level 1"]

    # Rename the column PROJECTID to ACCOUNTID for GCP
    if data.columns.str.contains("PROJECTID").any():
        data.rename(columns={"PROJECTID": "ACCOUNTID"}, inplace=True)

    # Rename the column SUBSCRIPTIONID to ACCOUNTID for Azure
    if data.columns.str.contains("SUBSCRIPTIONID").any():
        data.rename(columns={"SUBSCRIPTIONID": "ACCOUNTID"}, inplace=True)
    # Handle v3 azure cis compliance
    if data.columns.str.contains("SUBSCRIPTION").any():
        data.rename(columns={"SUBSCRIPTION": "ACCOUNTID"}, inplace=True)
        data["REGION"] = "-"

    # Filter ACCOUNT
    if account_filter == ["All"]:
        updated_cloud_account_values = data["ACCOUNTID"].unique()
    elif "All" in account_filter and len(account_filter) > 1:
        # Remove 'All' from the list
        account_filter.remove("All")
        updated_cloud_account_values = account_filter
    elif len(account_filter) == 0:
        updated_cloud_account_values = data["ACCOUNTID"].unique()
        account_filter = ["All"]
    else:
        updated_cloud_account_values = account_filter

    data = data[data["ACCOUNTID"].isin(updated_cloud_account_values)]

    account_filter_options = list(data["ACCOUNTID"].unique())
    account_filter_options = account_filter_options + ["All"]
    # Drop non-string and "nan" options; iterate over a copy so removing
    # items does not skip elements, and check the type before "in"
    for item in list(account_filter_options):
        if item.__class__.__name__ != "str" or "nan" in item:
            account_filter_options.remove(item)

    # Filter REGION
    if region_filter_analytics == ["All"]:
        updated_region_account_values = data["REGION"].unique()
    elif "All" in region_filter_analytics and len(region_filter_analytics) > 1:
        # Remove 'All' from the list
        region_filter_analytics.remove("All")
        updated_region_account_values = region_filter_analytics
    elif len(region_filter_analytics) == 0:
        updated_region_account_values = data["REGION"].unique()
        region_filter_analytics = ["All"]
    else:
        updated_region_account_values = region_filter_analytics

    data = data[data["REGION"].isin(updated_region_account_values)]

    region_filter_options = list(data["REGION"].unique())
    region_filter_options = region_filter_options + ["All"]
    # Iterate over a copy so removing items does not skip elements
    for item in list(region_filter_options):
        if item == "nan" or item.__class__.__name__ != "str":
            region_filter_options.remove(item)

    data["ASSESSMENTDATE"] = pd.to_datetime(data["ASSESSMENTDATE"], errors="coerce")
    data["ASSESSMENTDATE"] = data["ASSESSMENTDATE"].dt.strftime("%Y-%m-%d %H:%M:%S")

    # Choosing the date that is the most recent
    data_values = data["ASSESSMENTDATE"].unique()
    data_values.sort()
    data_values = data_values[::-1]
    aux = []

    data_values = [str(i) for i in data_values]
    for value in data_values:
        if value.split(" ")[0] not in [aux[i].split(" ")[0] for i in range(len(aux))]:
            aux.append(value)
    data_values = [str(i) for i in aux]

    data = data[data["ASSESSMENTDATE"].isin(data_values)]
    data["ASSESSMENTDATE"] = data["ASSESSMENTDATE"].apply(lambda x: x.split(" ")[0])

    options_date = data["ASSESSMENTDATE"].unique()
    options_date.sort()
    options_date = options_date[::-1]

    # Filter DATE
    if date_filter_analytics in options_date:
        data = data[data["ASSESSMENTDATE"] == date_filter_analytics]
    else:
        date_filter_analytics = options_date[0]
        data = data[data["ASSESSMENTDATE"] == date_filter_analytics]

    if data.empty:
        fig = px.pie()
        pie_1 = dcc.Graph(
            figure=fig,
            config={"displayModeBar": False},
            style={"height": "250px", "width": "250px", "right": "0px"},
        )

        return [
            html.Div(
                [
                    html.H5(
                        "No data found for this compliance",
                        className="card-title",
                        style={"text-align": "left"},
                    )
                ],
                style={
                    "width": "99%",
                    "margin-right": "0.8%",
                    "margin-bottom": "10px",
                },
            )
        ]
    else:
        # Check cases where the compliance starts with AWS_
        if "aws_" in analytics_input:
            analytics_input = analytics_input + "_aws"
        try:
            current = analytics_input.replace(".", "_")
            compliance_module = importlib.import_module(
                f"dashboard.compliance.{current}"
            )
            data.drop_duplicates(keep="first", inplace=True)
            table = compliance_module.get_table(data)
        except ModuleNotFoundError:
            table = html.Div(
                [
                    html.H5(
                        "No data found for this compliance",
                        className="card-title",
                        style={"text-align": "left", "color": "black"},
                    )
                ],
                style={
                    "width": "99%",
                    "margin-right": "0.8%",
                    "margin-bottom": "10px",
                },
            )

        df = data.copy()
        df = df.groupby(["STATUS"]).size().reset_index(name="counts")
        df = df.sort_values(by=["counts"], ascending=False)

        # Pie 1
        pie_1 = get_pie(df)

        # Get the pie2 depending on the compliance
        df = data.copy()

        current_filter = ""

        if "pci" in analytics_input:
            pie_2 = get_bar_graph(df, "REQUIREMENTS_ID")
            current_filter = "req_id"
        elif (
            "REQUIREMENTS_ATTRIBUTES_SECTION" in df.columns
            and not df["REQUIREMENTS_ATTRIBUTES_SECTION"].isnull().values.any()
        ):
            pie_2 = get_bar_graph(df, "REQUIREMENTS_ATTRIBUTES_SECTION")
            current_filter = "sections"
        elif (
            "REQUIREMENTS_ATTRIBUTES_CATEGORIA" in df.columns
            and not df["REQUIREMENTS_ATTRIBUTES_CATEGORIA"].isnull().values.any()
        ):
            pie_2 = get_bar_graph(df, "REQUIREMENTS_ATTRIBUTES_CATEGORIA")
            current_filter = "categorias"
        elif (
            "REQUIREMENTS_ATTRIBUTES_CATEGORY" in df.columns
            and not df["REQUIREMENTS_ATTRIBUTES_CATEGORY"].isnull().values.any()
        ):
            pie_2 = get_bar_graph(df, "REQUIREMENTS_ATTRIBUTES_CATEGORY")
            current_filter = "categories"
        elif (
            "REQUIREMENTS_ATTRIBUTES_SERVICE" in df.columns
            and not df["REQUIREMENTS_ATTRIBUTES_SERVICE"].isnull().values.any()
        ):
            pie_2 = get_bar_graph(df, "REQUIREMENTS_ATTRIBUTES_SERVICE")
            current_filter = "services"
        else:
            fig = px.pie()
            fig.update_layout(
                margin=dict(l=0, r=0, t=0, b=0),
                autosize=True,
                showlegend=False,
                paper_bgcolor="#303030",
            )
            pie_2 = dcc.Graph(
                figure=fig,
                config={"displayModeBar": False},
                style={"height": "250px", "width": "250px", "right": "0px"},
            )
            current_filter = "none"

        # Analytics table
        if not analytics_input:
            analytics_input = ""

        table_output = get_table(current_compliance, table)

        overall_status_result_graph = get_graph(pie_1, "Overall Status Result")

        security_level_graph = get_graph(
            pie_2, f"Top 5 failed {current_filter} by findings"
        )

        return (
            table_output,
            overall_status_result_graph,
            security_level_graph,
            account_filter,
            account_filter_options,
            region_filter_analytics,
            region_filter_options,
            date_filter_analytics,
            options_date,
        )


def get_graph(pie, title):
    return [
        html.Span(
            title,
            className="text-center text-prowler-stone-900 uppercase text-xs font-bold",
        ),
        html.Div(
            [pie],
            className="",
            style={
                "display": "flex",
                "justify-content": "center",
                "align-items": "center",
                "margin-top": "7%",
            },
        ),
    ]


def get_bar_graph(df, column_name):
    df = df[df["STATUS"] == "FAIL"]
    df = df.groupby([column_name, "STATUS"]).size().reset_index(name="counts")
    df = df.sort_values(by=["counts"], ascending=True)
    # take the top 5
    df = df.tail(5)

    colums = df[column_name].unique()

    # Cut the text if it is too long
    for i in range(len(colums)):
        if len(colums[i]) > 15:
            colums[i] = colums[i][:15] + "..."

    fig = px.bar(
        df,
        x="counts",
        y=colums,
        color="STATUS",
        color_discrete_map={"FAIL": fail_color},
        orientation="h",
    )

    fig.update_layout(
        margin=dict(l=0, r=0, t=0, b=0),
        autosize=True,
        showlegend=False,
        xaxis_title=None,
        yaxis_title=None,
        font=dict(size=14, color="#292524"),
        hoverlabel=dict(font_size=12),
        paper_bgcolor="#FFF",
    )

    return dcc.Graph(
        figure=fig,
        config={"displayModeBar": False},
        style={"height": "20rem", "width": "40rem"},
    )


def get_pie(df):
    # Define custom colors
    color_mapping = {
        "FAIL": fail_color,
        "PASS": pass_color,
        "INFO": info_color,
        "WARN": "#260000",
        "MANUAL": manual_color,
    }

    # Use the color_discrete_map parameter to map categories to custom colors
    fig = px.pie(
        df,
        names="STATUS",
        values="counts",
        hole=0.7,
        color="STATUS",
        color_discrete_map=color_mapping,
    )
    fig.update_traces(
        hovertemplate=None,
        textposition="outside",
        textinfo="percent+label",
        rotation=50,
    )

    fig.update_layout(
        margin=dict(l=0, r=0, t=0, b=0),
        autosize=True,
        showlegend=False,
        font=dict(size=14, color="#292524"),
        hoverlabel=dict(font_size=12),
        paper_bgcolor="#FFF",
    )

    pie = dcc.Graph(
        figure=fig,
        config={"displayModeBar": False},
        style={"height": "20rem", "width": "20rem"},
    )

    return pie


def get_table(current_compliance, table):
    return [
        html.Div(
            [
                html.H5(
                    f"{current_compliance}",
                    className="text-prowler-stone-900 text-md font-bold uppercase mb-4",
                ),
                table,
            ],
            className="relative flex flex-col bg-white shadow-provider rounded-xl px-4 py-3 flex-wrap w-full",
        ),
    ]
1074 dashboard/pages/overview.py Normal file

179 dashboard/src/input.css Normal file
@@ -0,0 +1,179 @@
/* Use this file to add custom styles using Tailwind's utility classes. */

@tailwind base;
@tailwind components;
@tailwind utilities;

#_dash-app-content {
  @apply bg-prowler-stone-500;
}

@layer components {
  .custom-grid {
    grid-template-columns: minmax(0, 16fr) repeat(11, minmax(0, 11fr));
  }

  .custom-grid-large {
    grid-template-columns: minmax(0, 10fr) repeat(11, minmax(0, 11fr));
  }

  /* Styles for the table in the overview page */
  .table-overview thead {
    display: table;
    width: 100%;
    table-layout: fixed;
  }

  .table-overview tbody {
    -ms-overflow-style: none; /* IE and Edge */
    scrollbar-width: none; /* Firefox */
  }

  .table-overview tbody tr {
    display: table;
    width: 100%;
    table-layout: fixed;
  }
  /* Styles for thead */
  .table-overview th {
    @apply bg-prowler-stone-900 text-sm py-3 font-bold;
  }

  .table-overview td {
    @apply text-prowler-stone-900 bg-prowler-white text-sm py-2 font-bold;
  }

  /* Check ID */
  .table-overview td:nth-child(1),
  .table-overview th:nth-child(1) {
    @apply w-[52%];
  }
  /* Severity */
  .table-overview td:nth-child(2),
  .table-overview th:nth-child(2) {
    @apply w-[8%] capitalize;
  }
  /* Status */
  .table-overview td:nth-child(3),
  .table-overview th:nth-child(3) {
    @apply w-[7%];
  }
  .table-overview td:nth-child(3) {
    @apply font-bold text-prowler-error;
  }
  /* Region */
  .table-overview td:nth-child(4),
  .table-overview th:nth-child(4) {
    @apply w-[9%];
  }
  /* Service */
  .table-overview td:nth-child(5),
  .table-overview th:nth-child(5) {
    @apply w-[6%];
  }
  /* Provider */
  .table-overview td:nth-child(6),
  .table-overview th:nth-child(6) {
    @apply w-[7%];
  }
  /* Account ID */
  .table-overview td:nth-child(7),
  .table-overview th:nth-child(7) {
    @apply w-[11%];
  }
}

/* Styles for the accordion in the compliance page */
#_dash-app-content .accordion .accordion-header .accordion-button {
  @apply text-prowler-stone-900 inline-block px-4 text-xs font-bold uppercase transition-all rounded-lg bg-prowler-stone-300 hover:bg-prowler-stone-900/10;
}

#_dash-app-content .accordion .accordion-item {
  @apply text-prowler-stone-900 bg-prowler-white rounded-lg;
}

#_dash-app-content .accordion .accordion-button:not(.collapsed) {
  @apply text-prowler-stone-900 bg-prowler-stone-500;
}

#_dash-app-content .accordion .dash-table-container {
  @apply grid;
}

#_dash-app-content .accordion table {
  @apply rounded-lg;
}
/* Styles for thead */
#_dash-app-content .accordion th {
  @apply text-prowler-white text-left bg-prowler-stone-900 text-xs py-1 font-bold;
}

/* Styles for td */
#_dash-app-content .accordion td {
  @apply text-prowler-stone-900 text-left bg-prowler-white text-xs py-1 font-light;
}

/* Styles for table cells */
#_dash-app-content .accordion table tbody thead,
#_dash-app-content .accordion table tbody tr {
  @apply w-full;
}

/* Check ID */
#_dash-app-content .accordion table th:nth-child(1) {
  @apply w-[60%];
}
/* Status */
#_dash-app-content .accordion table th:nth-child(2) {
  @apply w-[10%] text-center;
}
#_dash-app-content .accordion table td:nth-child(2) {
  @apply text-center;
}
/* Region */
#_dash-app-content .accordion table th:nth-child(3) {
  @apply w-[10%];
}
/* Account ID */
#_dash-app-content .accordion table th:nth-child(4) {
  @apply w-[10%];
}
/* Resource ID */
#_dash-app-content .accordion table th:nth-child(5) {
  @apply w-[10%];
}

#_dash-app-content .compliance-data-layout,
#_dash-app-content .accordion-body,
#_dash-app-content .compliance-data-layout .accordion.accordion-flush {
  @apply grid gap-y-4;
}

#_dash-app-content .accordion-inner--child,
#_dash-app-content .accordion-inner {
  @apply relative;
}

#_dash-app-content .info-bar {
  @apply absolute left-1/2 transform -translate-x-1/2 top-2 h-8 z-50;
}

#_dash-app-content .info-bar-child {
  @apply absolute right-6 top-2 w-auto h-8 z-50;
}

@layer utilities {
  /* Hide scrollbar for Chrome, Safari and Opera */
  .no-scrollbar::-webkit-scrollbar {
    display: none;
  }
  /* Hide scrollbar for IE, Edge and Firefox */
  .no-scrollbar {
    -ms-overflow-style: none; /* IE and Edge */
    scrollbar-width: none; /* Firefox */
  }
}
90 dashboard/tailwind.config.js Normal file
@@ -0,0 +1,90 @@
/** @type {import('tailwindcss').Config} */
module.exports = {
  content: [
    "./assets/**/*.{py,html,js}",
    "./components/**/*.{py,html,js}",
    "./pages/**/*.{py,html,js}",
    "./utils/**/*.{py,html,js}",
    "./app.py",
  ],
  theme: {
    extend: {
      colors: {
        prowler: {
          stone: {
            950: "#1C1917",
            900: "#292524",
            500: "#E7E5E4",
            300: "#F5F5F4",
          },
          gray: {
            900: "#9bAACF",
            700: "#BEC8E4",
            500: "#C8D0E7",
            300: "#E4EBF5",
          },
          status: {
            passed: "#1FB53F",
            failed: "#A3231F",
          },
          lime: "#84CC16",
          white: "#FFFFFF",
          error: "#B91C1C",
        },
      },
      fontSize: {
        '3xs': '0.625rem', // 10px
        '2xs': '0.6875rem', // 11px
        xs: '0.75rem', // 12px
        sm: '0.875rem', // 14px
        base: '1rem', // 16px
        lg: '1.125rem', // 18px
        xl: '1.25rem', // 20px
        '2xl': '1.375rem', // 22px
        '2xxl': '1.5rem', // 24px
        '3xl': '1.75rem', // 28px
        '4xl': '2rem', // 32px
        '5xl': '2.25rem', // 36px
        '6xl': '2.75rem', // 44px
        '7xl': '3.5rem' // 56px
      },
      fontWeight: {
        light: 300,
        regular: 400,
        medium: 500,
        bold: 700,
        heavy: 800
      },
      lineHeight: {
        14: "0.875rem", // 14px
        22: "1.375rem", // 22px
        26: "1.625rem", // 26px
        28: "1.75rem", // 28px
        30: "1.875rem", // 30px
        32: "2rem", // 32px
        34: "2.125rem", // 34px
        36: "2.25rem", // 36px
        40: "2.5rem", // 40px
        44: "2.75rem", // 44px
        48: "3rem", // 48px
        56: "3.5rem", // 56px
        68: "4.25rem", // 68px
      },
      boxShadow: {
        "provider": ".3rem .3rem .6rem #c8d0e7, -.2rem -.2rem .5rem #FFF",
        "box-up": "0.3rem 0.3rem 0.6rem #c8d0e7, -0.2rem -0.2rem 0.5rem #FFF",
        "box-down": "inset .2rem .2rem .5rem #c8d0e7, inset -.2rem -.2rem .5rem #FFF",
      },
      backgroundImage: {
        "gradient-passed": "linear-gradient(127.43deg, #F1F5F8 -177.68%, #4ADE80 87.35%)",
        "gradient-failed": "linear-gradient(127.43deg, #F1F5F8 -177.68%, #EF4444 87.35%)",
      },
    },
  },
  plugins: [],
};
9 docs/developer-guide/audit-info.md Normal file
@@ -0,0 +1,9 @@
# Audit Info

In each Prowler provider there is a Python object called `audit_info` that is in charge of keeping the credentials, the configuration and the state of each audit, and it is passed to each service during its `__init__`.

- AWS: https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/aws/lib/audit_info/models.py#L34-L54
- GCP: https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/aws/lib/audit_info/models.py#L7-L30
- Azure: https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/azure/lib/audit_info/models.py#L17-L31

This `audit_info` object is shared during the Prowler execution, so it is important to mock it in each test to keep tests isolated. See the [testing guide](./unit-testing.md) for more information.
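The shared-state point above can be illustrated with a minimal, self-contained sketch. Note that `FakeAuditInfo` and `build_root_arn` below are hypothetical stand-ins for illustration only, not Prowler's real model or service code:

```python
from dataclasses import dataclass


# Hypothetical stand-in for a provider audit_info model; the real models
# live under prowler/providers/<provider>/lib/audit_info/models.py.
@dataclass
class FakeAuditInfo:
    audited_account: str
    audited_partition: str = "aws"


def build_root_arn(audit_info: FakeAuditInfo) -> str:
    # Stand-in for a service that reads the shared audit_info during __init__.
    return f"arn:{audit_info.audited_partition}:iam::{audit_info.audited_account}:root"


# In a unit test you would build a fresh audit_info per test so that no
# state leaks between tests.
test_audit_info = FakeAuditInfo(audited_account="123456789012")
print(build_root_arn(test_audit_info))  # → arn:aws:iam::123456789012:root
```

Because every service reads the same object, constructing a fresh instance inside each test (rather than reusing a module-level one) is what keeps the tests isolated.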
319 docs/developer-guide/checks.md Normal file
@@ -0,0 +1,319 @@
# Create a new Check for a Provider

Here you can find how to create new checks for Prowler.

**To create a check, a Prowler provider service must already exist, so if the service is not present, or the attribute you want to audit is not retrieved by the service, please refer to the [Service](./services.md) documentation.**

## Introduction
To create a new check for a supported Prowler provider, you will need to create a folder with the check name inside the specific service for the selected provider.

We are going to use the `ec2_ami_public` check from the `AWS` provider as an example, so the folder name will be `prowler/providers/aws/services/ec2/ec2_ami_public` (following the format `prowler/providers/<provider>/services/<service>/<check_name>`), with the check name following the pattern `service_subservice_resource_action`.

Inside that folder, we need to create three files:

- An empty `__init__.py`: to make Python treat this check folder as a package.
- A `check_name.py` with the above format containing the check's logic. Refer to the [check](./checks.md#check) section.
- A `check_name.metadata.json` containing the check's metadata. Refer to the [check metadata](./checks.md#check-metadata) section.
## Check
|
||||
|
||||
The Prowler's check structure is very simple and following it there is nothing more to do to include a check in a provider's service because the load is done dynamically based on the paths.
|
||||
|
||||

The following is the code for the `ec2_ami_public` check:

```python title="Check Class"
# At the top of the file we need to import the following:
# - The Check class, which is in charge of:
#   - Retrieving the check metadata and exposing the `metadata()`
#     function to return a JSON representation of the metadata;
#     read more at the Check Metadata section down below.
#   - Enforcing that each check implements the `execute()` function.
from prowler.lib.check.models import Check, Check_Report_AWS

# Then you have to import the provider service client;
# read more at the Service documentation.
from prowler.providers.aws.services.ec2.ec2_client import ec2_client


# For each check we need to create a Python class with the same name as the
# file, inheriting from the Check class.
class ec2_ami_public(Check):
    """ec2_ami_public verifies if an EC2 AMI is publicly shared"""

    # Then, within the check's class we need to create the "execute(self)"
    # function, which is enforced by the "Check" class to implement
    # the check's interface and let Prowler run this check.
    def execute(self):

        # Inside the execute(self) function we need to create
        # the list of findings, initialised to an empty list [].
        findings = []

        # Then, using the service client, we need to iterate over the resources
        # we want to check, in this case the EC2 AMIs stored in the
        # "ec2_client.images" object.
        for image in ec2_client.images:

            # While iterating over the images, we have to initialise
            # the Check_Report_AWS class, passing the check's metadata
            # using the "metadata" function explained above.
            report = Check_Report_AWS(self.metadata())

            # For each Prowler check we MUST fill the following
            # Check_Report_AWS fields:
            # - region
            # - resource_id
            # - resource_arn
            # - resource_tags
            # - status
            # - status_extended
            report.region = image.region
            report.resource_id = image.id
            report.resource_arn = image.arn
            # The resource_tags should be filled if the resource can
            # have tags; please check the service first.
            report.resource_tags = image.tags

            # Then we need to create the business logic for the check,
            # which should always be simple because the Prowler service
            # must do the heavy lifting and the check should be in charge
            # of parsing the data provided.
            report.status = "PASS"
            report.status_extended = f"EC2 AMI {image.id} is not public."

            # In this example each "image" object has a boolean attribute
            # called "public" set if the AMI is publicly shared.
            if image.public:
                report.status = "FAIL"
                report.status_extended = (
                    f"EC2 AMI {image.id} is currently public."
                )

            # Then, at the same level as the "report"
            # object, we need to append it to the findings list.
            findings.append(report)

        # The last thing to do is to return the findings list to Prowler.
        return findings
```

### Check Status

All the checks MUST fill the `report.status` and `report.status_extended` with the following criteria:

- Status -- `report.status`
    - `PASS` --> If the check is passing against the configured value.
    - `FAIL` --> If the check is failing against the configured value.
    - `MANUAL` --> This value cannot be used unless a manual operation is required in order to determine whether the `report.status` is `PASS` or `FAIL`.
- Status Extended -- `report.status_extended`
    - MUST end in a dot `.`
    - MUST include the service audited with the resource and a brief explanation of the result generated, e.g.: `EC2 AMI ami-0123456789 is not public.`
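For illustration only, the criteria above could be expressed as a small helper; `validate_finding` is a hypothetical name, not part of Prowler:

```python
VALID_STATUSES = {"PASS", "FAIL", "MANUAL"}


def validate_finding(status: str, status_extended: str) -> None:
    # status must be one of the three allowed values
    assert status in VALID_STATUSES, f"invalid status: {status}"
    # status_extended must end in a dot and briefly explain the result
    assert status_extended.endswith("."), "status_extended must end in a dot"


validate_finding("PASS", "EC2 AMI ami-0123456789 is not public.")
```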

### Check Region

All the checks MUST fill the `report.region` with the following criteria:

- If the audited resource is regional, use the `region` attribute within the resource object.
- If the audited resource is global, use the `service_client.region` within the service client object.

### Resource ID, Name and ARN

All the checks MUST fill the `report.resource_id` and `report.resource_arn` with the following criteria:

- AWS
    - Resource ID -- `report.resource_id`
        - AWS Account --> Account Number `123456789012`
        - AWS Resource --> Resource ID / Name
        - Root resource --> `<root_account>`
    - Resource ARN -- `report.resource_arn`
        - AWS Account --> Root ARN `arn:aws:iam::123456789012:root`
        - AWS Resource --> Resource ARN
        - Root resource --> Resource Type ARN `f"arn:{service_client.audited_partition}:<service_name>:{service_client.region}:{service_client.audited_account}:<resource_type>"`
- GCP
    - Resource ID -- `report.resource_id`
        - GCP Resource --> Resource ID
    - Resource Name -- `report.resource_name`
        - GCP Resource --> Resource Name
- Azure
    - Resource ID -- `report.resource_id`
        - Azure Resource --> Resource ID
    - Resource Name -- `report.resource_name`
        - Azure Resource --> Resource Name
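For example, the root-resource ARN template above expands as follows (the partition, region and account values are illustrative):

```python
# Illustrative values only; in a real check these come from the service client.
audited_partition = "aws"
region = "eu-west-1"
audited_account = "123456789012"

resource_arn = f"arn:{audited_partition}:ec2:{region}:{audited_account}:ami"
assert resource_arn == "arn:aws:ec2:eu-west-1:123456789012:ami"
```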

### Python Model

The following is the Python model for the check's class.

As per August 5th 2023 the `Check` class can be found [here](https://github.com/prowler-cloud/prowler/blob/master/prowler/lib/check/models.py#L59-L80).

```python
class Check(ABC, Check_Metadata_Model):
    """Prowler Check"""

    def __init__(self, **data):
        """Check's init function. Calls the Check_Metadata_Model init."""
        # Parse the check's metadata file
        metadata_file = (
            os.path.abspath(sys.modules[self.__module__].__file__)[:-3]
            + ".metadata.json"
        )
        # Store it to validate it with Pydantic
        data = Check_Metadata_Model.parse_file(metadata_file).dict()
        # Call the parent's init function
        super().__init__(**data)

    def metadata(self) -> dict:
        """Return the JSON representation of the check's metadata"""
        return self.json()

    @abstractmethod
    def execute(self):
        """Execute the check's logic"""
```
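The metadata file lookup in `Check.__init__` relies on a simple convention: strip the trailing `.py` from the check module's path and append `.metadata.json`. A standalone sketch of that convention (`metadata_path` is a hypothetical helper, not Prowler code):

```python
import os


def metadata_path(check_file: str) -> str:
    # Drop the ".py" suffix (the "[:-3]" slice above) and append the
    # metadata extension, mirroring the Check.__init__ logic.
    return os.path.abspath(check_file)[:-3] + ".metadata.json"


path = metadata_path("prowler/providers/aws/services/ec2/ec2_ami_public/ec2_ami_public.py")
assert path.endswith("ec2_ami_public/ec2_ami_public.metadata.json")
```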

### Using the audit config

Prowler has a [configuration file](../tutorials/configuration_file.md) which is used to pass certain configuration values to the checks, like the following:

```python title="ec2_securitygroup_with_many_ingress_egress_rules.py"
class ec2_securitygroup_with_many_ingress_egress_rules(Check):
    def execute(self):
        findings = []

        # max_security_group_rules, default: 50
        max_security_group_rules = ec2_client.audit_config.get(
            "max_security_group_rules", 50
        )
        for security_group in ec2_client.security_groups:
```

```yaml title="config.yaml"
# AWS Configuration
aws:
  # AWS EC2 Configuration

  # aws.ec2_securitygroup_with_many_ingress_egress_rules
  # The default value is 50 rules
  max_security_group_rules: 50
```

As you can see in the above code, within the service client, in this case the `ec2_client`, there is an object called `audit_config`, which is a Python dictionary containing the values read from the configuration file.

In order to use it, you have to check first if the value is present in the configuration file. If the value is not present, you can create it in the `config.yaml` file and then read it from the check.

???+ note
    It is mandatory to always use the `dictionary.get(value, default)` syntax to set a default value in case the configuration value is not present.
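The `dict.get(key, default)` pattern the note mandates behaves like this (the keys shown are illustrative):

```python
audit_config = {"max_security_group_rules": 60}

# Present in the config file: the configured value wins.
assert audit_config.get("max_security_group_rules", 50) == 60

# Missing from the config file: the default keeps the check working.
assert audit_config.get("some_other_threshold", 96) == 96
```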

## Check Metadata

Each Prowler check has associated metadata, stored at the same level as the check's folder in a file called `check_name.metadata.json`.

???+ note
    We include comments in this example metadata for explanation purposes only; they cannot be present in a real metadata file because the JSON format does not allow comments.

```json
{
  # Provider holds the Prowler provider which the check belongs to
  "Provider": "aws",
  # CheckID holds the check name
  "CheckID": "ec2_ami_public",
  # CheckTitle holds the title of the check
  "CheckTitle": "Ensure there are no EC2 AMIs set as Public.",
  # CheckType holds Software and Configuration Checks, check more here
  # https://docs.aws.amazon.com/securityhub/latest/userguide/asff-required-attributes.html#Types
  "CheckType": [
    "Infrastructure Security"
  ],
  # ServiceName holds the provider service name
  "ServiceName": "ec2",
  # SubServiceName holds the service's subservice or resource used by the check
  "SubServiceName": "ami",
  # ResourceIdTemplate holds the unique ID for the resource used by the check
  "ResourceIdTemplate": "arn:partition:service:region:account-id:resource-id",
  # Severity holds the check's severity, always in lowercase (critical, high, medium, low or informational)
  "Severity": "critical",
  # ResourceType, only for AWS, holds the type from here
  # https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-template-resource-type-ref.html
  "ResourceType": "Other",
  # Description holds the title of the check; for now it is the same as CheckTitle
  "Description": "Ensure there are no EC2 AMIs set as Public.",
  # Risk holds the check's risk if the result is FAIL
  "Risk": "When your AMIs are publicly accessible, they are available in the Community AMIs where everyone with an AWS account can use them to launch EC2 instances. Your AMIs could contain snapshots of your applications (including their data), therefore exposing your snapshots in this manner is not advised.",
  # RelatedUrl holds a URL with more information about the check's purpose
  "RelatedUrl": "",
  # Remediation holds the information to help the practitioner fix the issue in case the check raises a FAIL
  "Remediation": {
    # Code holds different methods to remediate the FAIL finding
    "Code": {
      # CLI holds the command in the provider's native CLI to remediate it
      "CLI": "https://docs.bridgecrew.io/docs/public_8#cli-command",
      # NativeIaC holds the native IaC code to remediate it, use "https://docs.bridgecrew.io/docs"
      "NativeIaC": "",
      # Other holds other commands, scripts or code to remediate it, use "https://www.trendmicro.com/cloudoneconformity"
      "Other": "https://docs.bridgecrew.io/docs/public_8#aws-console",
      # Terraform holds the Terraform code to remediate it, use "https://docs.bridgecrew.io/docs"
      "Terraform": ""
    },
    # Recommendation holds the recommendation for this check with a description and a related URL
    "Recommendation": {
      "Text": "We recommend your EC2 AMIs are not publicly accessible, or generally available in the Community AMIs.",
      "Url": "https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/cancel-sharing-an-AMI.html"
    }
  },
  # Categories holds the category or categories where the check can be included, if applicable
  "Categories": [
    "internet-exposed"
  ],
  # DependsOn is not actively used for the moment but it will hold other
  # checks which this check depends on
  "DependsOn": [],
  # RelatedTo is not actively used for the moment but it will hold other
  # checks which this check is related to
  "RelatedTo": [],
  # Notes holds additional information not covered in this file
  "Notes": ""
}
```

### Remediation Code

For the Remediation Code we use the following knowledge bases to fill it:

- Official documentation for the provider
- https://docs.bridgecrew.io
- https://www.trendmicro.com/cloudoneconformity
- https://github.com/cloudmatos/matos/tree/master/remediations

### RelatedURL and Recommendation

The RelatedURL field must be filled with a URL from the provider's official documentation, like https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/sharingamis-intro.html

Also, if not present, you can use the Risk and Recommendation texts from the TrendMicro [CloudConformity](https://www.trendmicro.com/cloudoneconformity) guide.

### Python Model

The following is the Python model for the check's metadata. We use Pydantic's [BaseModel](https://docs.pydantic.dev/latest/api/base_model/#pydantic.BaseModel) as the parent class.

As per August 5th 2023 the `Check_Metadata_Model` can be found [here](https://github.com/prowler-cloud/prowler/blob/master/prowler/lib/check/models.py#L34-L56).
```python
class Check_Metadata_Model(BaseModel):
    """Check Metadata Model"""

    Provider: str
    CheckID: str
    CheckTitle: str
    CheckType: list[str]
    ServiceName: str
    SubServiceName: str
    ResourceIdTemplate: str
    Severity: str
    ResourceType: str
    Description: str
    Risk: str
    RelatedUrl: str
    Remediation: Remediation
    Categories: list[str]
    DependsOn: list[str]
    RelatedTo: list[str]
    Notes: str
    # We set the compliance to None to
    # store the compliance later if supplied
    Compliance: list = None
```
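To illustrate what this model enforces, here is a dependency-free sketch that checks a metadata dictionary for some of the required fields; it is a simplification of the Pydantic validation, not Prowler code:

```python
REQUIRED_STRING_FIELDS = [
    "Provider", "CheckID", "CheckTitle", "ServiceName", "SubServiceName",
    "ResourceIdTemplate", "Severity", "ResourceType", "Description",
    "Risk", "RelatedUrl", "Notes",
]


def validate_metadata(metadata: dict) -> list:
    """Return a list of problems; an empty list means the metadata looks valid."""
    problems = [
        f"missing or non-string field: {field}"
        for field in REQUIRED_STRING_FIELDS
        if not isinstance(metadata.get(field), str)
    ]
    # Severity must be one of the lowercase values listed in the metadata example
    if metadata.get("Severity") not in {"critical", "high", "medium", "low", "informational"}:
        problems.append("Severity must be critical, high, medium, low or informational")
    return problems
```

Pydantic does this (and much more, including nested models like `Remediation`) automatically when `Check_Metadata_Model.parse_file` reads the metadata file.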
docs/developer-guide/debugging.md (new file, 45 lines)

# Debugging

Debugging in Prowler makes things easier!

If you are developing Prowler, it is possible that you will encounter situations where you have to inspect the code in depth to fix some unexpected issue during the execution. To do that, if you are using VSCode you can run the code using the integrated debugger. Please refer to this [documentation](https://code.visualstudio.com/docs/editor/debugging) for guidance about the debugger in VSCode.
The following file is an example of the [debugging configuration](https://code.visualstudio.com/docs/editor/debugging#_launch-configurations) file that you can add to [Visual Studio Code](https://code.visualstudio.com/).

This file should be placed inside the *.vscode* folder and its name has to be *launch.json*:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Python: Current File",
      "type": "python",
      "request": "launch",
      "program": "prowler.py",
      "args": [
        "aws",
        "-f",
        "eu-west-1",
        "--service",
        "cloudwatch",
        "--log-level",
        "ERROR",
        "-p",
        "dev"
      ],
      "console": "integratedTerminal",
      "justMyCode": false
    },
    {
      "name": "Python: Debug Tests",
      "type": "python",
      "request": "launch",
      "program": "${file}",
      "purpose": [
        "debug-test"
      ],
      "console": "integratedTerminal",
      "justMyCode": false
    }
  ]
}
```
docs/developer-guide/documentation.md (new file, 8 lines)

## Contribute with documentation

We use `mkdocs` to build this Prowler documentation site, so you can easily contribute back with new docs or improvements to existing ones. To install all necessary dependencies use `poetry install --with docs`.

1. Install `mkdocs` with your favorite package manager.
2. Inside the `prowler` repository folder run `mkdocs serve`, point your browser to `http://localhost:8000`, and you will see live changes to your local copy of this documentation site.
3. Make all needed changes to the docs or add new documents. To do so just edit existing md files inside `prowler/docs`, and if you are adding a new section or file please make sure you add it to the `mkdocs.yaml` file in the root folder of the Prowler repo.
4. Once you are done with your changes, please send a pull request to us for review and merge. Thank you in advance!
docs/developer-guide/integration-testing.md (new file, 3 lines)

# Integration Tests

Coming soon ...
docs/developer-guide/integrations.md (new file, 3 lines)

# Create a new integration

Coming soon ...
docs/developer-guide/introduction.md (new file, 61 lines)

# Developer Guide

You can extend Prowler Open Source in many different ways. In most cases you will want to create your own checks and compliance security frameworks; here is where you can learn how to get started with that. We also include how to create custom outputs, integrations and more.

## Get the code and install all dependencies

First of all, you need Python 3.9 or higher with pip installed to be able to install all required dependencies. Once that is satisfied, go ahead and clone the repo:

```
git clone https://github.com/prowler-cloud/prowler
cd prowler
```
For isolation and to avoid conflicts with other environments, we recommend using `poetry`:
```
pip install poetry
```
Then install all dependencies, including the ones for developers:
```
poetry install --with dev
poetry shell
```

## Contributing with your code or fixes to Prowler

This repo has git pre-commit hooks managed via the [pre-commit](https://pre-commit.com/) tool. [Install](https://pre-commit.com/#install) it however you like, then in the root of this repo run:
```shell
pre-commit install
```
You should get an output like the following:
```shell
pre-commit installed at .git/hooks/pre-commit
```

Before we merge any of your pull requests we run checks against the code. We use the following tools and automation to make sure the code is secure and dependencies are up to date:

???+ note
    These should have been already installed if you ran `poetry install --with dev`

- [`bandit`](https://pypi.org/project/bandit/) for code security review.
- [`safety`](https://pypi.org/project/safety/) and [`dependabot`](https://github.com/features/security) for dependencies.
- [`hadolint`](https://github.com/hadolint/hadolint) and [`dockle`](https://github.com/goodwithtech/dockle) for our containers' security.
- [`Snyk`](https://docs.snyk.io/integrations/snyk-container-integrations/container-security-with-docker-hub-integration) in Docker Hub.
- [`clair`](https://github.com/quay/clair) in Amazon ECR.
- [`vulture`](https://pypi.org/project/vulture/), [`flake8`](https://pypi.org/project/flake8/), [`black`](https://pypi.org/project/black/) and [`pylint`](https://pypi.org/project/pylint/) for formatting and best practices.

You can see all dependencies in the `pyproject.toml` file.

## Pull Request Checklist

If you create or review a PR in https://github.com/prowler-cloud/prowler please follow this checklist:

- [ ] Make sure you've read the Prowler Developer Guide at https://docs.prowler.cloud/en/latest/developer-guide/introduction/
- [ ] Are we following the style guide, hence installed all the linters and formatters? Please check https://docs.prowler.cloud/en/latest/developer-guide/introduction/#contributing-with-your-code-or-fixes-to-prowler
- [ ] Are we increasing/decreasing the test coverage? Please review whether we need to include/modify tests for the new code.
- [ ] Are we modifying outputs? Please review it carefully.
- [ ] Do we need to modify the Prowler documentation to reflect the changes introduced?
- [ ] Are we introducing possible breaking changes? Are we modifying a core feature?

## Want some swag as appreciation for your contribution?

If you are like us and you love swag, we are happy to thank you for your contribution with some laptop stickers or whatever other swag we may have at that time. Please tell us more details and your pull request link in our [Slack workspace here](https://join.slack.com/t/prowler-workspace/shared_invite/zt-1hix76xsl-2uq222JIXrC7Q8It~9ZNog). You can also reach out to Toni de la Fuente on Twitter [here](https://twitter.com/ToniBlyx), his DMs are open.
docs/developer-guide/outputs.md (new file, 3 lines)

# Create a custom output format

Coming soon ...
docs/developer-guide/security-compliance-framework.md (new file, 41 lines)

# Create a new security compliance framework

## Introduction

If you want to create or contribute your own security frameworks, or add public ones to Prowler, you first need to make sure the required checks are available; if they are not, you have to create them. Then create a compliance file per provider in `prowler/compliance/<provider>/`, name it `<framework>_<version>_<provider>.json`, and follow the format below to create yours.

## Compliance Framework

Each version of a framework has the following high-level structure so that the framework can be generally identified. One requirement can also be called one control, and one requirement can be linked to multiple Prowler checks:

- `Framework`: string. Distinguishing name of the framework, like CIS.
- `Provider`: string. Provider where the framework applies, such as AWS, Azure, OCI,...
- `Version`: string. Version of the framework itself, like 1.4 for CIS.
- `Requirements`: array of objects. Includes all requirements or controls with the mapping to Prowler:
    - `Requirements_Id`: string. Unique identifier per requirement in the specific framework.
    - `Requirements_Description`: string. Description as in the framework.
    - `Requirements_Attributes`: array of objects. Includes all needed attributes per requirement, like levels, sections, etc. Whatever helps to create a dedicated report with the result of the findings. Attributes should be taken as closely as possible from the framework's own terminology directly.
    - `Requirements_Checks`: array. Prowler checks that are needed to prove this requirement. It can be one or multiple checks. In case no automation is possible, this can be empty.

```
{
  "Framework": "<framework>-<provider>",
  "Version": "<version>",
  "Requirements": [
    {
      "Id": "<unique-id>",
      "Description": "Requirement full description",
      "Checks": [
        "Here is the prowler check or checks that is going to be executed"
      ],
      "Attributes": [
        {
          <Add here your custom attributes.>
        }
      ]
    },
    ...
  ]
}
```

Finally, to have a proper output file for your reports, your framework data model has to be created in `prowler/lib/outputs/models.py` and also the CLI table output in `prowler/lib/outputs/compliance.py`. Also, you need to add a new conditional in `prowler/lib/outputs/file_descriptors.py` if you create a new CSV model.
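As an illustration of how such a file maps requirements to checks, here is a small dependency-free sketch (`index_requirements` and the sample framework content are hypothetical, not Prowler code):

```python
import json


def index_requirements(framework_text: str) -> dict:
    # Map each requirement Id to the Prowler checks that prove it.
    framework = json.loads(framework_text)
    return {req["Id"]: req["Checks"] for req in framework["Requirements"]}


sample = """
{
  "Framework": "example-aws",
  "Version": "1.0",
  "Requirements": [
    {"Id": "1.1", "Description": "No public AMIs", "Checks": ["ec2_ami_public"], "Attributes": []}
  ]
}
"""
assert index_requirements(sample) == {"1.1": ["ec2_ami_public"]}
```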
docs/developer-guide/services.md (new file, 235 lines)

# Create a new Provider Service

Here you can find how to create a new service, or complement an existing one, for a Prowler provider.

## Introduction

To create a new service, you will need to create a folder inside the specific provider, i.e. `prowler/providers/<provider>/services/<service>/`.

Inside that folder, you MUST create the following files:

- An empty `__init__.py`: to make Python treat this service folder as a package.
- A `<service>_service.py`, containing all the service's logic and API calls.
- A `<service>_client.py`, containing the initialization of the service's class we have just created so the service's checks can use it.

## Service

The Prowler service structure is described below, and the way to initialise a service is just by importing its service client in a check.

## Service Base Class

All the Prowler provider services inherit from a base class depending on the provider used:

- [AWS Service Base Class](https://github.com/prowler-cloud/prowler/blob/22f8855ad7dad2e976dabff78611b643e234beaf/prowler/providers/aws/lib/service/service.py)
- [GCP Service Base Class](https://github.com/prowler-cloud/prowler/blob/22f8855ad7dad2e976dabff78611b643e234beaf/prowler/providers/gcp/lib/service/service.py)
- [Azure Service Base Class](https://github.com/prowler-cloud/prowler/blob/22f8855ad7dad2e976dabff78611b643e234beaf/prowler/providers/azure/lib/service/service.py)

Each class is used to initialize the credentials and the API clients to be used in the service. If any threading is used, it must be coded there.

## Service Class

Due to the complexity and differences of each provider API, we are going to use an example service to guide you through how it can be created.

The following is the `<service>_service.py` file:
```python title="Service Class"
from datetime import datetime
from typing import Optional

# The following is just for the AWS provider
from botocore.client import ClientError

# To use Pydantic's BaseModel
from pydantic import BaseModel

# Prowler logging library
from prowler.lib.logger import logger

# Prowler resource filter, only for the AWS provider
from prowler.lib.scan_filters.scan_filters import is_resource_filtered

# Provider parent class
from prowler.providers.<provider>.lib.service.service import ServiceParentClass


# Create a class for the Service
################## <Service>
class <Service>(ServiceParentClass):
    def __init__(self, audit_info):
        # Call the Service Parent Class __init__.
        # We use __class__.__name__ to get the service name automatically
        # from the Service Class name, but you can pass a custom
        # string if the provider's API service name is different.
        super().__init__(__class__.__name__, audit_info)

        # Create an empty dictionary of items to be gathered,
        # using each item's unique ID as the dictionary key,
        # e.g., instances
        self.<items> = {}

        # If you can parallelize by regions or locations
        # you can use the __threading_call__ function
        # available in the Service Parent Class
        self.__threading_call__(self.__describe_<items>__)

        # Optionally you can create another function to retrieve
        # more data about each item, without parallelism
        self.__describe_<item>__()

    def __describe_<items>__(self, regional_client):
        """Get ALL <Service> <Items>"""
        logger.info("<Service> - Describing <Items>...")

        # We MUST include a try/except block in each function
        try:

            # Call the provider API to retrieve the data we want,
            # using a paginator to get every item
            describe_<items>_paginator = regional_client.get_paginator("describe_<items>")
            for page in describe_<items>_paginator.paginate():

                # Another try/except within the for loop to keep looping
                # if something unexpected happens
                try:

                    for <item> in page["<Items>"]:

                        # For the AWS provider we MUST include the following lines to decide
                        # whether to retrieve data for the resource passed with --resource-arn
                        if not self.audit_resources or (
                            is_resource_filtered(<item>["<item_arn>"], self.audit_resources)
                        ):
                            # Then we have to include the retrieved resource in the object
                            # previously created
                            self.<items>[<item_unique_id>] = <Item>(
                                arn=<item>["<item_arn>"],
                                name=<item>["<item_name>"],
                                tags=<item>.get("Tags", []),
                                region=regional_client.region,
                            )

                except Exception as error:
                    logger.error(
                        f"{<provider_specific_field>} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
                    )

        # In the except part we have to use the following code to log the errors
        except Exception as error:
            # Depending on each provider we can use the following fields in the logger:
            # - AWS: regional_client.region or self.region
            # - GCP: project_id and location
            # - Azure: subscription
            logger.error(
                f"{<provider_specific_field>} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )

    def __describe_<item>__(self):
        """Get details for a <Service> <Item>"""
        logger.info("<Service> - Describing <Item> to get specific details...")

        # We MUST include a try/except block in each function
        try:

            # Loop over the items retrieved in the previous function
            for <item> in self.<items>:

                # When we perform calls to the provider API within a for loop we have
                # to include another try/except block, because in the cloud there are
                # ephemeral resources that can be deleted by the time we are checking them
                try:
                    <item>_details = self.regional_clients[<item>.region].describe_<item>(
                        <Attribute>=<item>.name
                    )

                    # For example, check if the item is public. Here it is important that if we
                    # are getting values from a dictionary we use the "dict.get()"
                    # function with a default value in case this value is not present
                    <item>.public = <item>_details.get("Public", False)

                # In this except block, for example for the AWS provider, we can use
                # the botocore ClientError exception and check for a specific error code
                # to raise a WARNING instead of an ERROR if some resource is not present.
                except ClientError as error:
                    if error.response["Error"]["Code"] == "InvalidInstanceID.NotFound":
                        logger.warning(
                            f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
                        )
                    else:
                        logger.error(
                            f"{<provider_specific_field>} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
                        )
                    continue

        # In the except part we have to use the following code to log the errors
        except Exception as error:
            # Depending on each provider we can use the following fields in the logger:
            # - AWS: regional_client.region or self.region
            # - GCP: project_id and location
            # - Azure: subscription
            logger.error(
                f"{<item>.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )
```
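The `__threading_call__` pattern used in the constructor above (one thread per regional client, then wait for all of them) can be sketched roughly like this; the helper below is a simplified stand-in, not the actual base class implementation:

```python
import threading


def threading_call(call, regional_clients):
    # Run the given function once per regional client, in parallel,
    # and wait for every region to finish before returning.
    threads = [
        threading.Thread(target=call, args=(client,))
        for client in regional_clients
    ]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()


# Usage sketch: collect one result per "region"
results = []
threading_call(results.append, ["eu-west-1", "us-east-1"])
assert sorted(results) == ["eu-west-1", "us-east-1"]
```

Because the calls are joined before `__init__` returns, the checks always see fully populated service objects.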
### Service Models
|
||||
|
||||
For each class object we need to model we use the Pydantic's [BaseModel](https://docs.pydantic.dev/latest/api/base_model/#pydantic.BaseModel) to take advantage of the data validation.
|
||||
|
||||
```python title="Service Model"
|
||||
# In each service class we have to create some classes using
# Pydantic's BaseModel for the resources we want to audit.
class <Item>(BaseModel):
    """<Item> holds a <Service> <Item>"""

    arn: str
    """<Items>[].arn"""

    name: str
    """<Items>[].name"""

    region: str
    """<Items>[].region"""

    public: bool
    """<Items>[].public"""

    # We can create Optional attributes set to None by default
    tags: Optional[list]
    """<Items>[].tags"""
```

### Service Objects

In the service, each group of resources should be stored as a Python [dictionary](https://docs.python.org/3/tutorial/datastructures.html#dictionaries). This is because we perform lookups all the time, and a Python dictionary lookup has [O(1) complexity](https://en.wikipedia.org/wiki/Big_O_notation#Orders_of_common_functions).

We MUST set as the dictionary key a unique ID, like the resource Unique ID or ARN.

Example:
```python
self.vpcs = {}
self.vpcs["vpc-01234567890abcdef"] = VPC_Object_Class()
```
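
A quick illustration of the constant-time lookup this buys us; the VPC entry below is a hypothetical placeholder for the real service object:

```python
# Service object store keyed by the resource's unique ID
vpcs = {}
vpcs["vpc-01234567890abcdef"] = {"cidr_block": "10.0.0.0/16"}  # placeholder resource

# O(1) lookup by key; dict.get() returns None instead of raising KeyError
# when a resource has disappeared between API calls
existing = vpcs.get("vpc-01234567890abcdef")
missing = vpcs.get("vpc-00000000000000000")
print(existing, missing)  # → {'cidr_block': '10.0.0.0/16'} None
```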

## Service Client

Each Prowler service requires a service client to use the service in the checks.

The following is the `<service>_client.py` containing the initialization of the service's class we have just created, so that the service's checks can use it:

```python
from prowler.providers.<provider>.lib.audit_info.audit_info import audit_info
from prowler.providers.<provider>.services.<service>.<service>_service import <Service>

<service>_client = <Service>(audit_info)
```
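
Because the client is instantiated at module import time, every check that imports it shares a single instance. A minimal stand-alone sketch of that singleton behaviour, using hypothetical `Service` and `audit_info` stand-ins rather than real Prowler classes:

```python
# Stand-ins for <Service> and audit_info; names are illustrative only
class Service:
    instances_created = 0

    def __init__(self, audit_info):
        Service.instances_created += 1
        self.audit_info = audit_info

audit_info = {"region": "us-east-1"}  # placeholder audit configuration

# Module-level instantiation, as in <service>_client.py: this runs once at import
service_client = Service(audit_info)

# Every check importing service_client reuses the exact same object
client_seen_by_check_a = service_client
client_seen_by_check_b = service_client
print(client_seen_by_check_a is client_seen_by_check_b, Service.instances_created)  # → True 1
```

This shared instance is also why the unit tests later in this guide must patch the client carefully to stay isolated.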

## Permissions

It is really important to check whether Prowler's current permissions for each provider are enough to implement a new service. If we need to include more, please refer to the following documentation and update it:

- AWS: https://docs.prowler.cloud/en/latest/getting-started/requirements/#aws-authentication
- Azure: https://docs.prowler.cloud/en/latest/getting-started/requirements/#permissions
- GCP: https://docs.prowler.cloud/en/latest/getting-started/requirements/#gcp-authentication
# Unit Tests

The unit tests for the Prowler checks vary between each supported provider.

Here are some good reads about unit testing and things we have learned along the way.

**Python Testing**

- https://docs.python-guide.org/writing/tests/

**Where to patch**

- https://docs.python.org/3/library/unittest.mock.html#where-to-patch
- https://stackoverflow.com/questions/893333/multiple-variables-in-a-with-statement
- https://docs.python.org/3/reference/compound_stmts.html#the-with-statement

**Utils to trace mocking and test execution**

- https://news.ycombinator.com/item?id=36054868
- https://docs.python.org/3/library/sys.html#sys.settrace
- https://github.com/kunalb/panopticon

## General Recommendations

When creating tests for a provider's checks we follow these guidelines, trying to cover as many test scenarios as possible:

1. Create a test without resources to generate 0 findings, because Prowler will generate 0 findings if a service does not contain the resources the check audits.
2. Create tests that generate both a `PASS` and a `FAIL` result.
3. Create tests with more than 1 resource to evaluate how the check behaves and whether the number of findings is right.
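
The three scenarios can be sketched against a toy check; `toy_check` and its resource shape are hypothetical, not Prowler APIs:

```python
# Hypothetical check: a resource PASSes when it is not public
def toy_check(resources):
    return ["PASS" if not resource["public"] else "FAIL" for resource in resources]

# 1. No resources -> 0 findings
assert toy_check([]) == []

# 2. Both a PASS and a FAIL result
assert toy_check([{"public": False}]) == ["PASS"]
assert toy_check([{"public": True}]) == ["FAIL"]

# 3. More than one resource -> one finding per resource
assert toy_check([{"public": True}, {"public": False}]) == ["FAIL", "PASS"]
```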

## How to run Prowler tests

To run the Prowler test suite you need to install the testing dependencies already included in the `pyproject.toml` file. If you haven't installed them yet, please read the developer guide introduction [here](./introduction.md#get-the-code-and-install-all-dependencies).

Then, in the project's root path, execute `pytest -n auto -vvv -s -x` or use the `Makefile` with `make test`.

Other commands to run tests:

- Run tests for a provider: `pytest -n auto -vvv -s -x tests/providers/<provider>/services`
- Run tests for a provider service: `pytest -n auto -vvv -s -x tests/providers/<provider>/services/<service>`
- Run tests for a provider check: `pytest -n auto -vvv -s -x tests/providers/<provider>/services/<service>/<check>`

???+ note
    Refer to the [pytest documentation](https://docs.pytest.org/en/7.1.x/getting-started.html) for more information.
## AWS

For the AWS provider we have several ways to test a Prowler check, based on the following criteria:

???+ note
    We use and contribute to the [Moto](https://github.com/getmoto/moto) library, which allows us to easily mock out tests based on AWS infrastructure. **It's awesome!**

- AWS API calls covered by [Moto](https://github.com/getmoto/moto):
    - Service tests with `@mock_<service>`
    - Check tests with `@mock_<service>`
- AWS API calls not covered by Moto:
    - Service tests with `mock_make_api_call`
    - Check tests with [MagicMock](https://docs.python.org/3/library/unittest.mock.html#unittest.mock.MagicMock)
- AWS API calls partially covered by Moto:
    - Service tests with `@mock_<service>` and `mock_make_api_call`
    - Check tests with `@mock_<service>` and `mock_make_api_call`

In the following sections we are going to explain all of the above scenarios with examples. The main difference between those scenarios is whether the [Moto](https://github.com/getmoto/moto) library covers the AWS API calls made by the service. You can check the covered API calls [here](https://github.com/getmoto/moto/blob/master/IMPLEMENTATION_COVERAGE.md).

An important point for AWS testing is that each test MUST have its own, unique `audit_info`, which is the key object during AWS execution, in order to isolate the test execution.

Check the [Audit Info](./audit-info.md) section to get more details.
```python
# We need to import the AWS_Audit_Info and the Audit_Metadata
# to set the audit_info to call AWS APIs
from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info
from prowler.providers.common.models import Audit_Metadata

# Boto3 session to create the mocked audit session
from boto3 import session

AWS_ACCOUNT_NUMBER = "123456789012"

def set_mocked_audit_info(self):
    audit_info = AWS_Audit_Info(
        session_config=None,
        original_session=None,
        audit_session=session.Session(
            profile_name=None,
            botocore_session=None,
        ),
        audit_config=None,
        audited_account=AWS_ACCOUNT_NUMBER,
        audited_account_arn=f"arn:aws:iam::{AWS_ACCOUNT_NUMBER}:root",
        audited_user_id=None,
        audited_partition="aws",
        audited_identity_arn=None,
        profile=None,
        profile_region=None,
        credentials=None,
        assumed_role_info=None,
        audited_regions=["us-east-1", "eu-west-1"],
        organizations_metadata=None,
        audit_resources=None,
        mfa_enabled=False,
        audit_metadata=Audit_Metadata(
            services_scanned=0,
            expected_checks=[],
            completed_checks=0,
            audit_progress=0,
        ),
    )

    return audit_info
```

### Checks

For the AWS test examples we are going to use the tests for the `iam_password_policy_uppercase` check.

This section is divided based on the API coverage of the [Moto](https://github.com/getmoto/moto) library.

#### API calls covered

If the [Moto](https://github.com/getmoto/moto) library covers the API calls we want to test, we can use the `@mock_<service>` decorator. This will mock out all the API calls made to AWS, keeping the state within the decorated code, in this case the test function.
```python
# We need to import unittest.mock to allow us to patch some objects
# and not use shared ones between tests, hence isolating the test
from unittest import mock

# Boto3 client and session to call the AWS APIs
from boto3 import client, session

# Moto decorator for the IAM service we want to mock
from moto import mock_iam

# Constants used
AWS_ACCOUNT_NUMBER = "123456789012"
AWS_REGION = "us-east-1"


# We always name the test classes like Test_<check_name>
class Test_iam_password_policy_uppercase:

    # We include the Moto decorator for the service we want to use.
    # You can include more than one if two or more services are
    # involved in the test
    @mock_iam
    # We name the tests with test_<service>_<check_name>_<test_action>
    def test_iam_password_policy_no_uppercase_flag(self):
        # First, we have to create an IAM client
        iam_client = client("iam", region_name=AWS_REGION)

        # Then, since all AWS accounts have a password policy,
        # we want to set RequireUppercaseCharacters to False
        iam_client.update_account_password_policy(RequireUppercaseCharacters=False)

        # We set a mocked audit_info for AWS so as not to share the same
        # audit state between checks
        current_audit_info = self.set_mocked_audit_info()

        # The Prowler service import MUST be made within the decorated
        # code so as not to make real API calls to the AWS service.
        from prowler.providers.aws.services.iam.iam_service import IAM

        # Prowler for AWS uses a shared object called `current_audit_info` where it stores
        # the audit's state, credentials and configuration.
        # We also have to mock the iam_client from the check to enforce that the
        # iam_client used is the one created within this test, because patch != import,
        # and if you execute tests in parallel some objects can already be initialised,
        # hence the check won't be isolated
        with mock.patch(
            "prowler.providers.aws.lib.audit_info.audit_info.current_audit_info",
            new=current_audit_info,
        ), mock.patch(
            "prowler.providers.aws.services.iam.iam_password_policy_uppercase.iam_password_policy_uppercase.iam_client",
            new=IAM(current_audit_info),
        ):
            # We import the check within the two mocks so as not to initialise the
            # iam_client with shared information from the current_audit_info or the IAM service.
            from prowler.providers.aws.services.iam.iam_password_policy_uppercase.iam_password_policy_uppercase import (
                iam_password_policy_uppercase,
            )

            # Once imported, we only need to instantiate the check's class
            check = iam_password_policy_uppercase()

            # And then, call the execute() function to run the check
            # against the IAM client we've set up.
            result = check.execute()

            # Last but not least, we need to assert all the fields
            # from the check's results
            assert len(result) == 1
            assert result[0].status == "FAIL"
            assert result[0].status_extended == "IAM password policy does not require at least one uppercase letter."
            assert result[0].resource_arn == f"arn:aws:iam::{AWS_ACCOUNT_NUMBER}:root"
            assert result[0].resource_id == AWS_ACCOUNT_NUMBER
            assert result[0].resource_tags == []
            assert result[0].region == AWS_REGION
```

#### API calls not covered

If the API calls for the check we want to test are not covered by Moto, we have to inject the objects into the service client using [MagicMock](https://docs.python.org/3/library/unittest.mock.html#unittest.mock.MagicMock). As pointed out above, we cannot instantiate the service since it would make real calls to the AWS APIs.

???+ note
    The following example uses the IAM GetAccountPasswordPolicy, which is covered by Moto, but this is only for demonstration purposes.

The following code shows how to use MagicMock to create the service objects.
```python
# We need to import unittest.mock to allow us to patch some objects
# and not use shared ones between tests, hence isolating the test
from unittest import mock

# Constants used
AWS_ACCOUNT_NUMBER = "123456789012"
AWS_REGION = "us-east-1"


# We always name the test classes like Test_<check_name>
class Test_iam_password_policy_uppercase:

    # We name the tests with test_<service>_<check_name>_<test_action>
    def test_iam_password_policy_no_uppercase_flag(self):
        # Mocked client with MagicMock
        mocked_iam_client = mock.MagicMock

        # Since the IAM Password Policy has its own model we have to import it
        from prowler.providers.aws.services.iam.iam_service import PasswordPolicy

        # Create the mock PasswordPolicy object
        mocked_iam_client.password_policy = PasswordPolicy(
            length=5,
            symbols=True,
            numbers=True,
            # We set the value to False to test the check
            uppercase=False,
            lowercase=True,
            allow_change=False,
            expiration=True,
        )

        # We set a mocked audit_info for AWS so as not to share the same
        # audit state between checks
        current_audit_info = self.set_mocked_audit_info()

        # In this scenario we also have to mock the IAM service and the iam_client
        # from the check to enforce that the iam_client used is the one created
        # within this test, because patch != import, and if you execute tests in
        # parallel some objects can already be initialised, hence the check won't
        # be isolated.
        # In this case we don't use the Moto decorator; we use the mocked IAM
        # client for both objects
        with mock.patch(
            "prowler.providers.aws.services.iam.iam_service.IAM",
            new=mocked_iam_client,
        ), mock.patch(
            "prowler.providers.aws.services.iam.iam_client.iam_client",
            new=mocked_iam_client,
        ):
            # We import the check within the two mocks so as not to initialise the
            # iam_client with shared information from the current_audit_info or the IAM service.
            from prowler.providers.aws.services.iam.iam_password_policy_uppercase.iam_password_policy_uppercase import (
                iam_password_policy_uppercase,
            )

            # Once imported, we only need to instantiate the check's class
            check = iam_password_policy_uppercase()

            # And then, call the execute() function to run the check
            # against the IAM client we've set up.
            result = check.execute()

            # Last but not least, we need to assert all the fields
            # from the check's results
            assert len(result) == 1
            assert result[0].status == "FAIL"
            assert result[0].status_extended == "IAM password policy does not require at least one uppercase letter."
            assert result[0].resource_arn == f"arn:aws:iam::{AWS_ACCOUNT_NUMBER}:root"
            assert result[0].resource_id == AWS_ACCOUNT_NUMBER
            assert result[0].resource_tags == []
            assert result[0].region == AWS_REGION
```

As can be seen in the scenarios above, the check execution must always happen within the context of the mocked/patched objects. This way we ensure it reviews only the objects created within the scope of the test.

#### API calls partially covered

If the API calls we want to use in the service are only partially covered by the Moto decorator, we have to create our own mocked API calls and use them in combination with Moto.

To do so, using [mock.patch](https://docs.python.org/3/library/unittest.mock.html#patch), you need to mock the `botocore.client.BaseClient._make_api_call` function, which is the Botocore function in charge of making the real API call to the AWS APIs:
```python
import boto3
import botocore
from unittest.mock import patch
from moto import mock_iam

# Original botocore _make_api_call function
orig = botocore.client.BaseClient._make_api_call

# Mocked botocore _make_api_call function
def mock_make_api_call(self, operation_name, kwarg):
    # Note that the snake_case client method get_account_password_policy maps
    # to the GetAccountPasswordPolicy operation_name used here.
    # Rationale -> https://github.com/boto/botocore/blob/develop/botocore/client.py#L810:L816
    if operation_name == 'GetAccountPasswordPolicy':
        return {
            'PasswordPolicy': {
                'MinimumPasswordLength': 123,
                'RequireSymbols': True,
                'RequireNumbers': True,
                # We set the value to False to test the check
                'RequireUppercaseCharacters': False,
                'RequireLowercaseCharacters': True,
                'AllowUsersToChangePassword': True,
                'ExpirePasswords': True,
                'MaxPasswordAge': 123,
                'PasswordReusePrevention': 123,
                'HardExpiry': False,
            }
        }
    # If we don't want to patch the API call, fall through to the original
    return orig(self, operation_name, kwarg)


# We always name the test classes like Test_<check_name>
class Test_iam_password_policy_uppercase:

    # We include the custom API call mock decorator for the service we want to use
    @patch("botocore.client.BaseClient._make_api_call", new=mock_make_api_call)
    # We also include the IAM Moto decorator for the API calls it supports
    @mock_iam
    # We name the tests with test_<service>_<check_name>_<test_action>
    def test_iam_password_policy_no_uppercase_flag(self):
        # See the previous section for the test body, since it is the same
        ...
```

Note that, to keep it simple, the mocked API call above does not depend on Moto; however, if you use any `moto` decorators in addition to the patch, the call to `orig(self, operation_name, kwarg)` will be intercepted by Moto.

???+ note
    The above code comes from https://docs.getmoto.org/en/latest/docs/services/patching_other_services.html
#### Mocking more than one service

If the test you are creating belongs to a check that uses more than one provider service, you should mock each of the services used. For example, the check `cloudtrail_logs_s3_bucket_access_logging_enabled` requires the CloudTrail and the S3 clients, hence the service-mocking part of the test will be as follows:

```python
with mock.patch(
    "prowler.providers.aws.lib.audit_info.audit_info.current_audit_info",
    new=mock_audit_info,
), mock.patch(
    "prowler.providers.aws.services.cloudtrail.cloudtrail_logs_s3_bucket_access_logging_enabled.cloudtrail_logs_s3_bucket_access_logging_enabled.cloudtrail_client",
    new=Cloudtrail(mock_audit_info),
), mock.patch(
    "prowler.providers.aws.services.cloudtrail.cloudtrail_logs_s3_bucket_access_logging_enabled.cloudtrail_logs_s3_bucket_access_logging_enabled.s3_client",
    new=S3(mock_audit_info),
):
```

As you can see in the above code, it is required to mock the AWS audit info and both services used.

#### Patching vs. Importing

This is an important topic within Prowler check unit testing. Due to the dynamic loading of the checks, the process of importing the service client from a check is the following:

1. `<check>.py`:
```python
from prowler.providers.<provider>.services.<service>.<service>_client import <service>_client
```
2. `<service>_client.py`:
```python
from prowler.providers.<provider>.lib.audit_info.audit_info import audit_info
from prowler.providers.<provider>.services.<service>.<service>_service import <SERVICE>

<service>_client = <SERVICE>(audit_info)
```

Due to the above import path, it is not the same to patch each of the following objects, because if you run a bunch of tests, in parallel or not, some clients can already have been instantiated by another check, and your test execution would then use another test's service instance:

- `<service>_client` imported at `<check>.py`
- `<service>_client` initialised at `<service>_client.py`
- `<SERVICE>` imported at `<service>_client.py`

A useful read about this topic can be found in the following article: https://stackoverflow.com/questions/8658043/how-to-mock-an-import
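
The "patch where it is used" rule can be demonstrated without Prowler at all. The sketch below builds two in-memory modules, hypothetical stand-ins for `<service>_client.py` and `<check>.py`, and shows that patching the defining module does not affect a name already imported elsewhere:

```python
import sys
import types
from unittest import mock

# Stand-in for <service>_client.py, created as an in-memory module
svc_client = types.ModuleType("svc_client")
svc_client.service_client = "real-client"
sys.modules["svc_client"] = svc_client

# Stand-in for <check>.py, which imports the client at import time
check_module = types.ModuleType("check_module")
exec(
    "from svc_client import service_client\n"
    "def run():\n"
    "    return service_client\n",
    check_module.__dict__,
)
sys.modules["check_module"] = check_module

# Patching where the object is DEFINED does not touch the check's copy ...
with mock.patch("svc_client.service_client", new="mocked"):
    print(check_module.run())  # → real-client

# ... while patching where it is USED does
with mock.patch("check_module.service_client", new="mocked"):
    print(check_module.run())  # → mocked
```

This is exactly why the Prowler tests above patch the `<service>_client` at the check's module path.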

#### Different ways to mock the service client

##### Mocking the service client at the service client level

Mocking a service client with the following code ...

```python title="Mocking the service_client"
with mock.patch(
    "prowler.providers.<provider>.lib.audit_info.audit_info.audit_info",
    new=audit_info,
), mock.patch(
    "prowler.providers.<provider>.services.<service>.<check>.<check>.<service>_client",
    new=<SERVICE>(audit_info),
):
```

... will cause the service to be initialised twice:

1. When `<SERVICE>(audit_info)` is instantiated to have the object ready for the patching.
2. At `<service>_client.py` while we are patching, since `mock.patch` needs to go to that object and initialise it, hence `<SERVICE>(audit_info)` will be called again.

Then, when we import the `<service>_client.py` at `<check>.py`, since we are mocking where the object is used, Python will use the mocked one.

In the [next section](./unit-testing.md#mocking-the-service-and-the-service-client-at-the-service-client-level) you will see an improved way to mock these objects.

##### Mocking the service and the service client at the service client level

Mocking a service client with the following code ...

```python title="Mocking the service and the service_client"
with mock.patch(
    "prowler.providers.<provider>.lib.audit_info.audit_info.audit_info",
    new=audit_info,
), mock.patch(
    "prowler.providers.<provider>.services.<service>.<SERVICE>",
    new=<SERVICE>(audit_info),
) as service_client, mock.patch(
    "prowler.providers.<provider>.services.<service>.<service>_client.<service>_client",
    new=service_client,
):
```

... will cause the service to be initialised only once, when `<SERVICE>(audit_info)` is instantiated for the `mock.patch`.

Then, at check level, when Python tries to import the client with `from prowler.providers.<provider>.services.<service>.<service>_client import <service>_client`, since it is already mocked out, the execution will continue using the `service_client` without getting into `<service>_client.py`.

### Services

For testing the AWS services we follow the same logic as with the AWS checks: we check whether the AWS API calls made by the service are covered by Moto, and we test the service's `__init__` to verify that the information is correctly retrieved.

The service tests could act as *Integration Tests* since they test how the service retrieves information from the provider, but since Moto or the custom mock objects mock those calls, these tests fall under *Unit Tests*.

Please refer to the [AWS checks tests](./unit-testing.md#checks) for more information on how to create tests, and check the existing service tests [here](https://github.com/prowler-cloud/prowler/tree/master/tests/providers/aws/services).

## GCP

### Checks

For the GCP Provider we don't have any library to mock out the API calls, so in this scenario we inject the objects into the service client using [MagicMock](https://docs.python.org/3/library/unittest.mock.html#unittest.mock.MagicMock).

The following code shows how to use MagicMock to create the service objects for a GCP check test.

```python
# We need to import unittest.mock to allow us to patch some objects
# and not use shared ones between tests, hence isolating the test
from unittest import mock

# GCP Constants
GCP_PROJECT_ID = "123456789012"


# We are going to create a test for the compute_firewall_rdp_access_from_the_internet_allowed check
class Test_compute_firewall_rdp_access_from_the_internet_allowed:

    # We name the tests with test_<service>_<check_name>_<test_action>
    def test_compute_compute_firewall_rdp_access_from_the_internet_allowed_one_compliant_rule_with_valid_port(self):
        # Mocked client with MagicMock
        compute_client = mock.MagicMock

        # Assign GCP client configuration
        compute_client.project_ids = [GCP_PROJECT_ID]
        compute_client.region = "global"

        # Import the service resource model to create the mocked object
        from prowler.providers.gcp.services.compute.compute_service import Firewall

        # Create the custom Firewall object to be tested
        firewall = Firewall(
            name="test",
            id="1234567890",
            source_ranges=["0.0.0.0/0"],
            direction="INGRESS",
            allowed_rules=[{"IPProtocol": "tcp", "ports": ["443"]}],
            project_id=GCP_PROJECT_ID,
        )
        compute_client.firewalls = [firewall]

        # In this scenario we also have to mock the Compute service and the
        # compute_client from the check to enforce that the compute_client used
        # is the one created within this test, because patch != import, and if
        # you execute tests in parallel some objects can already be initialised,
        # hence the check won't be isolated.
        # In this case we don't use any decorator; we use the mocked Compute
        # client for both objects
        with mock.patch(
            "prowler.providers.gcp.services.compute.compute_service.Compute",
            new=compute_client,
        ), mock.patch(
            "prowler.providers.gcp.services.compute.compute_client.compute_client",
            new=compute_client,
        ):
            # We import the check within the two mocks so as not to initialise the
            # compute_client with shared information from the Compute service.
            from prowler.providers.gcp.services.compute.compute_firewall_rdp_access_from_the_internet_allowed.compute_firewall_rdp_access_from_the_internet_allowed import (
                compute_firewall_rdp_access_from_the_internet_allowed,
            )

            # Once imported, we only need to instantiate the check's class
            check = compute_firewall_rdp_access_from_the_internet_allowed()

            # And then, call the execute() function to run the check
            # against the Compute client we've set up.
            result = check.execute()

            # Last but not least, we need to assert all the fields
            # from the check's results
            assert len(result) == 1
            assert result[0].status == "PASS"
            assert result[0].status_extended == f"Firewall {firewall.name} does not expose port 3389 (RDP) to the internet."
            assert result[0].resource_name == firewall.name
            assert result[0].resource_id == firewall.id
            assert result[0].project_id == GCP_PROJECT_ID
            assert result[0].location == compute_client.region
```

### Services

Coming soon ...

## Azure

### Checks

For the Azure Provider we don't have any library to mock out the API calls, so in this scenario we inject the objects into the service client using [MagicMock](https://docs.python.org/3/library/unittest.mock.html#unittest.mock.MagicMock).

The following code shows how to use MagicMock to create the service objects for an Azure check test.

```python
# We need to import unittest.mock to allow us to patch some objects
# and not use shared ones between tests, hence isolating the test
from unittest import mock

from uuid import uuid4

# Azure Constants
from tests.providers.azure.azure_fixtures import AZURE_SUBSCRIPTION


# We are going to create a test for the defender_ensure_defender_for_arm_is_on check
class Test_defender_ensure_defender_for_arm_is_on:

    # We name the tests with test_<service>_<check_name>_<test_action>
    def test_defender_defender_ensure_defender_for_arm_is_on_arm_pricing_tier_not_standard(self):
        resource_id = str(uuid4())

        # Mocked client with MagicMock
        defender_client = mock.MagicMock

        # Import the service resource model to create the mocked object
        from prowler.providers.azure.services.defender.defender_service import Defender_Pricing

        # Create the custom Defender object to be tested
        defender_client.pricings = {
            AZURE_SUBSCRIPTION: {
                "Arm": Defender_Pricing(
                    resource_id=resource_id,
                    pricing_tier="Not Standard",
                    free_trial_remaining_time=0,
                )
            }
        }

        # In this scenario we also have to mock the Defender service and the
        # defender_client from the check to enforce that the defender_client used
        # is the one created within this test, because patch != import, and if
        # you execute tests in parallel some objects can already be initialised,
        # hence the check won't be isolated.
        # In this case we don't use any decorator; we use the mocked Defender
        # client for both objects
        with mock.patch(
            "prowler.providers.azure.services.defender.defender_service.Defender",
            new=defender_client,
        ), mock.patch(
            "prowler.providers.azure.services.defender.defender_client.defender_client",
            new=defender_client,
        ):
            # We import the check within the two mocks so as not to initialise the
            # defender_client with shared information from the Defender service.
            from prowler.providers.azure.services.defender.defender_ensure_defender_for_arm_is_on.defender_ensure_defender_for_arm_is_on import (
                defender_ensure_defender_for_arm_is_on,
            )

            # Once imported, we only need to instantiate the check's class
            check = defender_ensure_defender_for_arm_is_on()

            # And then, call the execute() function to run the check
            # against the Defender client we've set up.
            result = check.execute()

            # Last but not least, we need to assert all the fields
            # from the check's results
            assert len(result) == 1
            assert result[0].status == "FAIL"
            assert (
                result[0].status_extended
                == f"Defender plan Defender for ARM from subscription {AZURE_SUBSCRIPTION} is set to OFF (pricing tier not standard)"
            )
            assert result[0].subscription == AZURE_SUBSCRIPTION
            assert result[0].resource_name == "Defender plan ARM"
            assert result[0].resource_id == resource_id
```

### Services

Coming soon ...

@@ -5,7 +5,7 @@ Prowler has been written in Python using the [AWS SDK (Boto3)](https://boto3.ama

Since Prowler uses AWS Credentials under the hood, you can follow any authentication method as described [here](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html#cli-configure-quickstart-precedence).

### Authentication

Make sure you have properly configured your AWS CLI with a valid Access Key and Region, or declare the AWS environment variables properly (or use an instance profile/role):

@@ -23,12 +23,11 @@ export AWS_SESSION_TOKEN="XXXXXXXXX"

Those credentials must be associated with a user or role that has the proper permissions to perform all checks. To make sure, add the following AWS managed policies to the user or role being used:

- `arn:aws:iam::aws:policy/SecurityAudit`
- `arn:aws:iam::aws:policy/job-function/ViewOnlyAccess`

???+ note
    Moreover, some additional read-only permissions are needed for several checks, so make sure you also attach the custom policy [prowler-additions-policy.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-additions-policy.json) to the role you are using. If you want Prowler to send findings to [AWS Security Hub](https://aws.amazon.com/security-hub), make sure you also attach the custom policy [prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-security-hub.json).
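The managed policies can also be attached with the AWS CLI. A minimal sketch, assuming a hypothetical role name; the `echo` keeps this a dry run that only prints the commands, so drop it to actually apply them:

```shell
# Placeholder role name; replace with the role Prowler will assume.
ROLE_NAME="prowler-audit-role"
for POLICY_ARN in \
    "arn:aws:iam::aws:policy/SecurityAudit" \
    "arn:aws:iam::aws:policy/job-function/ViewOnlyAccess"; do
  # Drop the echo to actually attach the policies.
  echo aws iam attach-role-policy --role-name "$ROLE_NAME" --policy-arn "$POLICY_ARN"
done
```

The same pattern works for the custom prowler-additions and Security Hub policies once they are created in your account.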
### Multi-Factor Authentication

@@ -39,7 +38,7 @@ If your IAM entity enforces MFA you can use `--mfa` and Prowler will ask you to
## Azure

Prowler for Azure supports the following authentication types:

- Service principal authentication by environment variables (Enterprise Application)
- Currently stored AZ CLI credentials
@@ -63,33 +62,61 @@ The other three cases does not need additional configuration, `--az-cli-auth` an
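For the service-principal option, the credentials are read from environment variables. A sketch with placeholder values, using the standard Azure SDK variable names:

```shell
# Placeholder values for an Enterprise Application (service principal).
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXXXX"
# Then run, e.g.: prowler azure --sp-env-auth
```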
### Permissions

To use each one you need to pass the proper flag to the execution. Prowler for Azure handles two types of permission scopes, which are:

- **Microsoft Entra ID permissions**: Used to retrieve metadata from the identity assumed by Prowler (not mandatory to have access to execute the tool).
- **Subscription scope permissions**: Required to launch the checks against your resources, mandatory to launch the tool.

#### Microsoft Entra ID scope

Microsoft Entra ID (formerly AAD) permissions required by the tool are the following:
- `Directory.Read.All`
- `Policy.Read.All`
- `UserAuthenticationMethod.Read.All`

The best way to assign them is through the Azure web console:

1. Access Microsoft Entra ID.
2. In the left menu bar, go to "App registrations".
3. Once there, in the menu bar click on "+ New registration" to register a new application.
4. Fill in the "Name", select the "Supported account types" and click on "Register". You will be redirected to the application's page.
    
5. Select the new application.
6. In the left menu bar, select "API permissions".
7. Then click on "+ Add a permission" and select "Microsoft Graph".
8. Once in the "Microsoft Graph" view, select "Application permissions".
9. Finally, search for "Directory", "Policy" and "UserAuthenticationMethod" and select the following permissions:
    - `Directory.Read.All`
    - `Policy.Read.All`
    - `UserAuthenticationMethod.Read.All`



#### Subscriptions scope

Regarding the subscription scope, Prowler by default scans all the subscriptions that it is able to list, so it is required to add the following built-in RBAC roles per subscription to the identity that is going to be assumed by the tool:

- `Security Reader`
- `Reader`

To assign these roles, follow the instructions:

1. Access your subscriptions, then select the subscription you want to audit.
2. Select "Access control (IAM)".
3. In the overview, select "Roles".

4. Click on "+ Add" and select "Add role assignment".
5. In the search bar, type `Security Reader`, select it and click on "Next".
6. In the "Members" tab, click on "+ Select members" and add the members you want to assign this role to.
7. Click on "Review + assign" to apply the new role.

*Repeat these steps for the `Reader` role.*
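The same assignments can be scripted with the Azure CLI. A sketch with placeholder IDs; the `echo` keeps this a dry run, so drop it to actually create the assignments:

```shell
# Placeholder object ID of the identity Prowler assumes, and subscription ID.
ASSIGNEE_ID="00000000-0000-0000-0000-000000000000"
SUBSCRIPTION_ID="11111111-1111-1111-1111-111111111111"
for ROLE in "Security Reader" "Reader"; do
  # Drop the echo to actually create the role assignments.
  echo az role assignment create \
    --assignee "$ASSIGNEE_ID" \
    --role "$ROLE" \
    --scope "/subscriptions/$SUBSCRIPTION_ID"
done
```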
## Google Cloud

### Authentication

Prowler will follow the same credentials search as [Google authentication libraries](https://cloud.google.com/docs/authentication/application-default-credentials#search_order):

@@ -97,10 +124,7 @@ Prowler will follow the same credentials search as [Google authentication librar

2. [User credentials set up by using the Google Cloud CLI](https://cloud.google.com/docs/authentication/application-default-credentials#personal)
3. [The attached service account, returned by the metadata server](https://cloud.google.com/docs/authentication/application-default-credentials#attached-sa)

Those credentials must be associated with a user or service account that has the proper permissions to perform all checks. To make sure, add the `Viewer` role to the member associated with the credentials.
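Granting that role can be scripted with `gcloud`; the project and member below are placeholders, and the `echo` keeps this a dry run:

```shell
# Placeholder project and member; adjust to your environment.
PROJECT_ID="my-project-id"
MEMBER="serviceAccount:prowler@my-project-id.iam.gserviceaccount.com"
# Drop the echo to actually grant the Viewer role.
echo gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="$MEMBER" \
  --role="roles/viewer"
```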
???+ note
    By default, `prowler` will scan all accessible GCP Projects; use the flag `--project-ids` to specify the projects to be scanned.
Binary files changed:

- (modified image) Before: 358 KiB → After: 376 KiB
- docs/img/page-IAM.png (new file, 348 KiB)
- docs/img/prowler-logo-black.png (new file, 9.2 KiB)
- docs/img/prowler-logo-white.png (new file, 11 KiB)
- docs/img/register-application.png (new file, 302 KiB)

docs/index.md (114 lines changed)
@@ -1,38 +1,13 @@

<p href="https://github.com/prowler-cloud/prowler">
<img align="right" src="./img/prowler-logo.png" height="100">
</p>
<br>
**Prowler** is an Open Source security tool to perform AWS, Azure and Google Cloud security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness. We have the Prowler CLI (Command Line Interface) that we call Prowler Open Source and a service on top of it that we call <a href="https://prowler.com">Prowler SaaS</a>.

# Prowler Documentation

**Welcome to [Prowler Open Source v3](https://github.com/prowler-cloud/prowler/) Documentation!** 📄

- You are currently in the **Getting Started** section, where you can find general information and requirements to help you start with the tool.
- In the [Tutorials](tutorials/overview) section you will see how to take advantage of all the features in Prowler.
- In the [Contact Us](contact) section you can find how to reach us in case of technical issues.
- In the [About](about) section you will find more information about the Prowler team and license.

[](https://twitter.com/prowlercloud)



Prowler offers hundreds of controls covering more than 25 standards and compliance frameworks like CIS, PCI-DSS, ISO27001, GDPR, HIPAA, FFIEC, SOC2, AWS FTR, ENS and custom security frameworks.
## Quick Start
### Installation

Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/), thus it can be installed using pip with `Python >= 3.9`:

=== "Generic"
@@ -40,7 +15,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo

* `Python >= 3.9`
* `Python pip >= 3.9`
* AWS, GCP, Azure and/or Kubernetes credentials

_Commands_:

@@ -54,7 +29,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo

_Requirements_:

* Have `docker` installed: https://docs.docker.com/get-docker/.
* AWS, GCP, Azure and/or Kubernetes credentials
* In the command below, change `-v` to your local directory path in order to access the reports.

_Commands_:
@@ -71,7 +46,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo

_Requirements for Ubuntu 20.04.3 LTS_:

* AWS, GCP, Azure and/or Kubernetes credentials
* Install Python 3.9 with: `sudo apt-get install python3.9`
* Remove Python 3.8 to avoid conflicts if you can: `sudo apt-get remove python3.8`
* Make sure you have the python3-distutils package installed: `sudo apt-get install python3-distutils`

@@ -91,7 +66,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo

_Requirements for Developers_:

* AWS, GCP, Azure and/or Kubernetes credentials
* `git`, `Python >= 3.9`, `pip` and `poetry` installed (`pip install poetry`)

_Commands_:
@@ -108,7 +83,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo

_Requirements_:

* AWS, GCP, Azure and/or Kubernetes credentials
* The latest Amazon Linux 2 should come with Python 3.9 already installed, however it may need pip. Install Python pip 3.9 with: `sudo yum install -y python3-pip`.
* Make sure setuptools for Python is already installed with: `pip3 install setuptools`

@@ -125,7 +100,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo

_Requirements_:

* `Brew` installed on your Mac or Linux
* AWS, GCP, Azure and/or Kubernetes credentials

_Commands_:
@@ -136,30 +111,25 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo

=== "AWS CloudShell"

After the migration of AWS CloudShell from Amazon Linux 2 to Amazon Linux 2023 [[1]](https://aws.amazon.com/about-aws/whats-new/2023/12/aws-cloudshell-migrated-al2023/) [[2]](https://docs.aws.amazon.com/cloudshell/latest/userguide/cloudshell-AL2023-migration.html), there is no longer a need to manually compile Python 3.9 as it is already included in AL2023. Prowler can thus be easily installed following the Generic installation method via pip. Follow the steps below to successfully execute Prowler v4 in AWS CloudShell:

_Requirements_:

* Open AWS CloudShell `bash`.

_Commands_:

```
sudo bash
adduser prowler
su prowler
pip install prowler
cd /tmp || exit
prowler aws
```

???+ note
    To download the results from AWS CloudShell, select Actions -> Download File and add the full path of each file. For the CSV file it will be something like `/home/cloudshell-user/output/prowler-output-123456789012-20221220191331.csv`
=== "Azure CloudShell"

@@ -194,16 +164,20 @@ You can run Prowler from your workstation, an EC2 instance, Fargate or any other



## Basic Usage

To run Prowler, you will need to specify the provider (e.g. `aws`, `gcp`, `azure` or `kubernetes`):

???+ note
    If no provider is specified, AWS will be used for backward compatibility with most of the v2 options.

```console
prowler <provider>
```



???+ note
    Running the `prowler` command without options will use your environment variable credentials; see the [Requirements](./getting-started/requirements.md) section to review the credentials settings.

If you miss the former output you can use `--verbose`, but Prowler v4 is smoking fast, so you won't see much ;)
By default, Prowler will generate CSV, JSON and HTML reports; however, you can generate a JSON-ASFF report (used by AWS Security Hub) with `-M` or `--output-modes`:
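For instance, an illustrative invocation requesting several formats at once (the `echo` keeps this a dry run):

```shell
# Illustrative: CSV plus JSON-ASFF, the format AWS Security Hub consumes.
CMD="prowler aws -M csv json-asff"
echo "$CMD"
```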
@@ -227,6 +201,7 @@ For executing specific checks or services you can use options `-c`/`checks` or `

```console
prowler azure --checks storage_blob_public_access_level_is_disabled
prowler aws --services s3 ec2
prowler gcp --services iam compute
prowler kubernetes --services etcd apiserver
```

Also, checks and services can be excluded with options `-e`/`--excluded-checks` or `--excluded-services`:

@@ -235,6 +210,7 @@ Also, checks and services can be excluded with options `-e`/`--excluded-checks`

```console
prowler aws --excluded-checks s3_bucket_public_access
prowler azure --excluded-services defender iam
prowler gcp --excluded-services kms
prowler kubernetes --excluded-services controllermanager
```

More options and execution methods that will save your time in [Miscellaneous](tutorials/misc.md).
@@ -252,7 +228,9 @@ Use a custom AWS profile with `-p`/`--profile` and/or AWS regions which you want

```console
prowler aws --profile custom-profile -f us-east-1 eu-south-2
```

???+ note
    By default, `prowler` will scan all AWS regions.

See more details about AWS Authentication in [Requirements](getting-started/requirements.md)

@@ -302,3 +280,27 @@ prowler gcp --project-ids <Project ID 1> <Project ID 2> ... <Project ID N>

```console
prowler gcp --project-ids <Project ID 1> <Project ID 2> ... <Project ID N>
```

See more details about GCP Authentication in [Requirements](getting-started/requirements.md)
## Kubernetes

Prowler allows you to scan your Kubernetes cluster either from within the cluster or from outside of it.

For non in-cluster execution, you can provide the location of the kubeconfig file with the following argument:

```console
prowler kubernetes --kubeconfig-file path
```

For in-cluster execution, you can use the supplied yaml to run Prowler as a job:
```console
kubectl apply -f job.yaml
kubectl apply -f prowler-role.yaml
kubectl apply -f prowler-rolebinding.yaml
kubectl get pods --> prowler-XXXXX
kubectl logs prowler-XXXXX
```

> By default, `prowler` will scan all namespaces in your active Kubernetes context; use the flag `--context` to specify the context to be scanned and `--namespaces` to specify the namespaces to be scanned.

## Prowler v2 Documentation

For **Prowler v2 Documentation**, please check it out [here](https://github.com/prowler-cloud/prowler/blob/8818f47333a0c1c1a457453c87af0ea5b89a385f/README.md).
@@ -13,9 +13,9 @@ As an **AWS Partner** and we have passed the [AWS Foundation Technical Review (FT

## Reporting Vulnerabilities

If you would like to report a vulnerability or have a security concern regarding Prowler Open Source or the Prowler SaaS service, please submit the information by contacting us via [**support.prowler.com**](http://support.prowler.com).

The information you share with the Prowler team as part of this process is kept confidential within Prowler. We will only share this information with a third party if the vulnerability you report is found to affect a third-party product, in which case we will share this information with the third-party product's author or manufacturer. Otherwise, we will only share this information as permitted by you.

We will review the submitted report and assign it a tracking number. We will then respond to you, acknowledging receipt of the report, and outline the next steps in the process.
@@ -11,4 +11,4 @@

This error is also related to a lack of system resources. To improve performance, Prowler stores information in memory, so it may need to be run on a system with more than 1 GB of memory.
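A quick way to check that a Linux host clears that bar before running Prowler (a sketch assuming `free` from procps is available):

```shell
# Prints OK when the machine has more than 1 GB of total memory (Linux).
TOTAL_MB="$(free -m | awk '/^Mem:/ {print $2}')"
if [ "$TOTAL_MB" -gt 1024 ]; then
  echo "OK: ${TOTAL_MB} MB"
else
  echo "Low memory: ${TOTAL_MB} MB"
fi
```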
See section [Logging](./tutorials/logging.md) for further information or [contact us](./contact.md).
@@ -16,12 +16,19 @@ export AWS_SESSION_TOKEN="XXXXXXXXX"

Those credentials must be associated with a user or role that has the proper permissions to perform all checks. To make sure, add the following AWS managed policies to the user or role being used:

- `arn:aws:iam::aws:policy/SecurityAudit`
- `arn:aws:iam::aws:policy/job-function/ViewOnlyAccess`

???+ note
    Moreover, some additional read-only permissions are needed for several checks, so make sure you also attach the custom policy [prowler-additions-policy.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-additions-policy.json) to the role you are using. If you want Prowler to send findings to [AWS Security Hub](https://aws.amazon.com/security-hub), make sure you also attach the custom policy [prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-security-hub.json).

## Profiles

Prowler can use your custom AWS Profile with:
```console
prowler <provider> -p/--profile <profile_name>
```

## Multi-Factor Authentication

@@ -29,7 +36,3 @@ If your IAM entity enforces MFA you can use `--mfa` and Prowler will ask you to

- ARN of your MFA device
- TOTP (Time-Based One-Time Password)
## STS Endpoint Region

If you are using Prowler in AWS regions that are not enabled by default, you need to use the argument `--sts-endpoint-region` to point the AWS STS API calls `assume-role` and `get-caller-identity` to the non-default region, e.g.: `prowler aws --sts-endpoint-region eu-south-2`.

@@ -32,3 +32,14 @@ Prowler's AWS Provider uses the Boto3 [Standard](https://boto3.amazonaws.com/v1/

- Retry attempts on non-descriptive, transient error codes, specifically these HTTP status codes: 500, 502, 503, 504.

- Any retry attempt will include an exponential backoff by a base factor of 2, for a maximum backoff time of 20 seconds.

## Notes for validating retry attempts

If you are making changes to Prowler and want to validate whether requests are being retried or given up on, you can take the following approach:

* Run Prowler with `--log-level DEBUG` and `--log-file debuglogs.txt`
* Search for retry attempts using `grep -i 'Retry needed' debuglogs.txt`

This is based on the [AWS documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/retries.html#checking-retry-attempts-in-your-client-logs), which states that if a retry is performed, you will see a message starting with "Retry needed".

You can determine the total number of calls made using `grep -i 'Sending http request' debuglogs.txt | wc -l`
@@ -1,26 +1,33 @@

# AWS CloudShell

## Installation

After the migration of AWS CloudShell from Amazon Linux 2 to Amazon Linux 2023 [[1]](https://aws.amazon.com/about-aws/whats-new/2023/12/aws-cloudshell-migrated-al2023/) [[2]](https://docs.aws.amazon.com/cloudshell/latest/userguide/cloudshell-AL2023-migration.html), there is no longer a need to manually compile Python 3.9 as it is already included in AL2023. Prowler can thus be easily installed following the Generic installation method via pip. Follow the steps below to successfully execute Prowler v4 in AWS CloudShell:
```shell
sudo bash
adduser prowler
su prowler
pip install prowler
cd /tmp || exit
prowler aws
```

## Download Files

To download the results from AWS CloudShell, select Actions -> Download File and add the full path of each file. For the CSV file it will be something like `/home/cloudshell-user/output/prowler-output-123456789012-20221220191331.csv`

## Clone Prowler from GitHub

The limited storage that AWS CloudShell provides for the user's home directory causes issues when installing the poetry dependencies to run Prowler from GitHub. Here is a workaround:
```shell
sudo bash
adduser prowler
su prowler
git clone https://github.com/prowler-cloud/prowler.git
cd prowler
pip install poetry
mkdir /tmp/poetry
poetry config cache-dir /tmp/poetry
poetry shell
poetry install
python prowler.py -v
```
Binary files added:

- docs/tutorials/aws/img/enable-2.png (new file, 341 KiB)
- docs/tutorials/aws/img/enable-partner-integration-2.png (new file, 291 KiB)
- docs/tutorials/aws/img/enable-partner-integration-3.png (new file, 306 KiB)