Compare commits
395 Commits
5.5.0...improve-de
| Author | SHA1 | Date |
|---|---|---|
|  | 01fd759f2a |  |
|  | 07b9e1d3a4 |  |
|  | 96a879d761 |  |
|  | 283127c3f4 |  |
|  | beeee80a0b |  |
|  | 06b62826b4 |  |
|  | d0736af209 |  |
|  | 716c8c1a5f |  |
|  | e6cdda1bd9 |  |
|  | 2747a633bc |  |
|  | 74118f5cfe |  |
|  | 598bdf28bb |  |
|  | d75f681c87 |  |
|  | c7956ede6a |  |
|  | 64f5a69e84 |  |
|  | bfb15c34b8 |  |
|  | 638b3ac0cd |  |
|  | 9d6147a037 |  |
|  | 802c786ac2 |  |
|  | c8be8dbd9a |  |
|  | 7053b2bb37 |  |
|  | 447bf832cd |  |
|  | 7c4571b55e |  |
|  | eb7c16aba5 |  |
|  | b09e83b171 |  |
|  | bb149a30a7 |  |
|  | d5be35af49 |  |
|  | f6aa56d92b |  |
|  | 6a4df15c47 |  |
|  | 72de5fdb1b |  |
|  | a7f55d06af |  |
|  | 97da78d4e7 |  |
|  | c4f6161c73 |  |
|  | db7ffea24d |  |
|  | 489b5abf82 |  |
|  | 3a55c2ee07 |  |
|  | 64d866271c |  |
|  | 1ab2a80eab |  |
|  | 89d4c521ba |  |
|  | f2e19d377a |  |
|  | 2b7b887b87 |  |
|  | 44c70b5d01 |  |
|  | 7514484c42 |  |
|  | 9594c4c99f |  |
|  | 56445c9753 |  |
|  | 07419fd5e1 |  |
|  | 2e4dd12b41 |  |
|  | fed2046c49 |  |
|  | db79db4786 |  |
|  | 6f027e3c57 |  |
|  | bdb877009f |  |
|  | 6564ec1ff5 |  |
|  | 443dc067b3 |  |
|  | 6221650c5f |  |
|  | 034d0fd1f4 |  |
|  | e617ff0460 |  |
|  | 4b1ed607a7 |  |
|  | 137365a670 |  |
|  | 1891a1b24f |  |
|  | e57e070866 |  |
|  | 66998cd1ad |  |
|  | c0b1833446 |  |
|  | 329a72c77c |  |
|  | 2610ee9d0c |  |
|  | a13ca9034e |  |
|  | 5d1abb3689 |  |
|  | e1d1c6d154 |  |
|  | e18e0e7cd4 |  |
|  | eaf3d07a3f |  |
|  | c88ae32b7f |  |
|  | 605613e220 |  |
|  | d2772000ec |  |
|  | 42939a79f5 |  |
|  | ed17931117 |  |
|  | 66df5f7a1c |  |
|  | fc6e6696e5 |  |
|  | 465748c8a1 |  |
|  | e59cd71bbf |  |
|  | 8a76fea310 |  |
|  | 0e46be54ec |  |
|  | dc81813fdf |  |
|  | eaa0df16bb |  |
|  | c23e911028 |  |
|  | 06b96a1007 |  |
|  | fa545c591f |  |
|  | e828b780c7 |  |
|  | eca8c5cabd |  |
|  | b7bce6008f |  |
|  | 2fdf89883d |  |
|  | 6c5d4bbaaa |  |
|  | cb2f926d4f |  |
|  | 12c01b437e |  |
|  | 3253a58942 |  |
|  | 199f7f14ea |  |
|  | d42406d765 |  |
|  | 2276ffb1f6 |  |
|  | 218fb3afb0 |  |
|  | a9fb890979 |  |
|  | 54ebf5b455 |  |
|  | c9a0475aa8 |  |
|  | 5567d9f88c |  |
|  | 56f3e661ae |  |
|  | 1aa4479a10 |  |
|  | 7b625d0a91 |  |
|  | fd0529529d |  |
|  | af43191954 |  |
|  | 2ce2ca7c91 |  |
|  | a0fc3db665 |  |
|  | feb458027f |  |
|  | e5a5b7af5c |  |
|  | ad456ae2fe |  |
|  | 690cb51f6c |  |
|  | 14aaa2f376 |  |
|  | 6e47ca2c41 |  |
|  | 0d99d2be9b |  |
|  | c322ef00e7 |  |
|  | 3513421225 |  |
|  | b0e6bfbefe |  |
|  | f7a918730e |  |
|  | cef33319c5 |  |
|  | 2036a59210 |  |
|  | e5eccb6227 |  |
|  | 48c2c8567c |  |
|  | bbeef0299f |  |
|  | bec5584d63 |  |
|  | bdc759d34c |  |
|  | 8db442d8ba |  |
|  | 9e7a0d4175 |  |
|  | 9c33b3f5a9 |  |
|  | 7e7e2c87dc |  |
|  | 2f741f35a8 |  |
|  | c411466df7 |  |
|  | 9679939307 |  |
|  | 8539423b22 |  |
|  | 81edafdf09 |  |
|  | e0a262882a |  |
|  | 89237ab99e |  |
|  | 0f414e451e |  |
|  | 1180522725 |  |
|  | 81c7ebf123 |  |
|  | 258f05e6f4 |  |
|  | 53efb1c153 |  |
|  | 26014a9705 |  |
|  | 00ef037e45 |  |
|  | 669ec74e67 |  |
|  | c4528200b0 |  |
|  | ba7cd0250a |  |
|  | c5e97678a1 |  |
|  | 337a46cdcc |  |
|  | 7f74b67f1f |  |
|  | 5dcc48d2e5 |  |
|  | 8b04aab07d |  |
|  | eab4f6cf2e |  |
|  | 7f8d623283 |  |
|  | dbffed8f1f |  |
|  | 7e3688fdd0 |  |
|  | 2e111e9ad3 |  |
|  | 6d6070ff3f |  |
|  | 391bbde353 |  |
|  | 3c56eb3762 |  |
|  | 7c14ea354b |  |
|  | c96aad0b77 |  |
|  | a9dd3e424b |  |
|  | 8a144a4046 |  |
|  | 75f86d7267 |  |
|  | bbf875fc2f |  |
|  | 59d491f61b |  |
|  | ed640a1324 |  |
|  | e86fbcaef7 |  |
|  | 7f48212054 |  |
|  | a2c5c71baf |  |
|  | b904f81cb9 |  |
|  | d64fe374dd |  |
|  | fe25e7938e |  |
|  | 931df361bf |  |
|  | d7c45f4aee |  |
|  | 5e5bef581b |  |
|  | 2d9e95d812 |  |
|  | e5f979d106 |  |
|  | c7a5815203 |  |
|  | 03e268722e |  |
|  | 78a2774329 |  |
|  | c1b5ab7f53 |  |
|  | b861d97ad4 |  |
|  | f3abcc9dd6 |  |
|  | cab13fe018 |  |
|  | cc4b19c7ce |  |
|  | a754d9aee5 |  |
|  | 22b54b2d8d |  |
|  | d12ca6301a |  |
|  | bc1b2ad9ab |  |
|  | 1782ab1514 |  |
|  | 0384fc50e3 |  |
|  | cc46dee9ee |  |
|  | ed5a0ae45a |  |
|  | 928ccfefb8 |  |
|  | 7f6bfb7b3e |  |
|  | bcbc9bf675 |  |
|  | 0ec4366f4c |  |
|  | ff72b7eea1 |  |
|  | a32ca19251 |  |
|  | b79508956a |  |
|  | d76c5bd658 |  |
|  | 580e11126c |  |
|  | 736d40546a |  |
|  | 88810d2bb5 |  |
|  | 3a8f4d2ffb |  |
|  | 1fe125a65f |  |
|  | 0ff4df0836 |  |
|  | 16b4775e2d |  |
|  | c3a13b8a29 |  |
|  | d1053375b7 |  |
|  | 0fa4538256 |  |
|  | 738644f288 |  |
|  | 2f80b055ac |  |
|  | fd62a1df10 |  |
|  | a85d0ebd0a |  |
|  | 2c06902baa |  |
|  | 76ac6429fe |  |
|  | 43cae66b0d |  |
|  | dacddecc7d |  |
|  | dcb9267c2f |  |
|  | ff35fd90fa |  |
|  | 7469377079 |  |
|  | c8441f8d38 |  |
|  | abf4eb0ffc |  |
|  | 93717cc830 |  |
|  | b629bc81f8 |  |
|  | f628897fe1 |  |
|  | 54b82a78e3 |  |
|  | 377faf145f |  |
|  | 69e316948f |  |
|  | 62cbff4f53 |  |
|  | 5582265e9d |  |
|  | fb5ea3c324 |  |
|  | 9b5f676f50 |  |
|  | 88cfc0fa7e |  |
|  | 665bfa2f13 |  |
|  | b89b1a64f4 |  |
|  | 9ba657c261 |  |
|  | bce958b8e6 |  |
|  | 914012de2b |  |
|  | 8d1c476aed |  |
|  | 567c729e9e |  |
|  | 3f03dd20e4 |  |
|  | 1c778354da |  |
|  | 3a149fa459 |  |
|  | f3b121950d |  |
|  | 43c13b7ba1 |  |
|  | 9447b33800 |  |
|  | 2934752eeb |  |
|  | dd6d8c71fd |  |
|  | 80267c389b |  |
|  | acfbaf75d5 |  |
|  | 5f54377407 |  |
|  | 552aa64741 |  |
|  | d64f611f51 |  |
|  | a96cc92d77 |  |
|  | 3858cccc41 |  |
|  | 072828512a |  |
|  | a73ffe5642 |  |
|  | 8e784a5b6d |  |
|  | 1b6f9332f1 |  |
|  | db8b472729 |  |
|  | 867b371522 |  |
|  | c0d7c9fc7d |  |
|  | bb4685cf90 |  |
|  | 6a95426749 |  |
|  | ef6af8e84d |  |
|  | 763130f253 |  |
|  | 1256c040e9 |  |
|  | 18b7b48a99 |  |
|  | 627c11503f |  |
|  | 712ba84f06 |  |
|  | 5186e029b3 |  |
|  | 5bfaedf903 |  |
|  | 5061da6897 |  |
|  | c159a28016 |  |
|  | 82a1b1c921 |  |
|  | bf2210d0f4 |  |
|  | 8f0772cb94 |  |
|  | 5b57079ecd |  |
|  | 350d759517 |  |
|  | edd793c9f5 |  |
|  | 545c2dc685 |  |
|  | 84955c066c |  |
|  | 06dd03b170 |  |
|  | 47bc2ed2dc |  |
|  | 44281afc54 |  |
|  | 4d2859d145 |  |
|  | 45d44a1669 |  |
|  | ddd83b340e |  |
|  | ccdb54d7c3 |  |
|  | bcc246d950 |  |
|  | 62139e252a |  |
|  | 86950c3a0a |  |
|  | f4865ef68d |  |
|  | ea7209e7ae |  |
|  | 998c551cf3 |  |
|  | e6f29b0116 |  |
|  | eb90bb39dc |  |
|  | ad189b35ad |  |
|  | 7d2989a233 |  |
|  | 862137ae7d |  |
|  | c86e082d9a |  |
|  | 80fe048f97 |  |
|  | f2bffb3ce7 |  |
|  | cbe2f9eef8 |  |
|  | 688f41f570 |  |
|  | a29197637e |  |
|  | 7a2712a37f |  |
|  | 189f5cfd8c |  |
|  | e509480892 |  |
|  | 7f7955351a |  |
|  | 46f1db21a8 |  |
|  | fbe7bc6951 |  |
|  | f658507847 |  |
|  | 374078683b |  |
|  | 114c4e0886 |  |
|  | 67c62766d4 |  |
|  | 3f2947158d |  |
|  | 278a7cb356 |  |
|  | 890158a79c |  |
|  | 4dc1602b77 |  |
|  | bbba0abac9 |  |
|  | d04fd807c6 |  |
|  | 3456df4cf1 |  |
|  | f56aaa791e |  |
|  | 465a758770 |  |
|  | 0f7c0c1b2c |  |
|  | bf8d10b6f6 |  |
|  | 20d04553d6 |  |
|  | b56d62e3c4 |  |
|  | 9a332dcba1 |  |
|  | 166d9f8823 |  |
|  | 42f5eed75f |  |
|  | 01a7db18dd |  |
|  | d4507465a3 |  |
|  | 3ac92ed10a |  |
|  | 43c76ca85c |  |
|  | 54d87fa96a |  |
|  | f041f17268 |  |
|  | 31c80a6967 |  |
|  | 783ce136f4 |  |
|  | f829145781 |  |
|  | 389337f8cd |  |
|  | a0713c2d66 |  |
|  | f94d3cbce4 |  |
|  | 8d8994b468 |  |
|  | 784a9097a5 |  |
|  | b9601626e3 |  |
|  | dc80b011f2 |  |
|  | ee7d32d460 |  |
|  | 43fd9ee94e |  |
|  | 8821a91f3f |  |
|  | 98d9256f92 |  |
|  | b35495eaa7 |  |
|  | 74d6b614b3 |  |
|  | dd63c16a74 |  |
|  | 4280266a96 |  |
|  | b1f02098ff |  |
|  | 95189b574a |  |
|  | c5d23503bf |  |
|  | 77950f6069 |  |
|  | ec5f2b3753 |  |
|  | 9e7104fb7f |  |
|  | 6b3b6ca45e |  |
|  | 20b8b0b24e |  |
|  | 4e11540458 |  |
|  | ee87f2676d |  |
|  | 74a90aab98 |  |
|  | 48ff9a5100 |  |
|  | 3dfd578ee5 |  |
|  | 0db46cdc81 |  |
|  | fdac58d031 |  |
|  | df9d4ce856 |  |
|  | e6ae4e97e8 |  |
|  | 10a4c28922 |  |
|  | 8a828c6e51 |  |
|  | d7b40905ff |  |
|  | f9a3b5f3cd |  |
|  | b73b89242f |  |
|  | 23a0f6e8de |  |
|  | 87967abc3f |  |
|  | ce60c286dc |  |
|  | 90fd9b0eb8 |  |
|  | ca262a6797 |  |
|  | c056d39775 |  |
|  | 1c4426ea4b |  |
|  | 36520bd7a1 |  |
|  | badf0ace76 |  |
|  | f1f61249e0 |  |
|  | b371cac18c |  |
|  | 1846535d8d |  |
|  | d7d9118b9b |  |
.github/workflows/api-codeql.yml

@@ -48,12 +48,12 @@ jobs:
 
       # Initializes the CodeQL tools for scanning.
       - name: Initialize CodeQL
-        uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
+        uses: github/codeql-action/init@5f8171a638ada777af81d42b55959a643bb29017 # v3.28.12
         with:
           languages: ${{ matrix.language }}
           config-file: ./.github/codeql/api-codeql-config.yml
 
       - name: Perform CodeQL Analysis
-        uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
+        uses: github/codeql-action/analyze@5f8171a638ada777af81d42b55959a643bb29017 # v3.28.12
         with:
           category: "/language:${{matrix.language}}"
.github/workflows/api-pull-request.yml

@@ -75,7 +75,7 @@ jobs:
 
       - name: Test if changes are in not ignored paths
        id: are-non-ignored-files-changed
-        uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
+        uses: tj-actions/changed-files@2f7c5bfce28377bc069a65ba478de0a74aa0ca32 # v46.0.1
         with:
           files: api/**
           files_ignore: |
@@ -94,7 +94,7 @@ jobs:
 
       - name: Set up Python ${{ matrix.python-version }}
        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
-        uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
+        uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5.4.0
         with:
           python-version: ${{ matrix.python-version }}
           cache: "poetry"
.github/workflows/find-secrets.yml

@@ -11,7 +11,7 @@ jobs:
         with:
           fetch-depth: 0
       - name: TruffleHog OSS
-        uses: trufflesecurity/trufflehog@690e5c7aff8347c3885096f3962a0633d9129607 # v3.88.23
+        uses: trufflesecurity/trufflehog@ded5f45b92c00939718787ce586b520bbe795f3b # v3.88.18
         with:
           path: ./
           base: ${{ github.event.repository.default_branch }}
@@ -62,13 +62,13 @@ jobs:
         uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
 
       - name: Setup Python
-        uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
+        uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5.4.0
         with:
           python-version: ${{ env.PYTHON_VERSION }}
 
       - name: Install Poetry
         run: |
-          pipx install poetry==2.*
+          pipx install poetry==1.8.5
           pipx inject poetry poetry-bumpversion
 
       - name: Get Prowler version
.github/workflows/sdk-codeql.yml

@@ -54,12 +54,12 @@ jobs:
 
       # Initializes the CodeQL tools for scanning.
       - name: Initialize CodeQL
-        uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
+        uses: github/codeql-action/init@5f8171a638ada777af81d42b55959a643bb29017 # v3.28.12
         with:
           languages: ${{ matrix.language }}
           config-file: ./.github/codeql/sdk-codeql-config.yml
 
       - name: Perform CodeQL Analysis
-        uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
+        uses: github/codeql-action/analyze@5f8171a638ada777af81d42b55959a643bb29017 # v3.28.12
         with:
           category: "/language:${{matrix.language}}"
.github/workflows/sdk-pull-request.yml

@@ -25,7 +25,7 @@ jobs:
 
       - name: Test if changes are in not ignored paths
        id: are-non-ignored-files-changed
-        uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
+        uses: tj-actions/changed-files@2f7c5bfce28377bc069a65ba478de0a74aa0ca32 # v46.0.1
         with:
           files: ./**
           files_ignore: |
@@ -50,7 +50,7 @@ jobs:
 
       - name: Set up Python ${{ matrix.python-version }}
        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
-        uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
+        uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5.4.0
         with:
           python-version: ${{ matrix.python-version }}
           cache: "poetry"
.github/workflows/sdk-pypi-release.yml

@@ -68,10 +68,10 @@ jobs:
 
       - name: Install dependencies
         run: |
-          pipx install poetry==2.1.1
+          pipx install poetry==1.8.5
 
       - name: Setup Python
-        uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
+        uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5.4.0
         with:
           python-version: ${{ env.PYTHON_VERSION }}
           cache: ${{ env.CACHE }}

@@ -28,7 +28,7 @@ jobs:
           ref: ${{ env.GITHUB_BRANCH }}
 
       - name: setup python
-        uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
+        uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5.4.0
         with:
           python-version: 3.9 #install the python needed
.github/workflows/ui-codeql.yml

@@ -48,12 +48,12 @@ jobs:
 
       # Initializes the CodeQL tools for scanning.
       - name: Initialize CodeQL
-        uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
+        uses: github/codeql-action/init@5f8171a638ada777af81d42b55959a643bb29017 # v3.28.12
         with:
           languages: ${{ matrix.language }}
           config-file: ./.github/codeql/ui-codeql-config.yml
 
       - name: Perform CodeQL Analysis
-        uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
+        uses: github/codeql-action/analyze@5f8171a638ada777af81d42b55959a643bb29017 # v3.28.12
         with:
           category: "/language:${{matrix.language}}"
@@ -72,11 +72,10 @@ It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, Fe
 
 | Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
 |---|---|---|---|---|
 | AWS | 564 | 82 | 33 | 10 |
-| GCP | 78 | 13 | 7 | 3 |
+| GCP | 77 | 13 | 6 | 3 |
 | Azure | 140 | 18 | 7 | 3 |
 | Kubernetes | 83 | 7 | 4 | 7 |
 | Microsoft365 | 5 | 2 | 1 | 0 |
 | NHN (Unofficial) | 6 | 2 | 1 | 0 |
 
 > You can list the checks, services, compliance frameworks and categories with `prowler <provider> --list-checks`, `prowler <provider> --list-services`, `prowler <provider> --list-compliance` and `prowler <provider> --list-categories`.
@@ -53,6 +53,3 @@ DJANGO_GOOGLE_OAUTH_CALLBACK_URL=""
 DJANGO_GITHUB_OAUTH_CLIENT_ID=""
 DJANGO_GITHUB_OAUTH_CLIENT_SECRET=""
 DJANGO_GITHUB_OAUTH_CALLBACK_URL=""
-
-# Deletion Task Batch Size
-DJANGO_DELETION_BATCH_SIZE=5000
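The `DJANGO_DELETION_BATCH_SIZE` variable removed above sizes the chunks that deletion tasks work through. A minimal, hypothetical sketch (not the API's actual settings code) of how such a variable is typically consumed:

```python
import os

# Hypothetical sketch: read the batch size, defaulting to the documented 5000.
DELETION_BATCH_SIZE = int(os.environ.get("DJANGO_DELETION_BATCH_SIZE", "5000"))


def batched(ids, size=DELETION_BATCH_SIZE):
    """Yield successive chunks of at most `size` ids, as a deletion task might."""
    for start in range(0, len(ids), size):
        yield ids[start:start + size]
```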
@@ -4,37 +4,12 @@ All notable changes to the **Prowler API** are documented in this file.
 
 ---
 
-## [v1.6.0] (Prowler v5.5.0)
+## [v1.6.0] (Prowler UNRELEASED)
 
 ### Added
 
 - Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167).
 - HTTP Security Headers [(#7289)](https://github.com/prowler-cloud/prowler/pull/7289).
 - New endpoint to get the compliance overviews metadata [(#7333)](https://github.com/prowler-cloud/prowler/pull/7333).
-- Support for muted findings [(#7378)](https://github.com/prowler-cloud/prowler/pull/7378).
-- Added missing fields to API findings and resources [(#7318)](https://github.com/prowler-cloud/prowler/pull/7318).
-
----
-
-## [v1.5.4] (Prowler v5.4.4)
-
-### Fixed
-- Fixed a bug with periodic tasks when trying to delete a provider ([#7466])(https://github.com/prowler-cloud/prowler/pull/7466).
-
----
-
-## [v1.5.3] (Prowler v5.4.3)
-
-### Fixed
-- Added duplicated scheduled scans handling ([#7401])(https://github.com/prowler-cloud/prowler/pull/7401).
-- Added environment variable to configure the deletion task batch size ([#7423])(https://github.com/prowler-cloud/prowler/pull/7423).
-
----
-
-## [v1.5.2] (Prowler v5.4.2)
-
-### Changed
-- Refactored deletion logic and implemented retry mechanism for deletion tasks [(#7349)](https://github.com/prowler-cloud/prowler/pull/7349).
-
 
 ---
 
@@ -46,10 +21,6 @@ All notable changes to the **Prowler API** are documented in this file.
 - Handled exception when a provider has no secret in test connection [(#7283)](https://github.com/prowler-cloud/prowler/pull/7283).
 
-
-### Added
-
-- Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167).
 
 ---
 
 ## [v1.5.0] (Prowler v5.4.0)
api/poetry.lock

@@ -331,14 +331,14 @@ aio = ["aiohttp (>=3.0)"]
 
 [[package]]
 name = "azure-identity"
-version = "1.21.0"
+version = "1.19.0"
 description = "Microsoft Azure Identity Library for Python"
 optional = false
 python-versions = ">=3.8"
 groups = ["main"]
 files = [
-    {file = "azure_identity-1.21.0-py3-none-any.whl", hash = "sha256:258ea6325537352440f71b35c3dffe9d240eae4a5126c1b7ce5efd5766bd9fd9"},
-    {file = "azure_identity-1.21.0.tar.gz", hash = "sha256:ea22ce6e6b0f429bc1b8d9212d5b9f9877bd4c82f1724bfa910760612c07a9a6"},
+    {file = "azure_identity-1.19.0-py3-none-any.whl", hash = "sha256:e3f6558c181692d7509f09de10cca527c7dce426776454fb97df512a46527e81"},
+    {file = "azure_identity-1.19.0.tar.gz", hash = "sha256:500144dc18197d7019b81501165d4fa92225f03778f17d7ca8a2a180129a9c83"},
 ]
 
 [package.dependencies]
@@ -368,21 +368,20 @@ typing-extensions = ">=4.0.1"
 
 [[package]]
 name = "azure-mgmt-applicationinsights"
-version = "4.1.0"
+version = "4.0.0"
 description = "Microsoft Azure Application Insights Management Client Library for Python"
 optional = false
-python-versions = ">=3.8"
+python-versions = ">=3.7"
 groups = ["main"]
 files = [
-    {file = "azure_mgmt_applicationinsights-4.1.0-py3-none-any.whl", hash = "sha256:9e71f29b01e505a773501451d12fd6a10482cf4b13e9ac2bff72f5380496d979"},
-    {file = "azure_mgmt_applicationinsights-4.1.0.tar.gz", hash = "sha256:15531390f12ce3d767cd3f1949af36aa39077c145c952fec4d80303c86ec7b6c"},
+    {file = "azure-mgmt-applicationinsights-4.0.0.zip", hash = "sha256:50c3db05573e0cc2d56314a0556fb346ef05ec489ac000f4d720d92c6b647e06"},
+    {file = "azure_mgmt_applicationinsights-4.0.0-py3-none-any.whl", hash = "sha256:2b1ffd9a0114974455795c73a3a5d17c849e32b961d707d2db393b99254b576f"},
 ]
 
 [package.dependencies]
-azure-common = ">=1.1"
-azure-mgmt-core = ">=1.3.2"
-isodate = ">=0.6.1"
-typing-extensions = ">=4.6.0"
+azure-common = ">=1.1,<2.0"
+azure-mgmt-core = ">=1.3.2,<2.0.0"
+isodate = ">=0.6.1,<1.0.0"
 
 [[package]]
 name = "azure-mgmt-authorization"
@@ -421,32 +420,31 @@ typing-extensions = ">=4.6.0"
 
 [[package]]
 name = "azure-mgmt-containerregistry"
-version = "12.0.0"
+version = "10.3.0"
 description = "Microsoft Azure Container Registry Client Library for Python"
 optional = false
-python-versions = ">=3.8"
+python-versions = ">=3.7"
 groups = ["main"]
 files = [
-    {file = "azure_mgmt_containerregistry-12.0.0-py3-none-any.whl", hash = "sha256:464abd4d3d9ecc0456ed8f63a6b9b93afc2e3e194f2d34f26a758afb67ad3b5c"},
-    {file = "azure_mgmt_containerregistry-12.0.0.tar.gz", hash = "sha256:f19f8faa7881deaf2b5015c0eb050a92e2380cd9d18dee33cdb5f27d44a06c03"},
+    {file = "azure-mgmt-containerregistry-10.3.0.tar.gz", hash = "sha256:ae21651855dfb19c42d91d6b3a965c6c611e23f8bc4bf7138835e652d2f918e3"},
+    {file = "azure_mgmt_containerregistry-10.3.0-py3-none-any.whl", hash = "sha256:851e1c57f9bc4a3589c6b21fb627c11fd6cbb57a0388b7dfccd530ba3160805f"},
 ]
 
 [package.dependencies]
-azure-common = ">=1.1"
-azure-mgmt-core = ">=1.3.2"
-isodate = ">=0.6.1"
-typing-extensions = ">=4.6.0"
+azure-common = ">=1.1,<2.0"
+azure-mgmt-core = ">=1.3.2,<2.0.0"
+isodate = ">=0.6.1,<1.0.0"
 
 [[package]]
 name = "azure-mgmt-containerservice"
-version = "34.1.0"
+version = "34.0.0"
 description = "Microsoft Azure Container Service Management Client Library for Python"
 optional = false
 python-versions = ">=3.8"
 groups = ["main"]
 files = [
-    {file = "azure_mgmt_containerservice-34.1.0-py3-none-any.whl", hash = "sha256:1faa1714e0100c6ee4cfb8d2eadb1c270b548a84b0070c74e9fe646056a5cb12"},
-    {file = "azure_mgmt_containerservice-34.1.0.tar.gz", hash = "sha256:637a6cf8f06636c016ad151d76f9c7ba75bd05d4334b3dd7837eb8b517f30dbe"},
+    {file = "azure_mgmt_containerservice-34.0.0-py3-none-any.whl", hash = "sha256:34be8172241e3c2c444682407970a938f60e3b2bd06304eaae0a1ba641f2262d"},
+    {file = "azure_mgmt_containerservice-34.0.0.tar.gz", hash = "sha256:822d07828b746a5ea5408a8b3770f41dc424d6c4c28de53c29611b62bef8aea3"},
 ]
 
 [package.dependencies]
@@ -560,14 +558,14 @@ msrest = ">=0.6.21"
 
 [[package]]
 name = "azure-mgmt-resource"
-version = "23.3.0"
+version = "23.2.0"
 description = "Microsoft Azure Resource Management Client Library for Python"
 optional = false
 python-versions = ">=3.8"
 groups = ["main"]
 files = [
-    {file = "azure_mgmt_resource-23.3.0-py3-none-any.whl", hash = "sha256:ab216ee28e29db6654b989746e0c85a1181f66653929d2cb6e48fba66d9af323"},
-    {file = "azure_mgmt_resource-23.3.0.tar.gz", hash = "sha256:fc4f1fd8b6aad23f8af4ed1f913df5f5c92df117449dc354fea6802a2829fea4"},
+    {file = "azure_mgmt_resource-23.2.0-py3-none-any.whl", hash = "sha256:7af2bca928ecd58e57ea7f7731d245f45e9d927036d82f1d30b96baa0c26b569"},
+    {file = "azure_mgmt_resource-23.2.0.tar.gz", hash = "sha256:747b750df7af23ab30e53d3f36247ab0c16de1e267d666b1a5077c39a4292529"},
 ]
 
 [package.dependencies]
@@ -629,21 +627,20 @@ msrest = ">=0.6.21"
 
 [[package]]
 name = "azure-mgmt-storage"
-version = "22.1.1"
+version = "21.2.1"
 description = "Microsoft Azure Storage Management Client Library for Python"
 optional = false
 python-versions = ">=3.8"
 groups = ["main"]
 files = [
-    {file = "azure_mgmt_storage-22.1.1-py3-none-any.whl", hash = "sha256:a4a4064918dcfa4f1cbebada5bf064935d66f2a3647a2f46a1f1c9348736f5d9"},
-    {file = "azure_mgmt_storage-22.1.1.tar.gz", hash = "sha256:25aaa5ae8c40c30e2f91f8aae6f52906b0557e947d5c1b9817d4ff9decc11340"},
+    {file = "azure-mgmt-storage-21.2.1.tar.gz", hash = "sha256:503a7ff9c31254092b0656445f5728bfdfda2d09d46a82e97019eaa9a1ecec64"},
+    {file = "azure_mgmt_storage-21.2.1-py3-none-any.whl", hash = "sha256:f97df1fa39cde9dbacf2cd96c9cba1fc196932185e24853e276f74b18a0bd031"},
 ]
 
 [package.dependencies]
 azure-common = ">=1.1"
 azure-mgmt-core = ">=1.3.2"
 isodate = ">=0.6.1"
 typing-extensions = ">=4.6.0"
 
 [[package]]
 name = "azure-mgmt-subscription"
@@ -2094,14 +2091,14 @@ grpcio-gcp = ["grpcio-gcp (>=0.2.2,<1.0.dev0)"]
 
 [[package]]
 name = "google-api-python-client"
-version = "2.163.0"
+version = "2.161.0"
 description = "Google API Client Library for Python"
 optional = false
 python-versions = ">=3.7"
 groups = ["main"]
 files = [
-    {file = "google_api_python_client-2.163.0-py2.py3-none-any.whl", hash = "sha256:080e8bc0669cb4c1fb8efb8da2f5b91a2625d8f0e7796cfad978f33f7016c6c4"},
-    {file = "google_api_python_client-2.163.0.tar.gz", hash = "sha256:88dee87553a2d82176e2224648bf89272d536c8f04dcdda37ef0a71473886dd7"},
+    {file = "google_api_python_client-2.161.0-py2.py3-none-any.whl", hash = "sha256:9476a5a4f200bae368140453df40f9cda36be53fa7d0e9a9aac4cdb859a26448"},
+    {file = "google_api_python_client-2.161.0.tar.gz", hash = "sha256:324c0cce73e9ea0a0d2afd5937e01b7c2d6a4d7e2579cdb6c384f9699d6c9f37"},
 ]
 
 [package.dependencies]
@@ -2437,14 +2434,14 @@ files = [
 
 [[package]]
 name = "jinja2"
-version = "3.1.5"
+version = "3.1.6"
 description = "A very fast and expressive template engine."
 optional = false
 python-versions = ">=3.7"
 groups = ["main", "dev"]
 files = [
-    {file = "jinja2-3.1.5-py3-none-any.whl", hash = "sha256:aba0f4dc9ed8013c424088f68a5c226f7d6097ed89b246d7749c2ec4175c6adb"},
-    {file = "jinja2-3.1.5.tar.gz", hash = "sha256:8fefff8dc3034e27bb80d67c671eb8a9bc424c0ef4c0826edbff304cceff43bb"},
+    {file = "jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67"},
+    {file = "jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d"},
 ]
 
 [package.dependencies]
@@ -3577,7 +3574,7 @@ files = [
 
 [[package]]
 name = "prowler"
-version = "5.5.0"
+version = "5.4.0"
 description = "Prowler is an Open Source security tool to perform AWS, GCP and Azure security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness. It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, FedRAMP, PCI-DSS, GDPR, HIPAA, FFIEC, SOC2, GXP, AWS Well-Architected Framework Security Pillar, AWS Foundational Technical Review (FTR), ENS (Spanish National Security Scheme) and your custom security frameworks."
 optional = false
 python-versions = ">3.9.1,<3.13"
@@ -3588,23 +3585,23 @@ develop = false
 [package.dependencies]
 alive-progress = "3.2.0"
 awsipranges = "0.3.3"
-azure-identity = "1.21.0"
+azure-identity = "1.19.0"
 azure-keyvault-keys = "4.10.0"
-azure-mgmt-applicationinsights = "4.1.0"
+azure-mgmt-applicationinsights = "4.0.0"
 azure-mgmt-authorization = "4.0.0"
 azure-mgmt-compute = "34.0.0"
-azure-mgmt-containerregistry = "12.0.0"
-azure-mgmt-containerservice = "34.1.0"
+azure-mgmt-containerregistry = "10.3.0"
+azure-mgmt-containerservice = "34.0.0"
 azure-mgmt-cosmosdb = "9.7.0"
 azure-mgmt-keyvault = "10.3.1"
 azure-mgmt-monitor = "6.0.2"
 azure-mgmt-network = "28.1.0"
 azure-mgmt-rdbms = "10.1.0"
-azure-mgmt-resource = "23.3.0"
+azure-mgmt-resource = "23.2.0"
 azure-mgmt-search = "9.1.0"
 azure-mgmt-security = "7.0.0"
 azure-mgmt-sql = "3.0.1"
-azure-mgmt-storage = "22.1.1"
+azure-mgmt-storage = "21.2.1"
 azure-mgmt-subscription = "3.1.1"
 azure-mgmt-web = "8.0.0"
 azure-storage-blob = "12.24.1"
@@ -3615,7 +3612,7 @@ cryptography = "44.0.1"
 dash = "2.18.2"
 dash-bootstrap-components = "1.6.0"
 detect-secrets = "1.5.0"
-google-api-python-client = "2.163.0"
+google-api-python-client = "2.161.0"
 google-auth-httplib2 = ">=0.1,<0.3"
 jsonschema = "4.23.0"
 kubernetes = "32.0.1"
@@ -3625,19 +3622,19 @@ numpy = "2.0.2"
 pandas = "2.2.3"
 py-ocsf-models = "0.3.1"
 pydantic = "1.10.21"
-python-dateutil = ">=2.9.0.post0,<3.0.0"
+python-dateutil = "^2.9.0.post0"
 pytz = "2025.1"
 schema = "0.7.7"
 shodan = "1.31.0"
 slack-sdk = "3.34.0"
 tabulate = "0.9.0"
-tzlocal = "5.3.1"
+tzlocal = "5.2"
 
 [package.source]
 type = "git"
 url = "https://github.com/prowler-cloud/prowler.git"
-reference = "v5.5"
-resolved_reference = "c1b70ed0fc6bd586c8c69a121411f1d710d94845"
+reference = "master"
+resolved_reference = "931df361bf5ee287139112f485580241099094cb"
 
 [[package]]
 name = "psutil"
@@ -4608,7 +4605,6 @@ files = [
     {file = "ruamel.yaml.clib-0.2.12-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f66efbc1caa63c088dead1c4170d148eabc9b80d95fb75b6c92ac0aad2437d76"},
     {file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:22353049ba4181685023b25b5b51a574bce33e7f51c759371a7422dcae5402a6"},
     {file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:932205970b9f9991b34f55136be327501903f7c66830e9760a8ffb15b07f05cd"},
-    {file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:a52d48f4e7bf9005e8f0a89209bf9a73f7190ddf0489eee5eb51377385f59f2a"},
     {file = "ruamel.yaml.clib-0.2.12-cp310-cp310-win32.whl", hash = "sha256:3eac5a91891ceb88138c113f9db04f3cebdae277f5d44eaa3651a4f573e6a5da"},
     {file = "ruamel.yaml.clib-0.2.12-cp310-cp310-win_amd64.whl", hash = "sha256:ab007f2f5a87bd08ab1499bdf96f3d5c6ad4dcfa364884cb4549aa0154b13a28"},
     {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-macosx_13_0_arm64.whl", hash = "sha256:4a6679521a58256a90b0d89e03992c15144c5f3858f40d7c18886023d7943db6"},
@@ -4617,7 +4613,6 @@ files = [
     {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:811ea1594b8a0fb466172c384267a4e5e367298af6b228931f273b111f17ef52"},
     {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:cf12567a7b565cbf65d438dec6cfbe2917d3c1bdddfce84a9930b7d35ea59642"},
     {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:7dd5adc8b930b12c8fc5b99e2d535a09889941aa0d0bd06f4749e9a9397c71d2"},
-    {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1492a6051dab8d912fc2adeef0e8c72216b24d57bd896ea607cb90bb0c4981d3"},
     {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-win32.whl", hash = "sha256:bd0a08f0bab19093c54e18a14a10b4322e1eacc5217056f3c063bd2f59853ce4"},
     {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-win_amd64.whl", hash = "sha256:a274fb2cb086c7a3dea4322ec27f4cb5cc4b6298adb583ab0e211a4682f241eb"},
     {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:20b0f8dc160ba83b6dcc0e256846e1a02d044e13f7ea74a3d1d56ede4e48c632"},
@@ -4626,7 +4621,6 @@ files = [
     {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:749c16fcc4a2b09f28843cda5a193e0283e47454b63ec4b81eaa2242f50e4ccd"},
     {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:bf165fef1f223beae7333275156ab2022cffe255dcc51c27f066b4370da81e31"},
     {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:32621c177bbf782ca5a18ba4d7af0f1082a3f6e517ac2a18b3974d4edf349680"},
-    {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:b82a7c94a498853aa0b272fd5bc67f29008da798d4f93a2f9f289feb8426a58d"},
     {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-win32.whl", hash = "sha256:e8c4ebfcfd57177b572e2040777b8abc537cdef58a2120e830124946aa9b42c5"},
     {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-win_amd64.whl", hash = "sha256:0467c5965282c62203273b838ae77c0d29d7638c8a4e3a1c8bdd3602c10904e4"},
     {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:4c8c5d82f50bb53986a5e02d1b3092b03622c02c2eb78e29bec33fd9593bae1a"},
@@ -4635,7 +4629,6 @@ files = [
     {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:96777d473c05ee3e5e3c3e999f5d23c6f4ec5b0c38c098b3a5229085f74236c6"},
     {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_1_i686.whl", hash = "sha256:3bc2a80e6420ca8b7d3590791e2dfc709c88ab9152c00eeb511c9875ce5778bf"},
     {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:e188d2699864c11c36cdfdada94d781fd5d6b0071cd9c427bceb08ad3d7c70e1"},
-    {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4f6f3eac23941b32afccc23081e1f50612bdbe4e982012ef4f5797986828cd01"},
     {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-win32.whl", hash = "sha256:6442cb36270b3afb1b4951f060eccca1ce49f3d087ca1ca4563a6eb479cb3de6"},
     {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-win_amd64.whl", hash = "sha256:e5b8daf27af0b90da7bb903a876477a9e6d7270be6146906b276605997c7e9a3"},
     {file = "ruamel.yaml.clib-0.2.12-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:fc4b630cd3fa2cf7fce38afa91d7cfe844a9f75d7f0f36393fa98815e911d987"},
@@ -4644,7 +4637,6 @@ files = [
     {file = "ruamel.yaml.clib-0.2.12-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e2f1c3765db32be59d18ab3953f43ab62a761327aafc1594a2a1fbe038b8b8a7"},
     {file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:d85252669dc32f98ebcd5d36768f5d4faeaeaa2d655ac0473be490ecdae3c285"},
     {file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:e143ada795c341b56de9418c58d028989093ee611aa27ffb9b7f609c00d813ed"},
-    {file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:2c59aa6170b990d8d2719323e628aaf36f3bfbc1c26279c0eeeb24d05d2d11c7"},
|
||||
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-win32.whl", hash = "sha256:beffaed67936fbbeffd10966a4eb53c402fafd3d6833770516bf7314bc6ffa12"},
|
||||
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-win_amd64.whl", hash = "sha256:040ae85536960525ea62868b642bdb0c2cc6021c9f9d507810c0c604e66f5a7b"},
|
||||
{file = "ruamel.yaml.clib-0.2.12.tar.gz", hash = "sha256:6c8fbb13ec503f99a91901ab46e0b07ae7941cd527393187039aec586fdfd36f"},
|
||||
@@ -5086,14 +5078,14 @@ markers = {dev = "sys_platform == \"win32\""}
|
||||
|
||||
[[package]]
|
||||
name = "tzlocal"
|
||||
version = "5.3.1"
|
||||
version = "5.2"
|
||||
description = "tzinfo object for the local timezone"
|
||||
optional = false
|
||||
python-versions = ">=3.9"
|
||||
python-versions = ">=3.8"
|
||||
groups = ["main"]
|
||||
files = [
|
||||
{file = "tzlocal-5.3.1-py3-none-any.whl", hash = "sha256:eb1a66c3ef5847adf7a834f1be0800581b683b5608e74f86ecbcef8ab91bb85d"},
|
||||
{file = "tzlocal-5.3.1.tar.gz", hash = "sha256:cceffc7edecefea1f595541dbd6e990cb1ea3d19bf01b2809f362a03dd7921fd"},
|
||||
{file = "tzlocal-5.2-py3-none-any.whl", hash = "sha256:49816ef2fe65ea8ac19d19aa7a1ae0551c834303d5014c6d5a62e4cbda8047b8"},
|
||||
{file = "tzlocal-5.2.tar.gz", hash = "sha256:8d399205578f1a9342816409cc1e46a93ebd5755e39ea2d85334bea911bf0e6e"},
|
||||
]
|
||||
|
||||
[package.dependencies]
|
||||
@@ -5436,4 +5428,4 @@ type = ["pytest-mypy"]
|
||||
[metadata]
|
||||
lock-version = "2.1"
|
||||
python-versions = ">=3.11,<3.13"
|
||||
content-hash = "7dae4c2a4c35a6d226e1ec9ff8fc75784d2c008810929586dca5c8f095c515a3"
|
||||
content-hash = "5892eb6b702f29a7eebff3f9dc899ea8d78cac622aa8f299ef4c662df8f5b51d"
|
||||
|
||||
@@ -8,7 +8,6 @@ dependencies = [
    "celery[pytest] (>=5.4.0,<6.0.0)",
    "dj-rest-auth[with_social,jwt] (==7.0.1)",
    "django==5.1.7",
    "django-allauth==65.4.1",
    "django-celery-beat (>=2.7.0,<3.0.0)",
    "django-celery-results (>=2.5.1,<3.0.0)",
    "django-cors-headers==4.4.0",
@@ -23,7 +22,7 @@ dependencies = [
    "drf-spectacular==0.27.2",
    "drf-spectacular-jsonapi==0.5.1",
    "gunicorn==23.0.0",
    "prowler @ git+https://github.com/prowler-cloud/prowler.git@v5.5",
    "prowler @ git+https://github.com/prowler-cloud/prowler.git@master",
    "psycopg2-binary==2.9.9",
    "pytest-celery[redis] (>=1.0.1,<2.0.0)",
    "sentry-sdk[django] (>=2.20.0,<3.0.0)",
@@ -6,7 +6,6 @@ from datetime import datetime, timedelta, timezone
from django.conf import settings
from django.contrib.auth.models import BaseUserManager
from django.db import connection, models, transaction
from django_celery_beat.models import PeriodicTask
from psycopg2 import connect as psycopg2_connect
from psycopg2.extensions import AsIs, new_type, register_adapter, register_type
from rest_framework_json_api.serializers import ValidationError
@@ -106,12 +105,11 @@ def generate_random_token(length: int = 14, symbols: str | None = None) -> str:
    return "".join(secrets.choice(symbols or _symbols) for _ in range(length))


def batch_delete(tenant_id, queryset, batch_size=settings.DJANGO_DELETION_BATCH_SIZE):
def batch_delete(queryset, batch_size=5000):
    """
    Deletes objects in batches and returns the total number of deletions and a summary.

    Args:
        tenant_id (str): Tenant ID the queryset belongs to.
        queryset (QuerySet): The queryset of objects to delete.
        batch_size (int): The number of objects to delete in each batch.

@@ -122,16 +120,15 @@ def batch_delete(tenant_id, queryset, batch_size=settings.DJANGO_DELETION_BATCH_
    deletion_summary = {}

    while True:
        with rls_transaction(tenant_id, POSTGRES_TENANT_VAR):
            # Get a batch of IDs to delete
            batch_ids = set(
                queryset.values_list("id", flat=True).order_by("id")[:batch_size]
            )
            if not batch_ids:
                # No more objects to delete
                break
        # Get a batch of IDs to delete
        batch_ids = set(
            queryset.values_list("id", flat=True).order_by("id")[:batch_size]
        )
        if not batch_ids:
            # No more objects to delete
            break

        deleted_count, deleted_info = queryset.filter(id__in=batch_ids).delete()
        deleted_count, deleted_info = queryset.filter(id__in=batch_ids).delete()

        total_deleted += deleted_count
        for model_label, count in deleted_info.items():
@@ -140,18 +137,6 @@ def batch_delete(tenant_id, queryset, batch_size=settings.DJANGO_DELETION_BATCH_
    return total_deleted, deletion_summary


def delete_related_daily_task(provider_id: str):
    """
    Deletes the periodic task associated with a specific provider.

    Args:
        provider_id (str): The unique identifier for the provider
            whose related periodic task should be deleted.
    """
    task_name = f"scan-perform-scheduled-{provider_id}"
    PeriodicTask.objects.filter(name=task_name).delete()


# Postgres Enums
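The `batch_delete` change above moves the per-batch ID selection inside an RLS transaction and sources the batch size from settings. Independent of Django, the batching pattern itself (select a bounded set of IDs, delete, accumulate a per-model summary, repeat until empty) can be sketched as follows; the in-memory store and function name are illustrative, not part of the codebase:

```python
# Minimal sketch of the batched-deletion pattern used by batch_delete:
# repeatedly take up to batch_size IDs in a stable order, delete them, and
# accumulate a per-model summary until nothing is left. The dict "store"
# stands in for a Django queryset; all names here are hypothetical.

def batch_delete_sketch(store: dict[int, str], batch_size: int = 2):
    total_deleted = 0
    summary: dict[str, int] = {}
    while True:
        # Take a bounded batch of IDs, ordered for determinism.
        batch_ids = sorted(store.keys())[:batch_size]
        if not batch_ids:
            # No more objects to delete
            break
        for obj_id in batch_ids:
            label = store.pop(obj_id)
            total_deleted += 1
            summary[label] = summary.get(label, 0) + 1
    return total_deleted, summary
```

Deleting five objects with `batch_size=2` runs three batches; bounding each batch keeps individual delete statements (and their transactions) small, which is the point of the real implementation as well.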
@@ -287,9 +287,6 @@ class FindingFilter(FilterSet):
    status = ChoiceFilter(choices=StatusChoices.choices)
    severity = ChoiceFilter(choices=SeverityChoices)
    impact = ChoiceFilter(choices=SeverityChoices)
    muted = BooleanFilter(
        help_text="If this filter is not provided, muted and non-muted findings will be returned."
    )

    resources = UUIDInFilter(field_name="resource__id", lookup_expr="in")

@@ -617,6 +614,12 @@ class ScanSummaryFilter(FilterSet):
        field_name="scan__provider__provider", choices=Provider.ProviderChoices.choices
    )
    region = CharFilter(field_name="region")
    muted_findings = BooleanFilter(method="filter_muted_findings")

    def filter_muted_findings(self, queryset, name, value):
        if not value:
            return queryset.exclude(muted__gt=0)
        return queryset

    class Meta:
        model = ScanSummary
@@ -627,6 +630,8 @@ class ScanSummaryFilter(FilterSet):


class ServiceOverviewFilter(ScanSummaryFilter):
    muted_findings = None

    def is_valid(self):
        # Check if at least one of the inserted_at filters is present
        inserted_at_filters = [
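The `filter_muted_findings` method above passes the queryset through when the filter is true and otherwise excludes summaries with any muted findings (`muted__gt=0`). A framework-free sketch of that semantics, with invented row dicts standing in for `ScanSummary` rows:

```python
# Sketch of the filter_muted_findings semantics above: when the filter value
# is falsy, rows with any muted findings (muted > 0) are excluded; otherwise
# the input is returned unchanged. Row dicts are illustrative only.

def filter_muted_findings(rows: list[dict], value: bool) -> list[dict]:
    if not value:
        # Mirrors queryset.exclude(muted__gt=0)
        return [r for r in rows if not r["muted"] > 0]
    return rows

summaries = [
    {"id": 1, "muted": 0},  # no muted findings: always kept
    {"id": 2, "muted": 4},  # has muted findings: dropped when value is False
]
```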
@@ -1,26 +0,0 @@
# Generated by Django 5.1.5 on 2025-03-25 11:29

from django.db import migrations, models

import api.db_utils


class Migration(migrations.Migration):
    dependencies = [
        ("api", "0014_integrations"),
    ]

    operations = [
        migrations.AddField(
            model_name="finding",
            name="muted",
            field=models.BooleanField(default=False),
        ),
        migrations.AlterField(
            model_name="finding",
            name="status",
            field=api.db_utils.StatusEnumField(
                choices=[("FAIL", "Fail"), ("PASS", "Pass"), ("MANUAL", "Manual")]
            ),
        ),
    ]
@@ -1,32 +0,0 @@
# Generated by Django 5.1.5 on 2025-03-31 10:46

from django.db import migrations, models


class Migration(migrations.Migration):
    dependencies = [
        ("api", "0015_finding_muted"),
    ]

    operations = [
        migrations.AddField(
            model_name="finding",
            name="compliance",
            field=models.JSONField(blank=True, default=dict, null=True),
        ),
        migrations.AddField(
            model_name="resource",
            name="details",
            field=models.TextField(blank=True, null=True),
        ),
        migrations.AddField(
            model_name="resource",
            name="metadata",
            field=models.TextField(blank=True, null=True),
        ),
        migrations.AddField(
            model_name="resource",
            name="partition",
            field=models.TextField(blank=True, null=True),
        ),
    ]
@@ -59,6 +59,7 @@ class StatusChoices(models.TextChoices):
    FAIL = "FAIL", _("Fail")
    PASS = "PASS", _("Pass")
    MANUAL = "MANUAL", _("Manual")
    MUTED = "MUTED", _("Muted")


class StateChoices(models.TextChoices):
@@ -518,11 +519,6 @@ class Resource(RowLevelSecurityProtectedModel):
        editable=False,
    )

    metadata = models.TextField(blank=True, null=True)
    details = models.TextField(blank=True, null=True)
    partition = models.TextField(blank=True, null=True)

    # Relationships
    tags = models.ManyToManyField(
        ResourceTag,
        verbose_name="Tags associated with the resource, by provider",
@@ -660,8 +656,6 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
    tags = models.JSONField(default=dict, null=True, blank=True)
    check_id = models.CharField(max_length=100, blank=False, null=False)
    check_metadata = models.JSONField(default=dict, null=False)
    muted = models.BooleanField(default=False, null=False)
    compliance = models.JSONField(default=dict, null=True, blank=True)

    # Relationships
    scan = models.ForeignKey(to=Scan, related_name="findings", on_delete=models.CASCADE)
@@ -1,12 +1,12 @@
from celery import states
from celery.signals import before_task_publish
from config.celery import celery_app
from django.db.models.signals import post_delete
from django.dispatch import receiver
from django_celery_beat.models import PeriodicTask
from django_celery_results.backends.database import DatabaseBackend

from api.db_utils import delete_related_daily_task
from api.models import Provider
from config.celery import celery_app


def create_task_result_on_publish(sender=None, headers=None, **kwargs):  # noqa: F841
@@ -31,4 +31,5 @@ before_task_publish.connect(
@receiver(post_delete, sender=Provider)
def delete_provider_scan_task(sender, instance, **kwargs):  # noqa: F841
    # Delete the associated periodic task when the provider is deleted
    delete_related_daily_task(instance.id)
    task_name = f"scan-perform-scheduled-{instance.id}"
    PeriodicTask.objects.filter(name=task_name).delete()
@@ -233,42 +233,6 @@ paths:
              schema:
                $ref: '#/components/schemas/ComplianceOverviewFullResponse'
          description: ''
  /api/v1/compliance-overviews/metadata:
    get:
      operationId: compliance_overviews_metadata_retrieve
      description: Fetch unique metadata values from a set of compliance overviews.
        This is useful for dynamic filtering.
      summary: Retrieve metadata values from compliance overviews
      parameters:
      - in: query
        name: fields[compliance-overviews-metadata]
        schema:
          type: array
          items:
            type: string
            enum:
            - regions
        description: endpoint return only specific fields in the response on a per-type
          basis by including a fields[TYPE] query parameter.
        explode: false
      - in: query
        name: filter[scan_id]
        schema:
          type: string
          format: uuid
        description: Related scan ID.
        required: true
      tags:
      - Compliance Overview
      security:
      - jwtAuth: []
      responses:
        '200':
          content:
            application/vnd.api+json:
              schema:
                $ref: '#/components/schemas/ComplianceOverviewMetadataResponse'
          description: ''
  /api/v1/findings:
    get:
      operationId: findings_list
@@ -294,7 +258,6 @@ paths:
            - inserted_at
            - updated_at
            - first_seen_at
            - muted
            - url
            - scan
            - resources
@@ -403,12 +366,6 @@ paths:
          type: string
          format: date
        description: Maximum date range is 7 days.
      - in: query
        name: filter[muted]
        schema:
          type: boolean
        description: If this filter is not provided, muted and non-muted findings
          will be returned.
      - in: query
        name: filter[provider]
        schema:
@@ -640,11 +597,13 @@ paths:
          enum:
          - FAIL
          - MANUAL
          - MUTED
          - PASS
        description: |-
          * `FAIL` - Fail
          * `PASS` - Pass
          * `MANUAL` - Manual
          * `MUTED` - Muted
      - in: query
        name: filter[status__in]
        schema:
@@ -761,7 +720,6 @@ paths:
            - inserted_at
            - updated_at
            - first_seen_at
            - muted
            - url
            - scan
            - resources
@@ -915,12 +873,6 @@ paths:
          type: string
          format: date
        description: Maximum date range is 7 days.
      - in: query
        name: filter[muted]
        schema:
          type: boolean
        description: If this filter is not provided, muted and non-muted findings
          will be returned.
      - in: query
        name: filter[provider]
        schema:
@@ -1152,11 +1104,13 @@ paths:
          enum:
          - FAIL
          - MANUAL
          - MUTED
          - PASS
        description: |-
          * `FAIL` - Fail
          * `PASS` - Pass
          * `MANUAL` - Manual
          * `MUTED` - Muted
      - in: query
        name: filter[status__in]
        schema:
@@ -1348,12 +1302,6 @@ paths:
          type: string
          format: date
        description: Maximum date range is 7 days.
      - in: query
        name: filter[muted]
        schema:
          type: boolean
        description: If this filter is not provided, muted and non-muted findings
          will be returned.
      - in: query
        name: filter[provider]
        schema:
@@ -1585,11 +1533,13 @@ paths:
          enum:
          - FAIL
          - MANUAL
          - MUTED
          - PASS
        description: |-
          * `FAIL` - Fail
          * `PASS` - Pass
          * `MANUAL` - Manual
          * `MUTED` - Muted
      - in: query
        name: filter[status__in]
        schema:
@@ -2033,6 +1983,10 @@ paths:
        schema:
          type: string
          format: date-time
      - in: query
        name: filter[muted_findings]
        schema:
          type: boolean
      - in: query
        name: filter[provider_id]
        schema:
@@ -2190,6 +2144,10 @@ paths:
        schema:
          type: string
          format: date-time
      - in: query
        name: filter[muted_findings]
        schema:
          type: boolean
      - in: query
        name: filter[provider_id]
        schema:
@@ -3716,7 +3674,6 @@ paths:
            - name
            - manage_users
            - manage_account
            - manage_integrations
            - manage_providers
            - manage_scans
            - permission_state
@@ -3835,8 +3792,6 @@ paths:
            - -manage_users
            - manage_account
            - -manage_account
            - manage_integrations
            - -manage_integrations
            - manage_providers
            - -manage_providers
            - manage_scans
@@ -3912,7 +3867,6 @@ paths:
            - name
            - manage_users
            - manage_account
            - manage_integrations
            - manage_providers
            - manage_scans
            - permission_state
@@ -4282,17 +4236,6 @@ paths:
        description: |-
          * `scheduled` - Scheduled
          * `manual` - Manual
      - in: query
        name: include
        schema:
          type: array
          items:
            type: string
            enum:
            - provider
        description: include query parameter to allow the client to customize which
          related resources should be returned.
        explode: false
      - name: page[number]
        required: false
        in: query
@@ -4407,17 +4350,6 @@ paths:
          format: uuid
        description: A UUID string identifying this scan.
        required: true
      - in: query
        name: include
        schema:
          type: array
          items:
            type: string
            enum:
            - provider
        description: include query parameter to allow the client to customize which
          related resources should be returned.
        explode: false
      tags:
      - Scan
      security:
@@ -6173,40 +6105,6 @@ components:
          $ref: '#/components/schemas/ComplianceOverviewFull'
      required:
      - data
    ComplianceOverviewMetadata:
      type: object
      required:
      - type
      - id
      additionalProperties: false
      properties:
        type:
          allOf:
          - $ref: '#/components/schemas/ComplianceOverviewMetadataTypeEnum'
          description: The [type](https://jsonapi.org/format/#document-resource-object-identification)
            member is used to describe resource objects that share common attributes
            and relationships.
        id: {}
        attributes:
          type: object
          properties:
            regions:
              type: array
              items:
                type: string
          required:
          - regions
    ComplianceOverviewMetadataResponse:
      type: object
      properties:
        data:
          $ref: '#/components/schemas/ComplianceOverviewMetadata'
      required:
      - data
    ComplianceOverviewMetadataTypeEnum:
      type: string
      enum:
      - compliance-overviews-metadata
    Finding:
      type: object
      required:
@@ -6244,11 +6142,13 @@ components:
          - FAIL
          - PASS
          - MANUAL
          - MUTED
          type: string
          description: |-
            * `FAIL` - Fail
            * `PASS` - Pass
            * `MANUAL` - Manual
            * `MUTED` - Muted
        status_extended:
          type: string
          nullable: true
@@ -6284,8 +6184,6 @@ components:
          format: date-time
          readOnly: true
          nullable: true
        muted:
          type: boolean
      required:
      - uid
      - status
@@ -8500,8 +8398,6 @@ components:
          type: boolean
        manage_account:
          type: boolean
        manage_integrations:
          type: boolean
        manage_providers:
          type: boolean
        manage_scans:
@@ -10154,8 +10050,6 @@ components:
          type: boolean
        manage_account:
          type: boolean
        manage_integrations:
          type: boolean
        manage_providers:
          type: boolean
        manage_scans:
@@ -10285,8 +10179,6 @@ components:
          type: boolean
        manage_account:
          type: boolean
        manage_integrations:
          type: boolean
        manage_providers:
          type: boolean
        manage_scans:
@@ -10421,8 +10313,6 @@ components:
          type: boolean
        manage_account:
          type: boolean
        manage_integrations:
          type: boolean
        manage_providers:
          type: boolean
        manage_scans:
@@ -131,10 +131,9 @@ class TestBatchDelete:
        return provider_count

    @pytest.mark.django_db
    def test_batch_delete(self, tenants_fixture, create_test_providers):
        tenant_id = str(tenants_fixture[0].id)
    def test_batch_delete(self, create_test_providers):
        _, summary = batch_delete(
            tenant_id, Provider.objects.all(), batch_size=create_test_providers // 2
            Provider.objects.all(), batch_size=create_test_providers // 2
        )
        assert Provider.objects.all().count() == 0
        assert summary == {"api.Provider": create_test_providers}

@@ -14,7 +14,6 @@ from django.urls import reverse
from rest_framework import status

from api.models import (
    ComplianceOverview,
    Integration,
    Invitation,
    Membership,
@@ -2693,8 +2692,6 @@ class TestFindingViewSet:
                # ("resource_tags", "key:value", 2),
                # ("resource_tags", "not:exists", 0),
                # ("resource_tags", "not:exists,key:value", 2),
                ("muted", True, 1),
                ("muted", False, 1),
            ]
        ),
    )
@@ -4511,33 +4508,6 @@ class TestComplianceOverviewViewSet:
        assert len(response.json()["data"]) == 1
        assert response.json()["data"][0]["id"] == str(compliance_overview1.id)

    def test_compliance_overview_metadata(
        self, authenticated_client, compliance_overviews_fixture
    ):
        response = authenticated_client.get(
            reverse("complianceoverview-metadata"),
            {"filter[scan_id]": str(compliance_overviews_fixture[0].scan_id)},
        )
        data = response.json()

        expected_regions = set(
            ComplianceOverview.objects.all()
            .values_list("region", flat=True)
            .distinct("region")
        )

        assert response.status_code == status.HTTP_200_OK
        assert data["data"]["type"] == "compliance-overviews-metadata"
        assert data["data"]["id"] is None
        assert set(data["data"]["attributes"]["regions"]) == expected_regions

    def test_compliance_overview_metadata_missing_scan_id(self, authenticated_client):
        # Attempt to list compliance overviews without providing filter[scan_id]
        response = authenticated_client.get(reverse("complianceoverview-metadata"))
        assert response.status_code == status.HTTP_400_BAD_REQUEST
        assert response.json()["errors"][0]["source"]["pointer"] == "filter[scan_id]"
        assert response.json()["errors"][0]["code"] == "required"


@pytest.mark.django_db
class TestOverviewViewSet:
@@ -851,10 +851,6 @@ class ScanSerializer(RLSSerializer):
            "url",
        ]

    included_serializers = {
        "provider": "api.v1.serializers.ProviderIncludeSerializer",
    }


class ScanIncludeSerializer(RLSSerializer):
    trigger = serializers.ChoiceField(
@@ -1091,7 +1087,6 @@ class FindingSerializer(RLSSerializer):
            "inserted_at",
            "updated_at",
            "first_seen_at",
            "muted",
            "url",
            # Relationships
            "scan",
@@ -1902,13 +1897,6 @@ class ComplianceOverviewFullSerializer(ComplianceOverviewSerializer):
        return obj.requirements


class ComplianceOverviewMetadataSerializer(serializers.Serializer):
    regions = serializers.ListField(child=serializers.CharField(), allow_empty=True)

    class Meta:
        resource_name = "compliance-overviews-metadata"


# Overviews
@@ -22,7 +22,6 @@ from django.http import HttpResponse
from django.urls import reverse
from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_control
from django_celery_beat.models import PeriodicTask
from drf_spectacular.settings import spectacular_settings
from drf_spectacular.utils import (
    OpenApiParameter,
@@ -102,7 +101,6 @@ from api.rls import Tenant
from api.utils import CustomOAuth2Client, validate_invitation
from api.v1.serializers import (
    ComplianceOverviewFullSerializer,
    ComplianceOverviewMetadataSerializer,
    ComplianceOverviewSerializer,
    FindingDynamicFilterSerializer,
    FindingMetadataSerializer,
@@ -1089,8 +1087,6 @@ class ProviderViewSet(BaseRLSViewSet):
        provider = get_object_or_404(Provider, pk=pk)
        provider.is_deleted = True
        provider.save()
        task_name = f"scan-perform-scheduled-{pk}"
        PeriodicTask.objects.filter(name=task_name).update(enabled=False)

        with transaction.atomic():
            task = delete_provider_task.delay(
@@ -2058,21 +2054,6 @@ class RoleProviderGroupRelationshipView(RelationshipView, BaseRLSViewSet):
        description="Fetch detailed information about a specific compliance overview by its ID, including detailed "
        "requirement information and check's status.",
    ),
    metadata=extend_schema(
        tags=["Compliance Overview"],
        summary="Retrieve metadata values from compliance overviews",
        description="Fetch unique metadata values from a set of compliance overviews. This is useful for dynamic "
        "filtering.",
        parameters=[
            OpenApiParameter(
                name="filter[scan_id]",
                required=True,
                type=OpenApiTypes.UUID,
                location=OpenApiParameter.QUERY,
                description="Related scan ID.",
            ),
        ],
    ),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
@@ -2134,8 +2115,6 @@ class ComplianceOverviewViewSet(BaseRLSViewSet):
    def get_serializer_class(self):
        if self.action == "retrieve":
            return ComplianceOverviewFullSerializer
        elif self.action == "metadata":
            return ComplianceOverviewMetadataSerializer
        return super().get_serializer_class()

    def list(self, request, *args, **kwargs):
@@ -2152,35 +2131,6 @@ class ComplianceOverviewViewSet(BaseRLSViewSet):
            )
        return super().list(request, *args, **kwargs)

    @action(detail=False, methods=["get"], url_name="metadata")
    def metadata(self, request):
        scan_id = request.query_params.get("filter[scan_id]")
        if not scan_id:
            raise ValidationError(
                [
                    {
                        "detail": "This query parameter is required.",
                        "status": 400,
                        "source": {"pointer": "filter[scan_id]"},
                        "code": "required",
                    }
                ]
            )

        tenant_id = self.request.tenant_id

        regions = list(
            ComplianceOverview.objects.filter(tenant_id=tenant_id, scan_id=scan_id)
            .values_list("region", flat=True)
            .order_by("region")
            .distinct()
        )
        result = {"regions": regions}

        serializer = self.get_serializer(data=result)
        serializer.is_valid(raise_exception=True)
        return Response(serializer.data, status=status.HTTP_200_OK)


@extend_schema(tags=["Overview"])
@extend_schema_view(
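The `metadata` action shown in this hunk boils down to one aggregation: for a given scan, collect the distinct regions of its compliance overviews, sorted. A framework-free sketch of that aggregation, with invented sample rows standing in for `ComplianceOverview` records:

```python
# Sketch of the regions aggregation performed by the metadata action above:
# filter rows by scan, collect distinct regions, return them sorted. The row
# format and sample scan IDs are illustrative only.

def distinct_regions(rows: list[dict], scan_id: str) -> list[str]:
    # A set deduplicates (mirroring .distinct()); sorted() mirrors .order_by().
    return sorted({r["region"] for r in rows if r["scan_id"] == scan_id})

rows = [
    {"scan_id": "s1", "region": "eu-west-1"},
    {"scan_id": "s1", "region": "us-east-1"},
    {"scan_id": "s1", "region": "eu-west-1"},  # duplicate region, collapsed
    {"scan_id": "s2", "region": "eu-central-1"},
]
```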
@@ -50,9 +50,9 @@ class RLSTask(Task):

        tenant_id = kwargs.get("tenant_id")
        with rls_transaction(tenant_id):
            APITask.objects.update_or_create(
            APITask.objects.create(
                id=task_result_instance.task_id,
                tenant_id=tenant_id,
                defaults={"task_runner_task": task_result_instance},
                task_runner_task=task_result_instance,
            )
        return result

@@ -237,7 +236,6 @@ DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY = env.str(
DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN = env.str("DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN", "")
DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION = env.str("DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION", "")

DJANGO_DELETION_BATCH_SIZE = env.int("DJANGO_DELETION_BATCH_SIZE", 5000)
# HTTP Security Headers
SECURE_CONTENT_TYPE_NOSNIFF = True
X_FRAME_OPTIONS = "DENY"

@@ -12,8 +12,6 @@ IGNORED_EXCEPTIONS = [
    "UnauthorizedOperation",
    "AuthFailure",
    "InvalidClientTokenId",
    "AWSInvalidProviderIdError",
    "InternalServerErrorException",
    "AccessDenied",
    "No Shodan API Key",  # Shodan Check
    "RequestLimitExceeded",  # For now we don't want to log the RequestLimitExceeded errors
@@ -35,10 +33,6 @@ IGNORED_EXCEPTIONS = [
    "ValidationException",
    "AWSSecretAccessKeyInvalidError",
    "InvalidAction",
    "InvalidRequestException",
    "RequestExpired",
    "ConnectionClosedError",
    "MaxRetryError",
    "Pool is closed",  # The following comes from urllib3: eu-west-1 -- HTTPClientError[126]: An HTTP Client raised an unhandled exception: AWSHTTPSConnectionPool(host='hostname.s3.eu-west-1.amazonaws.com', port=443): Pool is closed.
    # Authentication Errors from GCP
    "ClientAuthenticationError",
@@ -47,8 +41,6 @@ IGNORED_EXCEPTIONS = [
    "Permission denied to get service",
    "API has not been used in project",
    "HttpError 404 when requesting",
    "HttpError 403 when requesting",
    "HttpError 400 when requesting",
    "GCPNoAccesibleProjectsError",
    # Authentication Errors from Azure
    "ClientAuthenticationError",
@@ -57,14 +49,11 @@ IGNORED_EXCEPTIONS = [
    "AzureNotValidClientIdError",
    "AzureNotValidClientSecretError",
    "AzureNotValidTenantIdError",
    "AzureInvalidProviderIdError",
    "AzureTenantIdAndClientSecretNotBelongingToClientIdError",
    "AzureTenantIdAndClientIdNotBelongingToClientSecretError",
    "AzureClientIdAndClientSecretNotBelongingToTenantIdError",
    "AzureHTTPResponseError",
    "Error with credentials provided",
    # AWS Service is not available in a region
    "EndpointConnectionError",
]
@@ -655,7 +655,6 @@ def findings_fixture(scans_fixture, resources_fixture):
|
||||
"Description": "test description orange juice",
|
||||
},
|
||||
first_seen_at="2024-01-02T00:00:00Z",
|
||||
muted=True,
|
||||
)
|
||||
|
||||
finding2.add_resources([resource2])
|
||||
|
||||
@@ -1,5 +1,5 @@
 from celery.utils.log import get_task_logger
-from django.db import DatabaseError
+from django.db import DatabaseError, transaction

 from api.db_router import MainRouter
 from api.db_utils import batch_delete, rls_transaction
@@ -8,12 +8,11 @@ from api.models import Finding, Provider, Resource, Scan, ScanSummary, Tenant
 logger = get_task_logger(__name__)


-def delete_provider(tenant_id: str, pk: str):
+def delete_provider(pk: str):
     """
     Gracefully deletes an instance of a provider along with its related data.

     Args:
-        tenant_id (str): Tenant ID the resources belong to.
         pk (str): The primary key of the Provider instance to delete.

     Returns:
@@ -22,32 +21,37 @@ def delete_provider(tenant_id: str, pk: str):

     Raises:
         Provider.DoesNotExist: If no instance with the provided primary key exists.
         DatabaseError: If any deletion step fails.
     """
-    with rls_transaction(tenant_id):
-        instance = Provider.all_objects.get(pk=pk)
-        deletion_summary = {}
-        deletion_steps = [
-            ("Scan Summaries", ScanSummary.all_objects.filter(scan__provider=instance)),
-            ("Findings", Finding.all_objects.filter(scan__provider=instance)),
-            ("Resources", Resource.all_objects.filter(provider=instance)),
-            ("Scans", Scan.all_objects.filter(provider=instance)),
-        ]
+    instance = Provider.all_objects.get(pk=pk)
+
+    deletion_summary = {}
+
+    deletion_steps = [
+        ("Scan Summaries", ScanSummary.all_objects.filter(scan__provider=instance)),
+        ("Findings", Finding.all_objects.filter(scan__provider=instance)),
+        ("Resources", Resource.all_objects.filter(provider=instance)),
+        ("Scans", Scan.all_objects.filter(provider=instance)),
+    ]

     for step_name, queryset in deletion_steps:
         try:
-            _, step_summary = batch_delete(tenant_id, queryset)
-            deletion_summary.update(step_summary)
-        except DatabaseError as db_error:
-            logger.error(f"Error deleting {step_name}: {db_error}")
+            with transaction.atomic():
+                _, step_summary = batch_delete(queryset)
+                deletion_summary.update(step_summary)
+        except DatabaseError as error:
+            logger.error(f"Error deleting {step_name}: {error}")
             raise

     # Delete the provider itself
     try:
-        with rls_transaction(tenant_id):
+        with transaction.atomic():
             _, provider_summary = instance.delete()
-            deletion_summary.update(provider_summary)
-    except DatabaseError as db_error:
-        logger.error(f"Error deleting Provider: {db_error}")
+        deletion_summary.update(provider_summary)
+    except DatabaseError as error:
+        logger.error(f"Error deleting Provider: {error}")
         raise

     return deletion_summary


@@ -65,8 +69,9 @@ def delete_tenant(pk: str):
     deletion_summary = {}

     for provider in Provider.objects.using(MainRouter.admin_db).filter(tenant_id=pk):
-        summary = delete_provider(pk, provider.id)
-        deletion_summary.update(summary)
+        with rls_transaction(pk):
+            summary = delete_provider(provider.id)
+            deletion_summary.update(summary)

     Tenant.objects.using(MainRouter.admin_db).filter(id=pk).delete()

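The deletion flow above (delete dependent rows in ordered steps, then the parent, accumulating a per-model summary) can be sketched outside Django. The `batch_delete` helper, the step names, and the dict-backed "database" below are illustrative stand-ins, not the project's real API:

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("deletion")


def batch_delete(rows):
    """Delete rows and return (count, {model_name: count}), like Django's delete()."""
    summary = {}
    for model_name, _pk in rows:
        summary[model_name] = summary.get(model_name, 0) + 1
    rows.clear()
    return sum(summary.values()), summary


def delete_provider(db):
    """Delete children first and the parent last, merging each step's summary."""
    deletion_summary = {}
    for step_name in ("ScanSummary", "Finding", "Resource", "Scan", "Provider"):
        _, step_summary = batch_delete(db[step_name])
        for model, count in step_summary.items():
            deletion_summary[model] = deletion_summary.get(model, 0) + count
    return deletion_summary


# Hypothetical rows keyed by model, standing in for querysets.
db = {
    "ScanSummary": [("ScanSummary", 1)],
    "Finding": [("Finding", 1), ("Finding", 2)],
    "Resource": [("Resource", 1)],
    "Scan": [("Scan", 1)],
    "Provider": [("Provider", 1)],
}
summary = delete_provider(db)
```

The ordering matters for the same reason it does in the task: rows referencing the provider must go before the provider row itself.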
@@ -1,4 +1,3 @@
-import json
 import time
 from copy import deepcopy
 from datetime import datetime, timezone
@@ -7,7 +6,6 @@ from celery.utils.log import get_task_logger
 from config.settings.celery import CELERY_DEADLOCK_ATTEMPTS
 from django.db import IntegrityError, OperationalError
 from django.db.models import Case, Count, IntegerField, Sum, When
-from tasks.utils import CustomEncoder

 from api.compliance import (
     PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE,
@@ -193,17 +191,6 @@ def perform_prowler_scan(
         if resource_instance.type != finding.resource_type:
             resource_instance.type = finding.resource_type
             updated_fields.append("type")
-        if resource_instance.metadata != finding.resource_metadata:
-            resource_instance.metadata = json.dumps(
-                finding.resource_metadata, cls=CustomEncoder
-            )
-            updated_fields.append("metadata")
-        if resource_instance.details != finding.resource_details:
-            resource_instance.details = finding.resource_details
-            updated_fields.append("details")
-        if resource_instance.partition != finding.partition:
-            resource_instance.partition = finding.partition
-            updated_fields.append("partition")
         if updated_fields:
             with rls_transaction(tenant_id):
                 resource_instance.save(update_fields=updated_fields)
@@ -280,8 +267,6 @@ def perform_prowler_scan(
                 check_id=finding.check_id,
                 scan=scan_instance,
                 first_seen_at=last_first_seen_at,
                 muted=finding.muted,
                 compliance=finding.compliance,
             )
             finding_instance.add_resources([resource_instance])

@@ -417,21 +402,21 @@ def aggregate_findings(tenant_id: str, scan_id: str):
     ).annotate(
         fail=Sum(
             Case(
-                When(status="FAIL", muted=False, then=1),
+                When(status="FAIL", then=1),
                 default=0,
                 output_field=IntegerField(),
             )
         ),
         _pass=Sum(
             Case(
-                When(status="PASS", muted=False, then=1),
+                When(status="PASS", then=1),
                 default=0,
                 output_field=IntegerField(),
             )
         ),
-        muted_count=Sum(
+        muted=Sum(
             Case(
-                When(muted=True, then=1),
+                When(status="MUTED", then=1),
                 default=0,
                 output_field=IntegerField(),
             )
@@ -439,63 +424,63 @@ def aggregate_findings(tenant_id: str, scan_id: str):
         total=Count("id"),
         new=Sum(
             Case(
-                When(delta="new", muted=False, then=1),
+                When(delta="new", then=1),
                 default=0,
                 output_field=IntegerField(),
             )
         ),
         changed=Sum(
             Case(
-                When(delta="changed", muted=False, then=1),
+                When(delta="changed", then=1),
                 default=0,
                 output_field=IntegerField(),
             )
         ),
         unchanged=Sum(
             Case(
-                When(delta__isnull=True, muted=False, then=1),
+                When(delta__isnull=True, then=1),
                 default=0,
                 output_field=IntegerField(),
             )
         ),
         fail_new=Sum(
             Case(
-                When(delta="new", status="FAIL", muted=False, then=1),
+                When(delta="new", status="FAIL", then=1),
                 default=0,
                 output_field=IntegerField(),
             )
         ),
         fail_changed=Sum(
             Case(
-                When(delta="changed", status="FAIL", muted=False, then=1),
+                When(delta="changed", status="FAIL", then=1),
                 default=0,
                 output_field=IntegerField(),
             )
         ),
         pass_new=Sum(
             Case(
-                When(delta="new", status="PASS", muted=False, then=1),
+                When(delta="new", status="PASS", then=1),
                 default=0,
                 output_field=IntegerField(),
             )
         ),
         pass_changed=Sum(
             Case(
-                When(delta="changed", status="PASS", muted=False, then=1),
+                When(delta="changed", status="PASS", then=1),
                 default=0,
                 output_field=IntegerField(),
             )
         ),
         muted_new=Sum(
             Case(
-                When(delta="new", muted=True, then=1),
+                When(delta="new", status="MUTED", then=1),
                 default=0,
                 output_field=IntegerField(),
             )
         ),
         muted_changed=Sum(
             Case(
-                When(delta="changed", muted=True, then=1),
+                When(delta="changed", status="MUTED", then=1),
                 default=0,
                 output_field=IntegerField(),
             )
@@ -513,7 +498,7 @@ def aggregate_findings(tenant_id: str, scan_id: str):
             region=agg["resources__region"],
             fail=agg["fail"],
             _pass=agg["_pass"],
-            muted=agg["muted_count"],
+            muted=agg["muted"],
             total=agg["total"],
             new=agg["new"],
             changed=agg["changed"],

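The `Sum(Case(When(...), default=0))` annotations above are per-group conditional counters. Under the branch's new convention (a muted finding carries `status="MUTED"` rather than a separate `muted` flag), their semantics can be checked with a plain-Python tally over hypothetical finding rows:

```python
from collections import Counter

# Hypothetical finding rows; status/delta mirror the fields used in the annotations.
findings = [
    {"status": "FAIL", "delta": "new"},
    {"status": "PASS", "delta": "changed"},
    {"status": "MUTED", "delta": "new"},
    {"status": "FAIL", "delta": None},
]

agg = Counter()
for f in findings:
    agg["total"] += 1
    # Each line mirrors one Sum(Case(When(..., then=1), default=0)) annotation.
    agg["fail"] += f["status"] == "FAIL"
    agg["_pass"] += f["status"] == "PASS"
    agg["muted"] += f["status"] == "MUTED"
    agg["new"] += f["delta"] == "new"
    agg["changed"] += f["delta"] == "changed"
    agg["unchanged"] += f["delta"] is None
    agg["fail_new"] += f["status"] == "FAIL" and f["delta"] == "new"
    agg["muted_new"] += f["status"] == "MUTED" and f["delta"] == "new"
```

This is only a semantic sketch; in the task the same counts are computed in a single grouped SQL query.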
@@ -1,4 +1,3 @@
 from datetime import datetime, timedelta, timezone
 from pathlib import Path
 from shutil import rmtree

@@ -22,7 +21,6 @@ from api.db_utils import rls_transaction
 from api.decorators import set_tenant
 from api.models import Finding, Provider, Scan, ScanSummary, StateChoices
 from api.utils import initialize_prowler_provider
 from api.v1.serializers import ScanTaskSerializer
 from prowler.lib.outputs.finding import Finding as FindingOutput

 logger = get_task_logger(__name__)
@@ -45,10 +43,9 @@ def check_provider_connection_task(provider_id: str):
     return check_provider_connection(provider_id=provider_id)


-@shared_task(
-    base=RLSTask, name="provider-deletion", queue="deletion", autoretry_for=(Exception,)
-)
-def delete_provider_task(provider_id: str, tenant_id: str):
+@shared_task(base=RLSTask, name="provider-deletion", queue="deletion")
+@set_tenant
+def delete_provider_task(provider_id: str):
     """
     Task to delete a specific Provider instance.

@@ -56,7 +53,6 @@ def delete_provider_task(provider_id: str, tenant_id: str):

     Args:
         provider_id (str): The primary key of the `Provider` instance to be deleted.
-        tenant_id (str): Tenant ID the provider belongs to.

     Returns:
         tuple: A tuple containing:
@@ -64,7 +60,7 @@ def delete_provider_task(provider_id: str, tenant_id: str):
         - A dictionary with the count of deleted instances per model,
           including related models if cascading deletes were triggered.
     """
-    return delete_provider(tenant_id=tenant_id, pk=provider_id)
+    return delete_provider(pk=provider_id)


@shared_task(base=RLSTask, name="scan-perform", queue="scans")
@@ -130,43 +126,6 @@ def perform_scheduled_scan_task(self, tenant_id: str, provider_id: str):
     periodic_task_instance = PeriodicTask.objects.get(
         name=f"scan-perform-scheduled-{provider_id}"
     )

-    executed_scan = Scan.objects.filter(
-        tenant_id=tenant_id,
-        provider_id=provider_id,
-        task__task_runner_task__task_id=task_id,
-    ).order_by("completed_at")
-
-    if (
-        Scan.objects.filter(
-            tenant_id=tenant_id,
-            provider_id=provider_id,
-            trigger=Scan.TriggerChoices.SCHEDULED,
-            state=StateChoices.EXECUTING,
-            scheduler_task_id=periodic_task_instance.id,
-            scheduled_at__date=datetime.now(timezone.utc).date(),
-        ).exists()
-        or executed_scan.exists()
-    ):
-        # Duplicated task execution due to visibility timeout or scan is already running
-        logger.warning(f"Duplicated scheduled scan for provider {provider_id}.")
-        try:
-            affected_scan = executed_scan.first()
-            if not affected_scan:
-                raise ValueError(
-                    "Error retrieving affected scan details after detecting duplicated scheduled "
-                    "scan."
-                )
-            # Return the affected scan details to avoid losing data
-            serializer = ScanTaskSerializer(instance=affected_scan)
-        except Exception as duplicated_scan_exception:
-            logger.error(
-                f"Duplicated scheduled scan for provider {provider_id}. Error retrieving affected scan details: "
-                f"{str(duplicated_scan_exception)}"
-            )
-            raise duplicated_scan_exception
-        return serializer.data
-
     next_scan_datetime = get_next_execution_datetime(task_id, provider_id)
     scan_instance, _ = Scan.objects.get_or_create(
         tenant_id=tenant_id,
@@ -174,11 +133,7 @@ def perform_scheduled_scan_task(self, tenant_id: str, provider_id: str):
         trigger=Scan.TriggerChoices.SCHEDULED,
         state__in=(StateChoices.SCHEDULED, StateChoices.AVAILABLE),
         scheduler_task_id=periodic_task_instance.id,
-        defaults={
-            "state": StateChoices.SCHEDULED,
-            "name": "Daily scheduled scan",
-            "scheduled_at": next_scan_datetime - timedelta(days=1),
-        },
+        defaults={"state": StateChoices.SCHEDULED},
     )

     scan_instance.task_id = task_id
@@ -219,7 +174,7 @@ def perform_scan_summary_task(tenant_id: str, scan_id: str):
     return aggregate_findings(tenant_id=tenant_id, scan_id=scan_id)


-@shared_task(name="tenant-deletion", queue="deletion", autoretry_for=(Exception,))
+@shared_task(name="tenant-deletion", queue="deletion")
 def delete_tenant_task(tenant_id: str):
     return delete_tenant(pk=tenant_id)

@@ -246,11 +201,6 @@ def generate_outputs(scan_id: str, provider_id: str, tenant_id: str):
         scan_id (str): The scan identifier.
         provider_id (str): The provider_id id to be used in generating outputs.
     """
-    # Check if the scan has findings
-    if not ScanSummary.objects.filter(scan_id=scan_id).exists():
-        logger.info(f"No findings found for scan {scan_id}")
-        return {"upload": False}

     # Initialize the prowler provider
     prowler_provider = initialize_prowler_provider(Provider.objects.get(id=provider_id))

@@ -9,19 +9,17 @@ from api.models import Provider, Tenant
 class TestDeleteProvider:
     def test_delete_provider_success(self, providers_fixture):
         instance = providers_fixture[0]
-        tenant_id = str(instance.tenant_id)
-        result = delete_provider(tenant_id, instance.id)
+        result = delete_provider(instance.id)

         assert result
         with pytest.raises(ObjectDoesNotExist):
             Provider.objects.get(pk=instance.id)

-    def test_delete_provider_does_not_exist(self, tenants_fixture):
-        tenant_id = str(tenants_fixture[0].id)
+    def test_delete_provider_does_not_exist(self):
         non_existent_pk = "babf6796-cfcc-4fd3-9dcf-88d012247645"

         with pytest.raises(ObjectDoesNotExist):
-            delete_provider(tenant_id, non_existent_pk)
+            delete_provider(non_existent_pk)


@pytest.mark.django_db

@@ -1,4 +1,3 @@
-import json
 import uuid
 from unittest.mock import MagicMock, patch

@@ -8,7 +7,6 @@ from tasks.jobs.scan import (
     _store_resources,
     perform_prowler_scan,
 )
-from tasks.utils import CustomEncoder

 from api.models import (
     Finding,
@@ -109,13 +107,7 @@ class TestPerformScan:
         finding.service_name = "service_name"
         finding.resource_type = "resource_type"
         finding.resource_tags = {"tag1": "value1", "tag2": "value2"}
-        finding.muted = False
         finding.raw = {}
-        finding.resource_metadata = {"test": "metadata"}
-        finding.resource_details = {"details": "test"}
-        finding.partition = "partition"
+        finding.muted = True
         finding.compliance = {"compliance1": "PASS"}

         # Mock the ProwlerScan instance
         mock_prowler_scan_instance = MagicMock()
@@ -153,8 +145,6 @@ class TestPerformScan:
         assert scan_finding.severity == finding.severity
         assert scan_finding.check_id == finding.check_id
         assert scan_finding.raw_result == finding.raw
         assert scan_finding.muted
         assert scan_finding.compliance == finding.compliance

         assert scan_resource.tenant == tenant
         assert scan_resource.uid == finding.resource_uid
@@ -162,11 +152,6 @@ class TestPerformScan:
         assert scan_resource.service == finding.service_name
         assert scan_resource.type == finding.resource_type
         assert scan_resource.name == finding.resource_name
-        assert scan_resource.metadata == json.dumps(
-            finding.resource_metadata, cls=CustomEncoder
-        )
-        assert scan_resource.details == f"{finding.resource_details}"
-        assert scan_resource.partition == finding.partition

         # Assert that the resource tags have been created and associated
         tags = scan_resource.tags.all()

@@ -1,32 +1,9 @@
-import json
 from datetime import datetime, timedelta, timezone
-from enum import Enum

 from django_celery_beat.models import PeriodicTask
 from django_celery_results.models import TaskResult


-class CustomEncoder(json.JSONEncoder):
-    def default(self, o):
-        # Enum serialization
-        if isinstance(o, Enum):
-            return o.value
-        # Datetime and timedelta serialization
-        if isinstance(o, datetime):
-            return o.isoformat(timespec="seconds")
-        if isinstance(o, timedelta):
-            return o.total_seconds()
-
-        # Custom object serialization
-        try:
-            return super().default(o)
-        except TypeError:
-            try:
-                return o.__dict__
-            except AttributeError:
-                return str(o)
-
-
 def get_next_execution_datetime(task_id: int, provider_id: str) -> datetime:
     task_instance = TaskResult.objects.get(task_id=task_id)
     try:

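The `CustomEncoder` listed in this hunk handles the values that `json.dumps` rejects by default (Enums, datetimes, timedeltas, and arbitrary objects via `__dict__`/`str`). A self-contained sketch of the same pattern, reimplemented here rather than imported from the codebase:

```python
import json
from datetime import datetime, timedelta, timezone
from enum import Enum


class CustomEncoder(json.JSONEncoder):
    """Fallback chain: Enum -> value, datetime -> ISO string, timedelta -> seconds,
    then the default encoder, then __dict__, then str()."""

    def default(self, o):
        if isinstance(o, Enum):
            return o.value
        if isinstance(o, datetime):
            return o.isoformat(timespec="seconds")
        if isinstance(o, timedelta):
            return o.total_seconds()
        try:
            return super().default(o)
        except TypeError:
            try:
                return o.__dict__
            except AttributeError:
                return str(o)


class State(Enum):
    SCHEDULED = "scheduled"


payload = {
    "state": State.SCHEDULED,
    "at": datetime(2024, 1, 2, tzinfo=timezone.utc),
    "every": timedelta(days=1),
}
encoded = json.dumps(payload, cls=CustomEncoder)
```

Subclassing `JSONEncoder.default` is the standard hook: the base implementation raises `TypeError`, so anything the overrides do not recognize still fails loudly unless caught, as here.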
@@ -1,24 +0,0 @@
import warnings

from dashboard.common_methods import get_section_containers_format3

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_DESCRIPTION",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_format3(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )

@@ -59,7 +59,7 @@ export AZURE_CLIENT_SECRET="XXXXXXX"
```

If you try to execute Prowler with the `--sp-env-auth` flag and those variables are empty or not exported, the execution is going to fail.
-Follow the instructions in the [Create Prowler Service Principal](../tutorials/azure/create-prowler-service-principal.md#how-to-create-prowler-service-principal-application) section to create a service principal.
+Follow the instructions in the [Create Prowler Service Principal](../tutorials/azure/create-prowler-service-principal.md#how-to-create-prowler-service-principal) section to create a service principal.

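As a quick pre-flight for `--sp-env-auth`, the required variables can be checked before invoking Prowler. This is an illustrative helper, not part of Prowler; `AZURE_CLIENT_ID` and `AZURE_TENANT_ID` are assumed to accompany the documented `AZURE_CLIENT_SECRET`:

```python
import os

# Service-principal variables expected in the environment (names assumed
# alongside the AZURE_CLIENT_SECRET shown in the docs above).
REQUIRED = ("AZURE_CLIENT_ID", "AZURE_TENANT_ID", "AZURE_CLIENT_SECRET")


def missing_sp_env_vars(env=os.environ):
    """Return the service-principal variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]


# With only the secret set, the other two are reported as missing.
missing = missing_sp_env_vars({"AZURE_CLIENT_SECRET": "xxx"})
```

Running such a check before the scan gives a clearer error than letting the authentication call fail mid-run.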
### AZ CLI / Browser / Managed Identity authentication

@@ -79,7 +79,7 @@ Prowler for Azure needs two types of permission scopes to be set:
???+ note
    Please, notice that the field `assignableScopes` in the JSON custom role file must be changed to be the subscription or management group where the role is going to be assigned. The valid formats for the field are `/subscriptions/<subscription-id>` or `/providers/Microsoft.Management/managementGroups/<management-group-id>`.

-To assign the permissions, follow the instructions in the [Microsoft Entra ID permissions](../tutorials/azure/create-prowler-service-principal.md#assigning-the-proper-permissions) section and the [Azure subscriptions permissions](../tutorials/azure/subscriptions.md#assign-the-appropriate-permissions-to-the-identity-that-is-going-to-be-assumed-by-prowler) section, respectively.
+To assign the permissions, follow the instructions in the [Microsoft Entra ID permissions](../tutorials/azure/create-prowler-service-principal.md#assigning-the-proper-permissions) section and the [Azure subscriptions permissions](../tutorials/azure/subscriptions.md#assigning-proper-permissions) section, respectively.

#### Checks that require ProwlerRole

@@ -136,7 +136,7 @@ Prowler App can be installed in different ways, depending on your environment:

### Prowler CLI Installation

-Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/), thus can be installed as Python package with `Python >= 3.9, <= 3.12`:
+Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/), thus can be installed as Python package with `Python >= 3.9`:

=== "pipx"

@@ -144,7 +144,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/),

_Requirements_:

-* `Python >= 3.9, <= 3.12`
+* `Python >= 3.9`
* `pipx` installed: [pipx installation](https://pipx.pypa.io/stable/installation/).
* AWS, GCP, Azure and/or Kubernetes credentials

@@ -168,7 +168,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/),

_Requirements_:

-* `Python >= 3.9, <= 3.12`
+* `Python >= 3.9`
* `Python pip >= 21.0.0`
* AWS, GCP, Azure, Microsoft365 and/or Kubernetes credentials

@@ -228,7 +228,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/),

_Requirements_:

-* `Python >= 3.9, <= 3.12`
+* `Python >= 3.9`
* AWS, GCP, Azure and/or Kubernetes credentials

_Commands_:

@@ -244,8 +244,8 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/),

_Requirements_:

-* `Ubuntu 23.04` or above, if you are using an older version of Ubuntu check [pipx installation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#__tabbed_1_1) and ensure you have `Python >= 3.9, <= 3.12`.
-* `Python >= 3.9, <= 3.12`
+* `Ubuntu 23.04` or above, if you are using an older version of Ubuntu check [pipx installation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#__tabbed_1_1) and ensure you have `Python >= 3.9`.
+* `Python >= 3.9`
* AWS, GCP, Azure and/or Kubernetes credentials

_Commands_:

@@ -1,214 +0,0 @@
# Getting Started with AWS on Prowler Cloud

<iframe width="560" height="380" src="https://www.youtube-nocookie.com/embed/RPgIWOCERzY" title="Prowler Cloud Onboarding AWS" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>

Set up your AWS account to enable security scanning using Prowler Cloud.

## Requirements

To configure your AWS account, you’ll need:

1. Access to Prowler Cloud
2. Properly configured AWS credentials (either static or via an assumed IAM role)

---

## Step 1: Get Your AWS Account ID

1. Log in to the [AWS Console](https://console.aws.amazon.com)
2. Locate your AWS account ID in the top-right dropdown menu



---

## Step 2: Access Prowler Cloud

1. Navigate to [Prowler Cloud](https://cloud.prowler.com/)
2. Go to `Configuration` > `Cloud Providers`



3. Click `Add Cloud Provider`



4. Select `Amazon Web Services`



5. Enter your AWS Account ID and optionally provide a friendly alias



6. Choose your preferred authentication method (next step)



---

## Step 3: Set Up AWS Authentication

Before proceeding, choose your preferred authentication mode:

Credentials

* Quick scan as current user ✅
* No extra setup ✅
* Credentials time out ❌

Assumed Role

* Preferred Setup ✅
* Permanent Credentials ✅
* Requires access to create role ❌

---

### 🔐 Assume Role (Recommended)



This method grants permanent access and is the recommended setup for production environments.

=== "CloudFormation"

1. Download the [Prowler Scan Role Template](https://raw.githubusercontent.com/prowler-cloud/prowler/refs/heads/master/permissions/templates/cloudformation/prowler-scan-role.yml)




2. Open the [AWS Console](https://console.aws.amazon.com), search for **CloudFormation**



3. Go to **Stacks** and click `Create stack` > `With new resources (standard)`



4. In **Specify Template**, choose `Upload a template file` and select the downloaded file




5. Click `Next`, provide a stack name and the **External ID** shown in the Prowler Cloud setup screen




6. Acknowledge the IAM resource creation warning and proceed



7. Click `Submit` to deploy the stack



=== "Terraform"

To provision the scan role using Terraform:

1. Run the following commands:

```bash
terraform init
terraform plan
terraform apply
```

2. During `plan` and `apply`, you will be prompted for the **External ID**, which is available in the Prowler Cloud UI:



> 💡 Note: Terraform will use the AWS credentials of your default profile.

---

### Finish Setup with Assume Role

8. Once the role is created, go to the **IAM Console**, click on the `ProwlerScan` role to open its details:



9. Copy the **Role ARN**



10. Paste the ARN into the corresponding field in Prowler Cloud



11. Click `Next`, then `Launch Scan`




---

### 🔑 Credentials (Static Access Keys)

You can also configure your AWS account using static credentials (not recommended for long-term use):



=== "Long term credentials"

1. Go to the [AWS Console](https://console.aws.amazon.com), open **CloudShell**



2. Run:

```bash
aws iam create-access-key
```

3. Copy the output containing:

- `AccessKeyId`
- `SecretAccessKey`



> ⚠️ Save these credentials securely and paste them into the Prowler Cloud setup screen.

=== "Short term credentials (Recommended)"

You can use your [AWS Access Portal](https://docs.aws.amazon.com/singlesignon/latest/userguide/howtogetcredentials.html) or the CLI:

1. Retrieve short-term credentials for the IAM identity using this command:

```bash
aws sts get-session-token --duration-seconds 900
```

???+ note
    Check the AWS documentation [here](https://docs.aws.amazon.com/IAM/latest/UserGuide/sts_example_sts_GetSessionToken_section.html)

2. Copy the output containing:

- `AccessKeyId`
- `SecretAccessKey`

> Sample output:
```json
{
  "Credentials": {
    "AccessKeyId": "ASIAIOSFODNN7EXAMPLE",
    "SecretAccessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYzEXAMPLEKEY",
    "SessionToken": "AQoEXAMPLEH4aoAH0gNCAPyJxz4BlCFFxWNE1OPTgk5TthT+FvwqnKwRcOIfrRh3c/LTo6UDdyJwOOvEVPvLXCrrrUtdnniCEXAMPLE/IvU1dYUg2RVAJBanLiHb4IgRmpRV3zrkuWJOgQs8IZZaIv2BXIa2R4OlgkBN9bkUDNCJiBeb/AXlzBBko7b15fjrBs2+cTQtpZ3CYWFXG8C5zqx37wnOE49mRl/+OtkIKGO7fAE",
    "Expiration": "2020-05-19T18:06:10+00:00"
  }
}
```

> ⚠️ Save these credentials securely and paste them into the Prowler Cloud setup screen.

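The sample `get-session-token` output above maps directly onto the standard `AWS_*` environment variables. A small illustrative helper (not part of Prowler) that performs the mapping:

```python
import json

# Shape of the `aws sts get-session-token` output, using the doc's example values.
raw = """{
  "Credentials": {
    "AccessKeyId": "ASIAIOSFODNN7EXAMPLE",
    "SecretAccessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYzEXAMPLEKEY",
    "SessionToken": "AQoEXAMPLE...",
    "Expiration": "2020-05-19T18:06:10+00:00"
  }
}"""


def to_env_vars(sts_json: str) -> dict:
    """Map STS credentials onto the standard AWS_* environment variable names."""
    creds = json.loads(sts_json)["Credentials"]
    return {
        "AWS_ACCESS_KEY_ID": creds["AccessKeyId"],
        "AWS_SECRET_ACCESS_KEY": creds["SecretAccessKey"],
        "AWS_SESSION_TOKEN": creds["SessionToken"],
    }


env = to_env_vars(raw)
```

Exporting these three variables is what lets the AWS CLI and SDKs pick up the short-term session.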
Complete the form in Prowler Cloud and click `Next`



Click `Launch Scan`



@@ -1,171 +0,0 @@
# Getting Started with Azure on Prowler Cloud

<iframe width="560" height="380" src="https://www.youtube-nocookie.com/embed/v1as8vTFlMg" title="Prowler Cloud Onboarding Azure" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>

Set up your Azure subscription to enable security scanning using Prowler Cloud.

## Requirements

To configure your Azure subscription, you’ll need:

1. Get the `Subscription ID`
2. Access to Prowler Cloud
3. Configure authentication in Azure:

    3.1 Create a Service Principal

    3.2 Assign required permissions

    3.3 Assign permissions at the subscription level

4. Add the credentials to Prowler Cloud

---

## Step 1: Get the Subscription ID

1. Go to the [Azure Portal](https://portal.azure.com/#home) and search for `Subscriptions`
2. Locate and copy your Subscription ID




---

## Step 2: Access Prowler Cloud

1. Go to [Prowler Cloud](https://cloud.prowler.com/)
2. Navigate to `Configuration` > `Cloud Providers`



3. Click on `Add Cloud Provider`



4. Select `Microsoft Azure`



5. Add the Subscription ID and an optional alias, then click `Next`



---

## Step 3: Configure the Azure Subscription

### Create the Service Principal

A Service Principal is required to grant Prowler the necessary privileges.

1. Access **Microsoft Entra ID**



2. Navigate to `Manage` > `App registrations`



3. Click `+ New registration`, complete the form, and click `Register`



4. Go to `Certificates & secrets` > `+ New client secret`




5. Fill in the required fields and click `Add`, then copy the generated value

| Value | Description |
|-------|-------------|
| Client ID | Application ID |
| Client Secret | AZURE_CLIENT_SECRET |
| Tenant ID | Azure Active Directory tenant ID |

---

### Assign Required API Permissions

Assign the following Microsoft Graph permissions:

- Directory.Read.All
- Policy.Read.All
- UserAuthenticationMethod.Read.All (optional, for MFA checks)

1. Go to your App Registration > `API permissions`



2. Click `+ Add a permission` > `Microsoft Graph` > `Application permissions`




3. Search and select:

    - `Directory.Read.All`
    - `Policy.Read.All`
    - `UserAuthenticationMethod.Read.All`



4. Click `Add permissions`, then grant admin consent



---

### Assign Permissions at the Subscription Level

1. Download the [Prowler Azure Custom Role](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-azure-custom-role.json)



2. Modify `assignableScopes` to match your Subscription ID (e.g. `/subscriptions/xxxx-xxxx-xxxx-xxxx`)

3. Go to your Azure Subscription > `Access control (IAM)`



4. Click `+ Add` > `Add custom role`, choose "Start from JSON" and upload the modified file



5. Click `Review + Create` to finish



6. Return to `Access control (IAM)` > `+ Add` > `Add role assignment`

    - Assign the `Reader` role
    - Then repeat and assign the custom `ProwlerRole`



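Step 2's edit of the downloaded role file can also be scripted. In this sketch the role document is reduced to the single field that matters here; the real `prowler-azure-custom-role.json` contains additional fields:

```python
# Minimal stand-in for the downloaded custom-role JSON (illustrative only).
role = {
    "Name": "ProwlerRole",
    "assignableScopes": [],
}


def set_assignable_scope(role: dict, subscription_id: str) -> dict:
    """Point assignableScopes at a single subscription, as step 2 requires."""
    role["assignableScopes"] = [f"/subscriptions/{subscription_id}"]
    return role


updated = set_assignable_scope(role, "xxxx-xxxx-xxxx-xxxx")
```

The same edit could target a management group instead by using the `/providers/Microsoft.Management/managementGroups/<id>` format mentioned earlier in the docs.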
---

## Step 4: Add Credentials to Prowler Cloud

1. Go to your App Registration overview and copy the `Client ID` and `Tenant ID`



2. Go to Prowler Cloud and paste:

    - `Client ID`
    - `Tenant ID`
    - `AZURE_CLIENT_SECRET` from earlier



3. Click `Next`



4. Click `Launch Scan`

