Compare commits

`5.7.1...M365-testi` (554 commits)
| Author | SHA1 | Date |
|---|---|---|
| | 221310ba94 | |
| | 33135f6c9a | |
| | 4aaacc9e16 | |
| | 165d09768f | |
| | 6b8af51c23 | |
| | e40591a25f | |
| | d6ac87341b | |
| | d431af6186 | |
| | 7b3b728207 | |
| | cfab4dcef5 | |
| | cd48037670 | |
| | ab7e352029 | |
| | 79ce899812 | |
| | 29d4159b8c | |
| | aaf7188365 | |
| | 6cbba3c5b1 | |
| | 9510b4c3ef | |
| | c7754c4331 | |
| | 7800320189 | |
| | 972c88e421 | |
| | 3ee420769a | |
| | f8d9873655 | |
| | 3f5508847a | |
| | ed54b39211 | |
| | a6dd97b6bc | |
| | b6e1551397 | |
| | c8567b5d2b | |
| | 361269370a | |
| | 3d9d118d2c | |
| | b50a91d3e6 | |
| | 0843c49475 | |
| | 3fe65b7a03 | |
| | 68369ca54f | |
| | 63f4b4f7d4 | |
| | 754a0748da | |
| | aa3182ebc5 | |
| | 32d27df0ba | |
| | 6439f0a5f3 | |
| | 19476632ff | |
| | d4c12e4632 | |
| | 52bd48168f | |
| | c0d935e232 | |
| | 24dfd47329 | |
| | fbae338689 | |
| | 186fd88f8c | |
| | 14ff34c00a | |
| | a66fa394d3 | |
| | 931766fe08 | |
| | c134914896 | |
| | 25dac080a5 | |
| | 910d39eee4 | |
| | d604ae5569 | |
| | 42f46b0fb1 | |
| | abb5864224 | |
| | 2e2a2bd89a | |
| | f8ee841921 | |
| | ceda8c76d2 | |
| | afe0b7443f | |
| | 9b773897d2 | |
| | d6ec4c2c96 | |
| | 14ef169e99 | |
| | 22141f9706 | |
| | a5c6fee5b4 | |
| | d3a5a5c0a1 | |
| | 5d81869de4 | |
| | 73ebf95d89 | |
| | 9f4574f4ff | |
| | cb239b20ab | |
| | 3ef79588b4 | |
| | 61000e386b | |
| | 53cb57901f | |
| | 993ff4d78e | |
| | 8fb10fbbf7 | |
| | 11e834f639 | |
| | 62bf2fbb9c | |
| | e57930d6c2 | |
| | e0c417a466 | |
| | b55f8efed1 | |
| | 7cbc60d977 | |
| | 5b7912b558 | |
| | 57fca3e54d | |
| | e31c27b123 | |
| | 74f1da818e | |
| | 910cfa601b | |
| | fe321c3f8a | |
| | 43de0d405f | |
| | ac6ed31c8e | |
| | 9d47437de4 | |
| | eb7a62ff77 | |
| | 67bc16b46d | |
| | 8552a578a0 | |
| | a5d277e045 | |
| | 6dbf2ac606 | |
| | b1569ac2f3 | |
| | 3d0145b522 | |
| | 44174526d6 | |
| | 0fd395ea83 | |
| | 5e9d4a80a1 | |
| | e4d234fe03 | |
| | 3202184718 | |
| | 41e576f4f1 | |
| | d8dce07019 | |
| | 2b0a3144c7 | |
| | 62fbce0b5e | |
| | 5a59bb335c | |
| | 2719991630 | |
| | 6a3b8c4674 | |
| | 191fbf0177 | |
| | 228dd2952a | |
| | 97db38aa25 | |
| | dc953a6e22 | |
| | 51e796a48d | |
| | 024f1425df | |
| | a7ed610da9 | |
| | 7ba99f22cd | |
| | b8ce09ec34 | |
| | c243110a49 | |
| | ee27636f32 | |
| | f2f41c9c44 | |
| | 9312890e6a | |
| | 9578281b4f | |
| | 08690068fc | |
| | e06a33de84 | |
| | 6a3db10fda | |
| | bbed445efa | |
| | 9d65fb0bf2 | |
| | 34f03ca110 | |
| | 87c038f0c2 | |
| | b3014f03b1 | |
| | d39598c9fc | |
| | 5ea9106259 | |
| | bcc0b59de1 | |
| | 5d6ed640f0 | |
| | dd1cc2d025 | |
| | 52e5cc23e4 | |
| | 76a8e2be1f | |
| | d989425490 | |
| | 1e324b7ed2 | |
| | e68aa62f94 | |
| | 332b98a1ab | |
| | dd05ef7974 | |
| | d6862766d3 | |
| | f52d005e2d | |
| | bf475234a5 | |
| | cd5985c056 | |
| | ce33dbf823 | |
| | 0a9d0688a7 | |
| | 24784f2ce5 | |
| | 7a1e611b88 | |
| | 3073150008 | |
| | 9923def4cb | |
| | a7f612303f | |
| | 64c2a2217a | |
| | 4689d7a952 | |
| | 87cd143967 | |
| | e37fd05d58 | |
| | acc708bda5 | |
| | c7460bb69c | |
| | 84b273dab9 | |
| | bb7ce2157e | |
| | 07b9e1d3a4 | |
| | 96a879d761 | |
| | 283127c3f4 | |
| | beeee80a0b | |
| | 06b62826b4 | |
| | d0736af209 | |
| | 716c8c1a5f | |
| | e6cdda1bd9 | |
| | 2747a633bc | |
| | 74118f5cfe | |
| | 598bdf28bb | |
| | d75f681c87 | |
| | c7956ede6a | |
| | 64f5a69e84 | |
| | bfb15c34b8 | |
| | 638b3ac0cd | |
| | 9d6147a037 | |
| | 802c786ac2 | |
| | c8be8dbd9a | |
| | 7053b2bb37 | |
| | 447bf832cd | |
| | 7c4571b55e | |
| | eb7c16aba5 | |
| | b09e83b171 | |
| | bb149a30a7 | |
| | d5be35af49 | |
| | f6aa56d92b | |
| | 6a4df15c47 | |
| | 72de5fdb1b | |
| | a7f55d06af | |
| | 97da78d4e7 | |
| | c4f6161c73 | |
| | db7ffea24d | |
| | 489b5abf82 | |
| | 3a55c2ee07 | |
| | 64d866271c | |
| | 1ab2a80eab | |
| | 89d4c521ba | |
| | f2e19d377a | |
| | 2b7b887b87 | |
| | 44c70b5d01 | |
| | 7514484c42 | |
| | 9594c4c99f | |
| | 56445c9753 | |
| | 07419fd5e1 | |
| | 2e4dd12b41 | |
| | fed2046c49 | |
| | db79db4786 | |
| | 6f027e3c57 | |
| | bdb877009f | |
| | 6564ec1ff5 | |
| | 443dc067b3 | |
| | 6221650c5f | |
| | 034d0fd1f4 | |
| | e617ff0460 | |
| | 4b1ed607a7 | |
| | 137365a670 | |
| | 1891a1b24f | |
| | e57e070866 | |
| | 66998cd1ad | |
| | c0b1833446 | |
| | 329a72c77c | |
| | 2610ee9d0c | |
| | a13ca9034e | |
| | 5d1abb3689 | |
| | e1d1c6d154 | |
| | e18e0e7cd4 | |
| | eaf3d07a3f | |
| | c88ae32b7f | |
| | 605613e220 | |
| | d2772000ec | |
| | 42939a79f5 | |
| | ed17931117 | |
| | 66df5f7a1c | |
| | fc6e6696e5 | |
| | 465748c8a1 | |
| | e59cd71bbf | |
| | 8a76fea310 | |
| | 0e46be54ec | |
| | dc81813fdf | |
| | eaa0df16bb | |
| | c23e911028 | |
| | 06b96a1007 | |
| | fa545c591f | |
| | e828b780c7 | |
| | eca8c5cabd | |
| | b7bce6008f | |
| | 2fdf89883d | |
| | 6c5d4bbaaa | |
| | cb2f926d4f | |
| | 12c01b437e | |
| | 3253a58942 | |
| | 199f7f14ea | |
| | d42406d765 | |
| | 2276ffb1f6 | |
| | 218fb3afb0 | |
| | a9fb890979 | |
| | 54ebf5b455 | |
| | c9a0475aa8 | |
| | 5567d9f88c | |
| | 56f3e661ae | |
| | 1aa4479a10 | |
| | 7b625d0a91 | |
| | fd0529529d | |
| | af43191954 | |
| | 2ce2ca7c91 | |
| | a0fc3db665 | |
| | feb458027f | |
| | e5a5b7af5c | |
| | ad456ae2fe | |
| | 690cb51f6c | |
| | 14aaa2f376 | |
| | 6e47ca2c41 | |
| | 0d99d2be9b | |
| | c322ef00e7 | |
| | 3513421225 | |
| | b0e6bfbefe | |
| | f7a918730e | |
| | cef33319c5 | |
| | 2036a59210 | |
| | e5eccb6227 | |
| | 48c2c8567c | |
| | bbeef0299f | |
| | bec5584d63 | |
| | bdc759d34c | |
| | 8db442d8ba | |
| | 9e7a0d4175 | |
| | 9c33b3f5a9 | |
| | 7e7e2c87dc | |
| | 2f741f35a8 | |
| | c411466df7 | |
| | 9679939307 | |
| | 8539423b22 | |
| | 81edafdf09 | |
| | e0a262882a | |
| | 89237ab99e | |
| | 0f414e451e | |
| | 1180522725 | |
| | 81c7ebf123 | |
| | 258f05e6f4 | |
| | 53efb1c153 | |
| | 26014a9705 | |
| | 00ef037e45 | |
| | 669ec74e67 | |
| | c4528200b0 | |
| | ba7cd0250a | |
| | c5e97678a1 | |
| | 337a46cdcc | |
| | 7f74b67f1f | |
| | 5dcc48d2e5 | |
| | 8b04aab07d | |
| | eab4f6cf2e | |
| | 7f8d623283 | |
| | dbffed8f1f | |
| | 7e3688fdd0 | |
| | 2e111e9ad3 | |
| | 6d6070ff3f | |
| | 391bbde353 | |
| | 3c56eb3762 | |
| | 7c14ea354b | |
| | c96aad0b77 | |
| | a9dd3e424b | |
| | 8a144a4046 | |
| | 75f86d7267 | |
| | bbf875fc2f | |
| | 59d491f61b | |
| | ed640a1324 | |
| | e86fbcaef7 | |
| | 7f48212054 | |
| | a2c5c71baf | |
| | b904f81cb9 | |
| | d64fe374dd | |
| | fe25e7938e | |
| | 931df361bf | |
| | d7c45f4aee | |
| | 5e5bef581b | |
| | 2d9e95d812 | |
| | e5f979d106 | |
| | c7a5815203 | |
| | 03e268722e | |
| | 78a2774329 | |
| | c1b5ab7f53 | |
| | b861d97ad4 | |
| | f3abcc9dd6 | |
| | cab13fe018 | |
| | cc4b19c7ce | |
| | a754d9aee5 | |
| | 22b54b2d8d | |
| | d12ca6301a | |
| | bc1b2ad9ab | |
| | 1782ab1514 | |
| | 0384fc50e3 | |
| | cc46dee9ee | |
| | ed5a0ae45a | |
| | 928ccfefb8 | |
| | 7f6bfb7b3e | |
| | bcbc9bf675 | |
| | 0ec4366f4c | |
| | ff72b7eea1 | |
| | a32ca19251 | |
| | b79508956a | |
| | d76c5bd658 | |
| | 580e11126c | |
| | 736d40546a | |
| | 88810d2bb5 | |
| | 3a8f4d2ffb | |
| | 1fe125a65f | |
| | 0ff4df0836 | |
| | 16b4775e2d | |
| | c3a13b8a29 | |
| | d1053375b7 | |
| | 0fa4538256 | |
| | 738644f288 | |
| | 2f80b055ac | |
| | fd62a1df10 | |
| | a85d0ebd0a | |
| | 2c06902baa | |
| | 76ac6429fe | |
| | 43cae66b0d | |
| | dacddecc7d | |
| | dcb9267c2f | |
| | ff35fd90fa | |
| | 7469377079 | |
| | c8441f8d38 | |
| | abf4eb0ffc | |
| | 93717cc830 | |
| | b629bc81f8 | |
| | f628897fe1 | |
| | 54b82a78e3 | |
| | 377faf145f | |
| | 69e316948f | |
| | 62cbff4f53 | |
| | 5582265e9d | |
| | fb5ea3c324 | |
| | 9b5f676f50 | |
| | 88cfc0fa7e | |
| | 665bfa2f13 | |
| | b89b1a64f4 | |
| | 9ba657c261 | |
| | bce958b8e6 | |
| | 914012de2b | |
| | 8d1c476aed | |
| | 567c729e9e | |
| | 3f03dd20e4 | |
| | 1c778354da | |
| | 3a149fa459 | |
| | f3b121950d | |
| | 43c13b7ba1 | |
| | 9447b33800 | |
| | 2934752eeb | |
| | dd6d8c71fd | |
| | 80267c389b | |
| | acfbaf75d5 | |
| | 5f54377407 | |
| | 552aa64741 | |
| | d64f611f51 | |
| | a96cc92d77 | |
| | 3858cccc41 | |
| | 072828512a | |
| | a73ffe5642 | |
| | 8e784a5b6d | |
| | 1b6f9332f1 | |
| | db8b472729 | |
| | 867b371522 | |
| | c0d7c9fc7d | |
| | bb4685cf90 | |
| | 6a95426749 | |
| | ef6af8e84d | |
| | 763130f253 | |
| | 1256c040e9 | |
| | 18b7b48a99 | |
| | 627c11503f | |
| | 712ba84f06 | |
| | 5186e029b3 | |
| | 5bfaedf903 | |
| | 5061da6897 | |
| | c159a28016 | |
| | 82a1b1c921 | |
| | bf2210d0f4 | |
| | 8f0772cb94 | |
| | 5b57079ecd | |
| | 350d759517 | |
| | edd793c9f5 | |
| | 545c2dc685 | |
| | 84955c066c | |
| | 06dd03b170 | |
| | 47bc2ed2dc | |
| | 44281afc54 | |
| | 4d2859d145 | |
| | 45d44a1669 | |
| | ddd83b340e | |
| | ccdb54d7c3 | |
| | bcc246d950 | |
| | 62139e252a | |
| | 86950c3a0a | |
| | f4865ef68d | |
| | ea7209e7ae | |
| | 998c551cf3 | |
| | e6f29b0116 | |
| | eb90bb39dc | |
| | ad189b35ad | |
| | 7d2989a233 | |
| | 862137ae7d | |
| | c86e082d9a | |
| | 80fe048f97 | |
| | f2bffb3ce7 | |
| | cbe2f9eef8 | |
| | 688f41f570 | |
| | a29197637e | |
| | 7a2712a37f | |
| | 189f5cfd8c | |
| | e509480892 | |
| | 7f7955351a | |
| | 46f1db21a8 | |
| | fbe7bc6951 | |
| | f658507847 | |
| | 374078683b | |
| | 114c4e0886 | |
| | 67c62766d4 | |
| | 3f2947158d | |
| | 278a7cb356 | |
| | 890158a79c | |
| | 4dc1602b77 | |
| | bbba0abac9 | |
| | d04fd807c6 | |
| | 3456df4cf1 | |
| | f56aaa791e | |
| | 465a758770 | |
| | 0f7c0c1b2c | |
| | bf8d10b6f6 | |
| | 20d04553d6 | |
| | b56d62e3c4 | |
| | 9a332dcba1 | |
| | 166d9f8823 | |
| | 42f5eed75f | |
| | 01a7db18dd | |
| | d4507465a3 | |
| | 3ac92ed10a | |
| | 43c76ca85c | |
| | 54d87fa96a | |
| | f041f17268 | |
| | 31c80a6967 | |
| | 783ce136f4 | |
| | f829145781 | |
| | 389337f8cd | |
| | a0713c2d66 | |
| | f94d3cbce4 | |
| | 8d8994b468 | |
| | 784a9097a5 | |
| | b9601626e3 | |
| | dc80b011f2 | |
| | ee7d32d460 | |
| | 43fd9ee94e | |
| | 8821a91f3f | |
| | 98d9256f92 | |
| | b35495eaa7 | |
| | 74d6b614b3 | |
| | dd63c16a74 | |
| | 4280266a96 | |
| | b1f02098ff | |
| | 95189b574a | |
| | c5d23503bf | |
| | 77950f6069 | |
| | ec5f2b3753 | |
| | 9e7104fb7f | |
| | 6b3b6ca45e | |
| | 20b8b0b24e | |
| | 4e11540458 | |
| | ee87f2676d | |
| | 74a90aab98 | |
| | 48ff9a5100 | |
| | 3dfd578ee5 | |
| | 0db46cdc81 | |
| | fdac58d031 | |
| | df9d4ce856 | |
| | e6ae4e97e8 | |
| | 10a4c28922 | |
| | 8a828c6e51 | |
| | d7b40905ff | |
| | f9a3b5f3cd | |
| | b73b89242f | |
| | 23a0f6e8de | |
| | 87967abc3f | |
| | ce60c286dc | |
| | 90fd9b0eb8 | |
| | ca262a6797 | |
| | c056d39775 | |
| | 1c4426ea4b | |
| | 36520bd7a1 | |
| | badf0ace76 | |
| | f1f61249e0 | |
| | b371cac18c | |
| | 1846535d8d | |
| | d7d9118b9b | |
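The same comparison can be reproduced locally with plain git; the sketch below assumes both refs (`5.7.1` and `M365-testi`) have been fetched into your clone.

```bash
# List the commits on M365-testi that are not on 5.7.1, as the compare page does
git fetch origin --tags
git log --oneline 5.7.1..M365-testi
# Per-file change counts, matching the file sections below
# (GitHub compares against the merge base, hence the three dots)
git diff --stat 5.7.1...M365-testi
```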
**.env** (2 changes)

```
@@ -123,7 +123,7 @@ SENTRY_ENVIRONMENT=local
SENTRY_RELEASE=local

#### Prowler release version ####
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.5.1
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.6.0

# Social login credentials
SOCIAL_GOOGLE_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/google"
```
**.github/pull_request_template.md** (1 change, vendored)

```
@@ -16,6 +16,7 @@ Please include a summary of the change and which issue is fixed. List any depend
- [ ] Review if code is being documented following this specification https://github.com/google/styleguide/blob/gh-pages/pyguide.md#38-comments-and-docstrings
- [ ] Review if backport is needed.
- [ ] Review if is needed to change the [Readme.md](https://github.com/prowler-cloud/prowler/blob/master/README.md)
- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/prowler/CHANGELOG.md), if applicable.

#### API
- [ ] Verify if API specs need to be regenerated.

@@ -81,7 +81,7 @@ jobs:
- name: Build and push container image (latest)
# Comment the following line for testing
if: github.event_name == 'push'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
# Set push: false for testing
@@ -94,7 +94,7 @@ jobs:

- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
push: true
```
**.github/workflows/api-codeql.yml** (4 changes, vendored)

```
@@ -48,12 +48,12 @@ jobs:

# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/api-codeql-config.yml

- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
category: "/language:${{matrix.language}}"
```
**.github/workflows/api-pull-request.yml** (20 changes, vendored)

```
@@ -85,14 +85,6 @@ jobs:
api/README.md
api/mkdocs.yml

- name: Replace @master with current branch in pyproject.toml
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
BRANCH_NAME="${GITHUB_HEAD_REF:-${GITHUB_REF_NAME}}"
echo "Using branch: $BRANCH_NAME"
sed -i "s|@master|@$BRANCH_NAME|g" pyproject.toml

- name: Install poetry
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
@@ -100,15 +92,9 @@
python -m pip install --upgrade pip
pipx install poetry==2.1.1

- name: Update poetry.lock after the branch name change
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry lock

- name: Set up Python ${{ matrix.python-version }}
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ matrix.python-version }}
cache: "poetry"
@@ -159,7 +145,7 @@
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run safety check --ignore 70612,66963,74429
poetry run safety check --ignore 70612,66963

- name: Vulture
working-directory: ./api
@@ -193,7 +179,7 @@
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build Container
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.API_WORKING_DIR }}
push: false
```
**.github/workflows/find-secrets.yml** (2 changes, vendored)

```
@@ -11,7 +11,7 @@ jobs:
with:
fetch-depth: 0
- name: TruffleHog OSS
uses: trufflesecurity/trufflehog@b06f6d72a3791308bb7ba59c2b8cb7a083bd17e4 # v3.88.26
uses: trufflesecurity/trufflehog@690e5c7aff8347c3885096f3962a0633d9129607 # v3.88.23
with:
path: ./
base: ${{ github.event.repository.default_branch }}
```
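Throughout these workflows, both sides pin every action to a full commit SHA with the release tag kept in a trailing comment. As a hypothetical sketch of how one can look up the SHA behind a tag before pinning it (assumes the GitHub CLI `gh` is installed and authenticated; annotated tags may return a tag object that needs one more dereference):

```bash
# Resolve the commit that the v3.88.26 tag points at
gh api repos/trufflesecurity/trufflehog/git/ref/tags/v3.88.26 --jq '.object.sha'
```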
**.github/workflows/pull-request-merged.yml** (5 changes, vendored)

```
@@ -12,8 +12,6 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
ref: ${{ github.event.pull_request.merge_commit_sha }}

- name: Set short git commit SHA
id: vars
@@ -32,6 +30,5 @@
"PROWLER_COMMIT_SHORT_SHA": "${{ env.SHORT_SHA }}",
"PROWLER_PR_TITLE": "${{ github.event.pull_request.title }}",
"PROWLER_PR_LABELS": ${{ toJson(github.event.pull_request.labels.*.name) }},
"PROWLER_PR_BODY": ${{ toJson(github.event.pull_request.body) }},
"PROWLER_PR_URL":${{ toJson(github.event.pull_request.html_url) }}
"PROWLER_PR_BODY": ${{ toJson(github.event.pull_request.body) }}
}'
@@ -62,7 +62,7 @@
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

- name: Setup Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ env.PYTHON_VERSION }}
@@ -127,7 +127,7 @@

- name: Build and push container image (latest)
if: github.event_name == 'push'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
push: true
tags: |
@@ -140,7 +140,7 @@

- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
# Use local context to get changes
# https://github.com/docker/build-push-action#path-context
```
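Because the dispatch payload above is assembled by string interpolation, adding or removing a field such as `PROWLER_PR_URL` can easily break the JSON. A quick sanity check, assuming `jq` is available and the assembled payload sits in `$PAYLOAD`:

```bash
# jq exits non-zero on malformed JSON; `empty` produces no output on success
printf '%s' "$PAYLOAD" | jq empty && echo "payload is valid JSON"
```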
**.github/workflows/sdk-bump-version.yml** (145 changes, vendored)

Its single hunk, `@@ -1,145 +0,0 @@`, removes the entire workflow; the deleted file read:

```yaml
name: SDK - Bump Version

on:
  release:
    types: [published]

env:
  PROWLER_VERSION: ${{ github.event.release.tag_name }}
  BASE_BRANCH: master

jobs:
  bump-version:
    name: Bump Version
    if: github.repository == 'prowler-cloud/prowler'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Get Prowler version
        shell: bash
        run: |
          if [[ $PROWLER_VERSION =~ ^([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
            MAJOR_VERSION=${BASH_REMATCH[1]}
            MINOR_VERSION=${BASH_REMATCH[2]}
            FIX_VERSION=${BASH_REMATCH[3]}

            # Export version components to GitHub environment
            echo "MAJOR_VERSION=${MAJOR_VERSION}" >> "${GITHUB_ENV}"
            echo "MINOR_VERSION=${MINOR_VERSION}" >> "${GITHUB_ENV}"
            echo "FIX_VERSION=${FIX_VERSION}" >> "${GITHUB_ENV}"

            if (( MAJOR_VERSION == 5 )); then
              if (( FIX_VERSION == 0 )); then
                echo "Minor Release: $PROWLER_VERSION"

                # Set up next minor version for master
                BUMP_VERSION_TO=${MAJOR_VERSION}.$((MINOR_VERSION + 1)).${FIX_VERSION}
                echo "BUMP_VERSION_TO=${BUMP_VERSION_TO}" >> "${GITHUB_ENV}"

                TARGET_BRANCH=${BASE_BRANCH}
                echo "TARGET_BRANCH=${TARGET_BRANCH}" >> "${GITHUB_ENV}"

                # Set up patch version for version branch
                PATCH_VERSION_TO=${MAJOR_VERSION}.${MINOR_VERSION}.1
                echo "PATCH_VERSION_TO=${PATCH_VERSION_TO}" >> "${GITHUB_ENV}"

                VERSION_BRANCH=v${MAJOR_VERSION}.${MINOR_VERSION}
                echo "VERSION_BRANCH=${VERSION_BRANCH}" >> "${GITHUB_ENV}"

                echo "Bumping to next minor version: ${BUMP_VERSION_TO} in branch ${TARGET_BRANCH}"
                echo "Bumping to next patch version: ${PATCH_VERSION_TO} in branch ${VERSION_BRANCH}"
              else
                echo "Patch Release: $PROWLER_VERSION"

                BUMP_VERSION_TO=${MAJOR_VERSION}.${MINOR_VERSION}.$((FIX_VERSION + 1))
                echo "BUMP_VERSION_TO=${BUMP_VERSION_TO}" >> "${GITHUB_ENV}"

                TARGET_BRANCH=v${MAJOR_VERSION}.${MINOR_VERSION}
                echo "TARGET_BRANCH=${TARGET_BRANCH}" >> "${GITHUB_ENV}"

                echo "Bumping to next patch version: ${BUMP_VERSION_TO} in branch ${TARGET_BRANCH}"
              fi
            else
              echo "Releasing another Prowler major version, aborting..."
              exit 1
            fi
          else
            echo "Invalid version syntax: '$PROWLER_VERSION' (must be N.N.N)" >&2
            exit 1
          fi

      - name: Bump versions in files
        run: |
          echo "Using PROWLER_VERSION=$PROWLER_VERSION"
          echo "Using BUMP_VERSION_TO=$BUMP_VERSION_TO"

          set -e

          echo "Bumping version in pyproject.toml ..."
          sed -i "s|version = \"${PROWLER_VERSION}\"|version = \"${BUMP_VERSION_TO}\"|" pyproject.toml

          echo "Bumping version in prowler/config/config.py ..."
          sed -i "s|prowler_version = \"${PROWLER_VERSION}\"|prowler_version = \"${BUMP_VERSION_TO}\"|" prowler/config/config.py

          echo "Bumping version in .env ..."
          sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PROWLER_VERSION}|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${BUMP_VERSION_TO}|" .env

          git --no-pager diff

      - name: Create Pull Request
        uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
        with:
          author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
          token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
          base: ${{ env.TARGET_BRANCH }}
          commit-message: "chore(release): Bump version to v${{ env.BUMP_VERSION_TO }}"
          branch: "version-bump-to-v${{ env.BUMP_VERSION_TO }}"
          title: "chore(release): Bump version to v${{ env.BUMP_VERSION_TO }}"
          body: |
            ### Description

            Bump Prowler version to v${{ env.BUMP_VERSION_TO }}

            ### License

            By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

      - name: Handle patch version for minor release
        if: env.FIX_VERSION == '0'
        run: |
          echo "Using PROWLER_VERSION=$PROWLER_VERSION"
          echo "Using PATCH_VERSION_TO=$PATCH_VERSION_TO"

          set -e

          echo "Bumping version in pyproject.toml ..."
          sed -i "s|version = \"${PROWLER_VERSION}\"|version = \"${PATCH_VERSION_TO}\"|" pyproject.toml

          echo "Bumping version in prowler/config/config.py ..."
          sed -i "s|prowler_version = \"${PROWLER_VERSION}\"|prowler_version = \"${PATCH_VERSION_TO}\"|" prowler/config/config.py

          echo "Bumping version in .env ..."
          sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PROWLER_VERSION}|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PATCH_VERSION_TO}|" .env

          git --no-pager diff

      - name: Create Pull Request for patch version
        if: env.FIX_VERSION == '0'
        uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
        with:
          author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
          token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
          base: ${{ env.VERSION_BRANCH }}
          commit-message: "chore(release): Bump version to v${{ env.PATCH_VERSION_TO }}"
          branch: "version-bump-to-v${{ env.PATCH_VERSION_TO }}"
          title: "chore(release): Bump version to v${{ env.PATCH_VERSION_TO }}"
          body: |
            ### Description

            Bump Prowler version to v${{ env.PATCH_VERSION_TO }}

            ### License

            By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
```
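For reference, the version arithmetic that the removed workflow performed can be exercised on its own. A minimal sketch using the same bash regex, omitting the repository and major-version gates (the tag value here is hypothetical):

```bash
PROWLER_VERSION=5.7.0   # hypothetical release tag
if [[ $PROWLER_VERSION =~ ^([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
  MAJOR=${BASH_REMATCH[1]}; MINOR=${BASH_REMATCH[2]}; FIX=${BASH_REMATCH[3]}
  if (( FIX == 0 )); then
    # Minor release: bump master to the next minor, open the patch line at .1
    echo "bump master to ${MAJOR}.$((MINOR + 1)).0; open v${MAJOR}.${MINOR} at ${MAJOR}.${MINOR}.1"
  else
    # Patch release: bump the version branch to the next fix number
    echo "bump v${MAJOR}.${MINOR} to ${MAJOR}.${MINOR}.$((FIX + 1))"
  fi
else
  echo "Invalid version syntax: '$PROWLER_VERSION' (must be N.N.N)" >&2
fi
```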
**.github/workflows/sdk-codeql.yml** (4 changes, vendored)

```
@@ -56,12 +56,12 @@ jobs:

# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/sdk-codeql-config.yml

- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
category: "/language:${{matrix.language}}"
```
**.github/workflows/sdk-pull-request.yml** (119 changes, vendored)

```
@@ -51,7 +51,7 @@

- name: Set up Python ${{ matrix.python-version }}
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ matrix.python-version }}
cache: "poetry"
@@ -107,123 +107,11 @@
run: |
/tmp/hadolint Dockerfile --ignore=DL3013

# Test AWS
- name: AWS - Check if any file has changed
id: aws-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/aws/**
./tests/providers/aws/**
.poetry.lock

- name: AWS - Test
if: steps.aws-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml tests/providers/aws

# Test Azure
- name: Azure - Check if any file has changed
id: azure-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/azure/**
./tests/providers/azure/**
.poetry.lock

- name: Azure - Test
if: steps.azure-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/azure --cov-report=xml:azure_coverage.xml tests/providers/azure

# Test GCP
- name: GCP - Check if any file has changed
id: gcp-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/gcp/**
./tests/providers/gcp/**
.poetry.lock

- name: GCP - Test
if: steps.gcp-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/gcp --cov-report=xml:gcp_coverage.xml tests/providers/gcp

# Test Kubernetes
- name: Kubernetes - Check if any file has changed
id: kubernetes-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/kubernetes/**
./tests/providers/kubernetes/**
.poetry.lock

- name: Kubernetes - Test
if: steps.kubernetes-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/kubernetes --cov-report=xml:kubernetes_coverage.xml tests/providers/kubernetes

# Test GitHub
- name: GitHub - Check if any file has changed
id: github-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/github/**
./tests/providers/github/**
.poetry.lock

- name: GitHub - Test
if: steps.github-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/github --cov-report=xml:github_coverage.xml tests/providers/github

# Test NHN
- name: NHN - Check if any file has changed
id: nhn-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/nhn/**
./tests/providers/nhn/**
.poetry.lock

- name: NHN - Test
if: steps.nhn-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/nhn --cov-report=xml:nhn_coverage.xml tests/providers/nhn

# Test M365
- name: M365 - Check if any file has changed
id: m365-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/m365/**
./tests/providers/m365/**
.poetry.lock

- name: M365 - Test
if: steps.m365-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/m365 --cov-report=xml:m365_coverage.xml tests/providers/m365

# Common Tests
- name: Lib - Test
- name: Test with pytest
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/lib --cov-report=xml:lib_coverage.xml tests/lib
poetry run pytest -n auto --cov=./prowler --cov-report=xml tests

- name: Config - Test
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/config --cov-report=xml:config_coverage.xml tests/config

# Codecov
- name: Upload coverage reports to Codecov
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: codecov/codecov-action@ad3126e916f78f00edff4ed0317cf185271ccc2d # v5.4.2
@@ -231,4 +119,3 @@
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
flags: prowler
files: ./aws_coverage.xml,./azure_coverage.xml,./gcp_coverage.xml,./kubernetes_coverage.xml,./github_coverage.xml,./nhn_coverage.xml,./m365_coverage.xml,./lib_coverage.xml,./config_coverage.xml
```
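Each provider block follows the same gate: run a provider's tests only when its files changed. The gate can be approximated locally with `git diff --quiet`; a sketch using the AWS paths from the workflow above (the comparison ref `origin/master` is an assumption):

```bash
# Exit status of `git diff --quiet` is non-zero when the listed paths differ
if ! git diff --quiet origin/master -- prowler/providers/aws tests/providers/aws poetry.lock; then
  poetry run pytest -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml tests/providers/aws
fi
```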
**.github/workflows/sdk-pypi-release.yml** (2 changes, vendored)

```
@@ -71,7 +71,7 @@
pipx install poetry==2.1.1

- name: Setup Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ env.PYTHON_VERSION }}
# cache: ${{ env.CACHE }}

@@ -28,7 +28,7 @@ jobs:
ref: ${{ env.GITHUB_BRANCH }}

- name: setup python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: 3.9 #install the python needed

@@ -81,7 +81,7 @@
- name: Build and push container image (latest)
# Comment the following line for testing
if: github.event_name == 'push'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
@@ -96,7 +96,7 @@

- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
```
**.github/workflows/ui-codeql.yml** (4 changes, vendored)

```
@@ -48,12 +48,12 @@ jobs:

# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/ui-codeql-config.yml

- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
category: "/language:${{matrix.language}}"
```
**.github/workflows/ui-pull-request.yml** (2 changes, vendored)

```
@@ -50,7 +50,7 @@
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build Container
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.UI_WORKING_DIR }}
# Always build using `prod` target
```
**.gitignore** (3 changes, vendored)

```
@@ -42,9 +42,6 @@ junit-reports/
# VSCode files
.vscode/

# Cursor files
.cursorignore

# Terraform
.terraform*
*.tfstate

@@ -115,7 +115,7 @@ repos:
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'safety check --ignore 70612,66963,74429'
entry: bash -c 'safety check --ignore 70612,66963'
language: system

- id: vulture
```
**Dockerfile** (42 changes)

```
@@ -1,43 +1,24 @@
FROM python:3.12.10-slim-bookworm AS build
FROM python:3.12.10-alpine3.20

LABEL maintainer="https://github.com/prowler-cloud/prowler"
LABEL org.opencontainers.image.source="https://github.com/prowler-cloud/prowler"

ARG POWERSHELL_VERSION=7.5.0

# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends wget libicu72 \
&& rm -rf /var/lib/apt/lists/*

# Install PowerShell
RUN ARCH=$(uname -m) && \
if [ "$ARCH" = "x86_64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-x64.tar.gz -O /tmp/powershell.tar.gz ; \
elif [ "$ARCH" = "aarch64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-arm64.tar.gz -O /tmp/powershell.tar.gz ; \
else \
echo "Unsupported architecture: $ARCH" && exit 1 ; \
fi && \
mkdir -p /opt/microsoft/powershell/7 && \
tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7 && \
chmod +x /opt/microsoft/powershell/7/pwsh && \
ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh && \
rm /tmp/powershell.tar.gz

# Add prowler user
RUN addgroup --gid 1000 prowler && \
adduser --uid 1000 --gid 1000 --disabled-password --gecos "" prowler
# Update system dependencies and install essential tools
#hadolint ignore=DL3018
RUN apk --no-cache upgrade && apk --no-cache add curl git gcc python3-dev musl-dev linux-headers

# Create non-root user
RUN mkdir -p /home/prowler && \
echo 'prowler:x:1000:1000:prowler:/home/prowler:' > /etc/passwd && \
echo 'prowler:x:1000:' > /etc/group && \
chown -R prowler:prowler /home/prowler
USER prowler

WORKDIR /home/prowler

# Copy necessary files
WORKDIR /home/prowler
COPY prowler/ /home/prowler/prowler/
COPY dashboard/ /home/prowler/dashboard/
COPY pyproject.toml /home/prowler
COPY README.md /home/prowler/
COPY prowler/providers/m365/lib/powershell/m365_powershell.py /home/prowler/prowler/providers/m365/lib/powershell/m365_powershell.py

# Install Python dependencies
ENV HOME='/home/prowler'
@@ -53,9 +34,6 @@ RUN pip install --no-cache-dir --upgrade pip && \
RUN poetry install --compile && \
rm -rf ~/.cache/pip

# Install PowerShell modules
RUN poetry run python prowler/providers/m365/lib/powershell/m365_powershell.py

# Remove deprecated dash dependencies
RUN pip uninstall dash-html-components -y && \
pip uninstall dash-core-components -y
```
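The bookworm variant installs PowerShell and symlinks it to `/usr/bin/pwsh` for the M365 PowerShell bootstrap. One way to smoke-test the result after building (the image tag is hypothetical):

```bash
docker build -t prowler-sdk:test .
# Should print the installed PowerShell version, e.g. 7.5.0
docker run --rm --entrypoint pwsh prowler-sdk:test -Command '$PSVersionTable.PSVersion'
```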
**README.md** (157 changes)

````
@@ -3,7 +3,7 @@
<img align="center" src="https://github.com/prowler-cloud/prowler/blob/master/docs/img/prowler-logo-white.png#gh-dark-mode-only" width="50%" height="50%">
</p>
<p align="center">
<b><i>Prowler Open Source</b> is as dynamic and adaptable as the environment it secures. It is trusted by the industry leaders to uphold the highest standards in security.
<b><i>Prowler Open Source</b> is as dynamic and adaptable as the environment they’re meant to protect. Trusted by the leaders in security.
</p>
<p align="center">
<b>Learn more at <a href="https://prowler.com">prowler.com</i></b>
@@ -43,29 +43,15 @@

# Description

**Prowler** is an open-source security tool designed to assess and enforce security best practices across AWS, Azure, Google Cloud, and Kubernetes. It supports tasks such as security audits, incident response, continuous monitoring, system hardening, forensic readiness, and remediation processes.

Prowler includes hundreds of built-in controls to ensure compliance with standards and frameworks, including:

- **Industry Standards:** CIS, NIST 800, NIST CSF, and CISA
- **Regulatory Compliance and Governance:** RBI, FedRAMP, and PCI-DSS
- **Frameworks for Sensitive Data and Privacy:** GDPR, HIPAA, and FFIEC
- **Frameworks for Organizational Governance and Quality Control:** SOC2 and GXP
- **AWS-Specific Frameworks:** AWS Foundational Technical Review (FTR) and AWS Well-Architected Framework (Security Pillar)
- **National Security Standards:** ENS (Spanish National Security Scheme)
- **Custom Security Frameworks:** Tailored to your needs

## Prowler CLI and Prowler Cloud

Prowler offers a Command Line Interface (CLI), known as Prowler Open Source, and an additional service built on top of it, called <a href="https://prowler.com">Prowler Cloud</a>.
**Prowler** is an Open Source security tool to perform AWS, Azure, Google Cloud and Kubernetes security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness, and also remediations! We have Prowler CLI (Command Line Interface) that we call Prowler Open Source and a service on top of it that we call <a href="https://prowler.com">Prowler Cloud</a>.

## Prowler App

Prowler App is a web-based application that simplifies running Prowler across your cloud provider accounts. It provides a user-friendly interface to visualize the results and streamline your security assessments.
Prowler App is a web application that allows you to run Prowler in your cloud provider accounts and visualize the results in a user-friendly interface.

![Prowler App](docs/img/overview.png)

>For more details, refer to the [Prowler App Documentation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-app-installation)
>More details at [Prowler App Documentation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-app-installation)

## Prowler CLI

@@ -74,7 +60,6 @@ prowler <provider>
```
![Prowler CLI Execution](docs/img/short-display.png)

## Prowler Dashboard

```console
@@ -82,28 +67,26 @@ prowler dashboard
```
![Prowler Dashboard](docs/img/dashboard.png)

# Prowler at a Glance
It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, FedRAMP, PCI-DSS, GDPR, HIPAA, FFIEC, SOC2, GXP, AWS Well-Architected Framework Security Pillar, AWS Foundational Technical Review (FTR), ENS (Spanish National Security Scheme) and your custom security frameworks.

| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
|---|---|---|---|---|
| AWS | 564 | 82 | 34 | 10 |
| AWS | 564 | 82 | 33 | 10 |
| GCP | 79 | 13 | 7 | 3 |
| Azure | 140 | 18 | 8 | 3 |
| Kubernetes | 83 | 7 | 4 | 7 |
| GitHub | 3 | 2 | 1 | 0 |
| M365 | 44 | 2 | 2 | 0 |
| M365 | 5 | 2 | 1 | 0 |
| NHN (Unofficial) | 6 | 2 | 1 | 0 |

> Use the following commands to list Prowler's available checks, services, compliance frameworks, and categories: `prowler <provider> --list-checks`, `prowler <provider> --list-services`, `prowler <provider> --list-compliance` and `prowler <provider> --list-categories`.
> You can list the checks, services, compliance frameworks and categories with `prowler <provider> --list-checks`, `prowler <provider> --list-services`, `prowler <provider> --list-compliance` and `prowler <provider> --list-categories`.

# 💻 Installation

## Prowler App

Installing Prowler App
Prowler App offers flexible installation methods tailored to various environments:
Prowler App can be installed in different ways, depending on your environment:

> For detailed instructions on using Prowler App, refer to the [Prowler App Usage Guide](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/prowler-app/).
> See how to use Prowler App in the [Prowler App Usage Guide](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/prowler-app/).

### Docker Compose

@@ -119,16 +102,8 @@ curl -LO https://raw.githubusercontent.com/prowler-cloud/prowler/refs/heads/mast
docker compose up -d
```

> Containers are built for `linux/amd64`.

### Configuring Your Workstation for Prowler App

If your workstation's architecture is incompatible, you can resolve this by:

- **Setting the environment variable**: `DOCKER_DEFAULT_PLATFORM=linux/amd64`
- **Using the following flag in your Docker command**: `--platform linux/amd64`

> Once configured, access the Prowler App at http://localhost:3000. Sign up using your email and password to get started.
> Containers are built for `linux/amd64`. If your workstation's architecture is different, please set `DOCKER_DEFAULT_PLATFORM=linux/amd64` in your environment or use the `--platform linux/amd64` flag in the docker command.
> Enjoy Prowler App at http://localhost:3000 by signing up with your email and password.

### From GitHub

@@ -154,12 +129,12 @@ python manage.py migrate --database admin
gunicorn -c config/guniconf.py config.wsgi:application
```
> [!IMPORTANT]
> As of Poetry v2.0.0, the `poetry shell` command has been deprecated. Use `poetry env activate` instead for environment activation.
> Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
>
> If your Poetry version is below v2.0.0, continue using `poetry shell` to activate your environment.
> For further guidance, refer to the Poetry Environment Activation Guide https://python-poetry.org/docs/managing-environments/#activating-the-environment.
> If your poetry version is below 2.0.0 you must keep using `poetry shell` to activate your environment.
> In case you have any doubts, consult the Poetry environment activation guide: https://python-poetry.org/docs/managing-environments/#activating-the-environment

> After completing the setup, access the API documentation at http://localhost:8080/api/v1/docs.
> Now, you can access the API documentation at http://localhost:8080/api/v1/docs.

**Commands to run the API Worker**

@@ -197,31 +172,29 @@ npm run build
npm start
```

> Once configured, access the Prowler App at http://localhost:3000. Sign up using your email and password to get started.
> Enjoy Prowler App at http://localhost:3000 by signing up with your email and password.

## Prowler CLI
### Pip package
Prowler CLI is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/). Consequently, it can be installed using pip with Python >3.9.1, <3.13:
Prowler CLI is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/), thus can be installed using pip with Python > 3.9.1, < 3.13:

```console
pip install prowler
prowler -v
```
>For further guidance, refer to [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-cli-installation)
>More details at [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-cli-installation)

### Containers

**Available Versions of Prowler CLI**
The available versions of Prowler CLI are the following:

The following versions of Prowler CLI are available, depending on your requirements:

- `latest`: Synchronizes with the `master` branch. Note that this version is not stable.
- `v4-latest`: Synchronizes with the `v4` branch. Note that this version is not stable.
- `v3-latest`: Synchronizes with the `v3` branch. Note that this version is not stable.
- `<x.y.z>` (release): Stable releases corresponding to specific versions. You can find the complete list of releases [here](https://github.com/prowler-cloud/prowler/releases).
- `stable`: Always points to the latest release.
- `v4-stable`: Always points to the latest release for v4.
- `v3-stable`: Always points to the latest release for v3.
- `latest`: in sync with `master` branch (bear in mind that it is not a stable version)
- `v4-latest`: in sync with `v4` branch (bear in mind that it is not a stable version)
- `v3-latest`: in sync with `v3` branch (bear in mind that it is not a stable version)
- `<x.y.z>` (release): you can find the releases [here](https://github.com/prowler-cloud/prowler/releases), those are stable releases.
- `stable`: this tag always point to the latest release.
- `v4-stable`: this tag always point to the latest release for v4.
- `v3-stable`: this tag always point to the latest release for v3.

The container images are available here:
- Prowler CLI:
@@ -233,7 +206,7 @@ The container images are available here:

### From GitHub

Python >3.9.1, <3.13 is required with pip and Poetry:
Python > 3.9.1, < 3.13 is required with pip and poetry:

``` console
git clone https://github.com/prowler-cloud/prowler
@@ -243,46 +216,25 @@ poetry install
python prowler-cli.py -v
```
> [!IMPORTANT]
> To clone Prowler on Windows, configure Git to support long file paths by running the following command: `git config core.longpaths true`.

> [!IMPORTANT]
> As of Poetry v2.0.0, the `poetry shell` command has been deprecated. Use `poetry env activate` instead for environment activation.
> Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
>
> If your Poetry version is below v2.0.0, continue using `poetry shell` to activate your environment.
> For further guidance, refer to the Poetry Environment Activation Guide https://python-poetry.org/docs/managing-environments/#activating-the-environment.
> If your poetry version is below 2.0.0 you must keep using `poetry shell` to activate your environment.
> In case you have any doubts, consult the Poetry environment activation guide: https://python-poetry.org/docs/managing-environments/#activating-the-environment

# ✏️ High level architecture
> If you want to clone Prowler from Windows, use `git config core.longpaths true` to allow long file paths.
# 📐✏️ High level architecture

## Prowler App
**Prowler App** is composed of three key components:
The **Prowler App** consists of three main components:

- **Prowler UI**: A web-based interface, built with Next.js, providing a user-friendly experience for executing Prowler scans and visualizing results.
- **Prowler API**: A backend service, developed with Django REST Framework, responsible for running Prowler scans and storing the generated results.
- **Prowler SDK**: A Python SDK designed to extend the functionality of the Prowler CLI for advanced capabilities.
- **Prowler UI**: A user-friendly web interface for running Prowler and viewing results, powered by Next.js.
- **Prowler API**: The backend API that executes Prowler scans and stores the results, built with Django REST Framework.
- **Prowler SDK**: A Python SDK that integrates with the Prowler CLI for advanced functionality.

![Prowler App Architecture](docs/img/prowler-app-architecture.png)

## Prowler CLI

**Running Prowler**

Prowler can be executed across various environments, offering flexibility to meet your needs. It can be run from:

- Your own workstation

- A Kubernetes Job

- Google Compute Engine

- Azure Virtual Machines (VMs)

- Amazon EC2 instances

- AWS Fargate or other container platforms

- CloudShell

And many more environments.
You can run Prowler from your workstation, a Kubernetes Job, a Google Compute Engine, an Azure VM, an EC2 instance, Fargate or any other container, CloudShell and many more.

![Prowler CLI Architecture](docs/img/prowler-cli-architecture.png)

@@ -290,36 +242,23 @@ And many more environments.

## General
- `Allowlist` now is called `Mutelist`.
- The `--quiet` option has been deprecated. Use the `--status` flag to filter findings based on their status: PASS, FAIL, or MANUAL.
- All findings with an `INFO` status have been reclassified as `MANUAL`.
- The CSV output format is standardized across all providers.
- The `--quiet` option has been deprecated, now use the `--status` flag to select the finding's status you want to get from PASS, FAIL or MANUAL.
- All `INFO` finding's status has changed to `MANUAL`.
- The CSV output format is common for all the providers.

**Deprecated Output Formats**

The following formats are now deprecated:
- Native JSON has been replaced with JSON in [OCSF] v1.1.0 format, which is standardized across all providers (https://schema.ocsf.io/).
We have deprecated some of our outputs formats:
- The native JSON is replaced for the JSON [OCSF](https://schema.ocsf.io/) v1.1.0, common for all the providers.

## AWS

**AWS Flag Deprecation**

The flag --sts-endpoint-region has been deprecated due to the adoption of AWS STS regional tokens.

**Sending FAIL Results to AWS Security Hub**

- To send only FAILS to AWS Security Hub, use one of the following options: `--send-sh-only-fails` or `--security-hub --status FAIL`.
- Deprecate the AWS flag --sts-endpoint-region since we use AWS STS regional tokens.
- To send only FAILS to AWS Security Hub, now use either `--send-sh-only-fails` or `--security-hub --status FAIL`.

# 📖 Documentation

**Documentation Resources**

For installation instructions, usage details, tutorials, and the Developer Guide, visit https://docs.prowler.com/
Install, Usage, Tutorials and Developer Guide is at https://docs.prowler.com/

# 📃 License

**Prowler License Information**

Prowler is licensed under the Apache License 2.0, as indicated in each file within the repository. Obtaining a Copy of the License

A copy of the License is available at <http://www.apache.org/licenses/LICENSE-2.0>
Prowler is licensed as Apache License 2.0 as specified in each file. You may obtain a copy of the License at
<http://www.apache.org/licenses/LICENSE-2.0>
````
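The README's platform note boils down to two equivalent workarounds on non-amd64 hosts; as a sketch:

```bash
# Option 1: set the default platform for every docker command in this shell
DOCKER_DEFAULT_PLATFORM=linux/amd64 docker compose up -d
# Option 2: pass the flag per command (image name left as a placeholder)
docker run --rm --platform linux/amd64 <image>
```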
|
||||
@@ -80,7 +80,7 @@ repos:
|
||||
- id: safety
|
||||
name: safety
|
||||
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
|
||||
entry: bash -c 'poetry run safety check --ignore 70612,66963,74429'
|
||||
entry: bash -c 'poetry run safety check --ignore 70612,66963'
|
||||
language: system
|
||||
|
||||
- id: vulture
|
||||
|
||||
@@ -2,31 +2,12 @@
|
||||
|
||||
All notable changes to the **Prowler API** are documented in this file.
|
||||
|
||||
## [v1.8.1] (Prowler v5.7.1)
|
||||
|
||||
### Fixed
|
||||
- Added database index to improve performance on finding lookup. [(#7800)](https://github.com/prowler-cloud/prowler/pull/7800)
|
||||
|
||||
---
|
||||
|
||||
## [v1.8.0] (Prowler v5.7.0)
|
||||
|
||||
### Added
|
||||
- Added huge improvements to `/findings/metadata` and resource related filters for findings [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690).
|
||||
- Added improvements to `/overviews` endpoints [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690).
|
||||
- Added new queue to perform backfill background tasks [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690).
|
||||
- Added new endpoints to retrieve latest findings and metadata [(#7743)](https://github.com/prowler-cloud/prowler/pull/7743).
|
||||
- Added export support for Prowler ThreatScore in M365 [(7783)](https://github.com/prowler-cloud/prowler/pull/7783)
|
||||
|
||||
---
|
||||
|
||||
## [v1.7.0] (Prowler v5.6.0)
|
||||
## [v1.7.0] (UNRELEASED)
|
||||
|
||||
### Added
|
||||
|
||||
- Added M365 as a new provider [(#7563)](https://github.com/prowler-cloud/prowler/pull/7563).
|
||||
- Added a `compliance/` folder and ZIP‐export functionality for all compliance reports.[(#7653)](https://github.com/prowler-cloud/prowler/pull/7653).
|
||||
- Added a new API endpoint to fetch and download any specific compliance file by name [(#7653)](https://github.com/prowler-cloud/prowler/pull/7653).
|
||||
|
||||
---
|
||||
|
||||
@@ -72,10 +53,6 @@ All notable changes to the **Prowler API** are documented in this file.
|
||||
- Handled exception when a provider has no secret in test connection [(#7283)](https://github.com/prowler-cloud/prowler/pull/7283).
|
||||
|
||||
|
||||
### Added
|
||||
|
||||
- Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167).
|
||||
|
||||
---
|
||||
|
||||
## [v1.5.0] (Prowler v5.4.0)
|
||||
|
||||
@@ -1,20 +1,21 @@
FROM python:3.12.10-slim-bookworm AS build
FROM python:3.12-slim-bookworm AS build

LABEL maintainer="https://github.com/prowler-cloud/api"

ARG POWERSHELL_VERSION=7.5.0
ENV POWERSHELL_VERSION=${POWERSHELL_VERSION}

# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends wget libicu72 \
RUN apt-get update && apt-get install -y --no-install-recommends \
    gcc python3-dev bash \
    libcurl4-openssl-dev ca-certificates less ncurses-base \
    krb5-multidev libgcc-s1 gettext libssl3 \
    libstdc++6 tzdata liburcu8 zlib1g libicu72 curl openssh-client \
    && rm -rf /var/lib/apt/lists/*

# Install PowerShell
RUN ARCH=$(uname -m) && \
    if [ "$ARCH" = "x86_64" ]; then \
        wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-x64.tar.gz -O /tmp/powershell.tar.gz ; \
        curl -L https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-x64.tar.gz -o /tmp/powershell.tar.gz ; \
    elif [ "$ARCH" = "aarch64" ]; then \
        wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-arm64.tar.gz -O /tmp/powershell.tar.gz ; \
        curl -L https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-arm64.tar.gz -o /tmp/powershell.tar.gz ; \
    else \
        echo "Unsupported architecture: $ARCH" && exit 1 ; \
    fi && \
@@ -47,13 +48,19 @@ RUN poetry install --no-root && \

COPY docker-entrypoint.sh ./docker-entrypoint.sh

RUN poetry run python "$(poetry env info --path)/src/prowler/prowler/providers/m365/lib/powershell/m365_powershell.py"

WORKDIR /home/prowler/backend

# Development image
FROM build AS dev

USER root
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends \
    curl vim \
    && rm -rf /var/lib/apt/lists/*

USER prowler

ENTRYPOINT ["../docker-entrypoint.sh", "dev"]

# Production image

@@ -235,7 +235,6 @@ To view the logs for any component (e.g., Django, Celery worker), you can use th

```console
docker logs -f $(docker ps --format "{{.Names}}" | grep 'api-')
```

## Applying migrations

@@ -28,7 +28,7 @@ start_prod_server() {

start_worker() {
    echo "Starting the worker..."
    poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion,backfill -E --max-tasks-per-child 1
    poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion -E --max-tasks-per-child 1
}

start_worker_beat() {
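One side of this hunk adds a `backfill` queue to the worker's `-Q` list, matching the changelog entry about backfill background tasks. A minimal sketch of how a task might be routed to that queue; the task name and body are hypothetical, only the queue name comes from the diff:

```python
from celery import shared_task


@shared_task(queue="backfill")  # routes to the queue listed in the worker's -Q option
def backfill_finding_filter_values(tenant_id: str, scan_id: str) -> None:
    """Hypothetical task: recompute denormalized filter values for an older scan."""
    ...
```
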
api/poetry.lock (generated, 89 changes)
@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 2.1.3 and should not be changed by hand.
# This file is automatically @generated by Poetry 2.1.1 and should not be changed by hand.

[[package]]
name = "about-time"
@@ -880,6 +880,7 @@ description = "Foreign Function Interface for Python calling C code."
optional = false
python-versions = ">=3.8"
groups = ["main", "dev"]
markers = "platform_python_implementation != \"PyPy\""
files = [
    {file = "cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14"},
    {file = "cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67"},
@@ -949,7 +950,6 @@ files = [
    {file = "cffi-1.17.1-cp39-cp39-win_amd64.whl", hash = "sha256:d016c76bdd850f3c626af19b0542c9677ba156e4ee4fccfdd7848803533ef662"},
    {file = "cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824"},
]
markers = {dev = "platform_python_implementation != \"PyPy\""}

[package.dependencies]
pycparser = "*"
@@ -1469,14 +1469,14 @@ with-social = ["django-allauth[socialaccount] (>=64.0.0)"]

[[package]]
name = "django"
version = "5.1.8"
version = "5.1.7"
description = "A high-level Python web framework that encourages rapid development and clean, pragmatic design."
optional = false
python-versions = ">=3.10"
groups = ["main", "dev"]
files = [
    {file = "Django-5.1.8-py3-none-any.whl", hash = "sha256:11b28fa4b00e59d0def004e9ee012fefbb1065a5beb39ee838983fd24493ad4f"},
    {file = "Django-5.1.8.tar.gz", hash = "sha256:42e92a1dd2810072bcc40a39a212b693f94406d0ba0749e68eb642f31dc770b4"},
    {file = "Django-5.1.7-py3-none-any.whl", hash = "sha256:1323617cb624add820cb9611cdcc788312d250824f92ca6048fda8625514af2b"},
    {file = "Django-5.1.7.tar.gz", hash = "sha256:30de4ee43a98e5d3da36a9002f287ff400b43ca51791920bfb35f6917bfe041c"},
]

[package.dependencies]
@@ -2237,14 +2237,14 @@ tornado = ["tornado (>=0.2)"]

[[package]]
name = "h11"
version = "0.16.0"
version = "0.14.0"
description = "A pure-Python, bring-your-own-I/O implementation of HTTP/1.1"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.7"
groups = ["main"]
files = [
    {file = "h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86"},
    {file = "h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1"},
    {file = "h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761"},
    {file = "h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d"},
]

[[package]]
@@ -2277,19 +2277,19 @@ files = [

[[package]]
name = "httpcore"
version = "1.0.9"
version = "1.0.8"
description = "A minimal low-level HTTP client."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
    {file = "httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55"},
    {file = "httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8"},
    {file = "httpcore-1.0.8-py3-none-any.whl", hash = "sha256:5254cf149bcb5f75e9d1b2b9f729ea4a4b883d1ad7379fc632b727cec23674be"},
    {file = "httpcore-1.0.8.tar.gz", hash = "sha256:86e94505ed24ea06514883fd44d2bc02d90e77e7979c8eb71b90f41d364a1bad"},
]

[package.dependencies]
certifi = "*"
h11 = ">=0.16"
h11 = ">=0.13,<0.15"

[package.extras]
asyncio = ["anyio (>=4.0,<5.0)"]
@@ -3597,7 +3597,7 @@ files = [

[[package]]
name = "prowler"
version = "5.7.0"
version = "5.6.0"
description = "Prowler is an Open Source security tool to perform AWS, GCP and Azure security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness. It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, FedRAMP, PCI-DSS, GDPR, HIPAA, FFIEC, SOC2, GXP, AWS Well-Architected Framework Security Pillar, AWS Foundational Technical Review (FTR), ENS (Spanish National Security Scheme) and your custom security frameworks."
optional = false
python-versions = ">3.9.1,<3.13"
@@ -3645,7 +3645,6 @@ numpy = "2.0.2"
pandas = "2.2.3"
py-ocsf-models = "0.3.1"
pydantic = "1.10.21"
pygithub = "2.5.0"
python-dateutil = ">=2.9.0.post0,<3.0.0"
pytz = "2025.1"
schema = "0.7.7"
@@ -3657,8 +3656,8 @@ tzlocal = "5.3.1"
[package.source]
type = "git"
url = "https://github.com/prowler-cloud/prowler.git"
reference = "v5.7"
resolved_reference = "a3b606fc7124ce94f27ed2fd2ba8ad8f734a69d1"
reference = "master"
resolved_reference = "0edf19928269b6d42d1c463ab13b90d7c21a65e4"

[[package]]
name = "psutil"
@@ -3835,11 +3834,11 @@ description = "C parser in Python"
optional = false
python-versions = ">=3.8"
groups = ["main", "dev"]
markers = "platform_python_implementation != \"PyPy\""
files = [
    {file = "pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc"},
    {file = "pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6"},
]
markers = {dev = "platform_python_implementation != \"PyPy\""}

[[package]]
name = "pycurl"
@@ -3950,26 +3949,6 @@ typing-extensions = ">=4.2.0"
dotenv = ["python-dotenv (>=0.10.4)"]
email = ["email-validator (>=1.0.3)"]

[[package]]
name = "pygithub"
version = "2.5.0"
description = "Use the full Github API v3"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
    {file = "PyGithub-2.5.0-py3-none-any.whl", hash = "sha256:b0b635999a658ab8e08720bdd3318893ff20e2275f6446fcf35bf3f44f2c0fd2"},
    {file = "pygithub-2.5.0.tar.gz", hash = "sha256:e1613ac508a9be710920d26eb18b1905ebd9926aa49398e88151c1b526aad3cf"},
]

[package.dependencies]
Deprecated = "*"
pyjwt = {version = ">=2.4.0", extras = ["crypto"]}
pynacl = ">=1.4.0"
requests = ">=2.14.0"
typing-extensions = ">=4.0.0"
urllib3 = ">=1.26.0"

[[package]]
name = "pygments"
version = "2.19.1"
@@ -4034,33 +4013,6 @@ tomlkit = ">=0.10.1"
spelling = ["pyenchant (>=3.2,<4.0)"]
testutils = ["gitpython (>3)"]

[[package]]
name = "pynacl"
version = "1.5.0"
description = "Python binding to the Networking and Cryptography (NaCl) library"
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
    {file = "PyNaCl-1.5.0-cp36-abi3-macosx_10_10_universal2.whl", hash = "sha256:401002a4aaa07c9414132aaed7f6836ff98f59277a234704ff66878c2ee4a0d1"},
    {file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:52cb72a79269189d4e0dc537556f4740f7f0a9ec41c1322598799b0bdad4ef92"},
    {file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a36d4a9dda1f19ce6e03c9a784a2921a4b726b02e1c736600ca9c22029474394"},
    {file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:0c84947a22519e013607c9be43706dd42513f9e6ae5d39d3613ca1e142fba44d"},
    {file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:06b8f6fa7f5de8d5d2f7573fe8c863c051225a27b61e6860fd047b1775807858"},
    {file = "PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:a422368fc821589c228f4c49438a368831cb5bbc0eab5ebe1d7fac9dded6567b"},
    {file = "PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:61f642bf2378713e2c2e1de73444a3778e5f0a38be6fee0fe532fe30060282ff"},
    {file = "PyNaCl-1.5.0-cp36-abi3-win32.whl", hash = "sha256:e46dae94e34b085175f8abb3b0aaa7da40767865ac82c928eeb9e57e1ea8a543"},
    {file = "PyNaCl-1.5.0-cp36-abi3-win_amd64.whl", hash = "sha256:20f42270d27e1b6a29f54032090b972d97f0a1b0948cc52392041ef7831fee93"},
    {file = "PyNaCl-1.5.0.tar.gz", hash = "sha256:8ac7448f09ab85811607bdd21ec2464495ac8b7c66d146bf545b0f08fb9220ba"},
]

[package.dependencies]
cffi = ">=1.4.1"

[package.extras]
docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"]
tests = ["hypothesis (>=3.27.0)", "pytest (>=3.2.1,!=3.3.0)"]

[[package]]
name = "pyparsing"
version = "3.2.3"
@@ -4685,7 +4637,6 @@ files = [
    {file = "ruamel.yaml.clib-0.2.12-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f66efbc1caa63c088dead1c4170d148eabc9b80d95fb75b6c92ac0aad2437d76"},
    {file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:22353049ba4181685023b25b5b51a574bce33e7f51c759371a7422dcae5402a6"},
    {file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:932205970b9f9991b34f55136be327501903f7c66830e9760a8ffb15b07f05cd"},
    {file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:a52d48f4e7bf9005e8f0a89209bf9a73f7190ddf0489eee5eb51377385f59f2a"},
    {file = "ruamel.yaml.clib-0.2.12-cp310-cp310-win32.whl", hash = "sha256:3eac5a91891ceb88138c113f9db04f3cebdae277f5d44eaa3651a4f573e6a5da"},
    {file = "ruamel.yaml.clib-0.2.12-cp310-cp310-win_amd64.whl", hash = "sha256:ab007f2f5a87bd08ab1499bdf96f3d5c6ad4dcfa364884cb4549aa0154b13a28"},
    {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-macosx_13_0_arm64.whl", hash = "sha256:4a6679521a58256a90b0d89e03992c15144c5f3858f40d7c18886023d7943db6"},
@@ -4694,7 +4645,6 @@ files = [
    {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:811ea1594b8a0fb466172c384267a4e5e367298af6b228931f273b111f17ef52"},
    {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:cf12567a7b565cbf65d438dec6cfbe2917d3c1bdddfce84a9930b7d35ea59642"},
    {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:7dd5adc8b930b12c8fc5b99e2d535a09889941aa0d0bd06f4749e9a9397c71d2"},
    {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1492a6051dab8d912fc2adeef0e8c72216b24d57bd896ea607cb90bb0c4981d3"},
    {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-win32.whl", hash = "sha256:bd0a08f0bab19093c54e18a14a10b4322e1eacc5217056f3c063bd2f59853ce4"},
    {file = "ruamel.yaml.clib-0.2.12-cp311-cp311-win_amd64.whl", hash = "sha256:a274fb2cb086c7a3dea4322ec27f4cb5cc4b6298adb583ab0e211a4682f241eb"},
    {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:20b0f8dc160ba83b6dcc0e256846e1a02d044e13f7ea74a3d1d56ede4e48c632"},
@@ -4703,7 +4653,6 @@ files = [
    {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:749c16fcc4a2b09f28843cda5a193e0283e47454b63ec4b81eaa2242f50e4ccd"},
    {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:bf165fef1f223beae7333275156ab2022cffe255dcc51c27f066b4370da81e31"},
    {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:32621c177bbf782ca5a18ba4d7af0f1082a3f6e517ac2a18b3974d4edf349680"},
    {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:b82a7c94a498853aa0b272fd5bc67f29008da798d4f93a2f9f289feb8426a58d"},
    {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-win32.whl", hash = "sha256:e8c4ebfcfd57177b572e2040777b8abc537cdef58a2120e830124946aa9b42c5"},
    {file = "ruamel.yaml.clib-0.2.12-cp312-cp312-win_amd64.whl", hash = "sha256:0467c5965282c62203273b838ae77c0d29d7638c8a4e3a1c8bdd3602c10904e4"},
    {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:4c8c5d82f50bb53986a5e02d1b3092b03622c02c2eb78e29bec33fd9593bae1a"},
@@ -4712,7 +4661,6 @@ files = [
    {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:96777d473c05ee3e5e3c3e999f5d23c6f4ec5b0c38c098b3a5229085f74236c6"},
    {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_1_i686.whl", hash = "sha256:3bc2a80e6420ca8b7d3590791e2dfc709c88ab9152c00eeb511c9875ce5778bf"},
    {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:e188d2699864c11c36cdfdada94d781fd5d6b0071cd9c427bceb08ad3d7c70e1"},
    {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4f6f3eac23941b32afccc23081e1f50612bdbe4e982012ef4f5797986828cd01"},
    {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-win32.whl", hash = "sha256:6442cb36270b3afb1b4951f060eccca1ce49f3d087ca1ca4563a6eb479cb3de6"},
    {file = "ruamel.yaml.clib-0.2.12-cp313-cp313-win_amd64.whl", hash = "sha256:e5b8daf27af0b90da7bb903a876477a9e6d7270be6146906b276605997c7e9a3"},
    {file = "ruamel.yaml.clib-0.2.12-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:fc4b630cd3fa2cf7fce38afa91d7cfe844a9f75d7f0f36393fa98815e911d987"},
@@ -4721,7 +4669,6 @@ files = [
    {file = "ruamel.yaml.clib-0.2.12-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e2f1c3765db32be59d18ab3953f43ab62a761327aafc1594a2a1fbe038b8b8a7"},
    {file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:d85252669dc32f98ebcd5d36768f5d4faeaeaa2d655ac0473be490ecdae3c285"},
    {file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:e143ada795c341b56de9418c58d028989093ee611aa27ffb9b7f609c00d813ed"},
    {file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:2c59aa6170b990d8d2719323e628aaf36f3bfbc1c26279c0eeeb24d05d2d11c7"},
    {file = "ruamel.yaml.clib-0.2.12-cp39-cp39-win32.whl", hash = "sha256:beffaed67936fbbeffd10966a4eb53c402fafd3d6833770516bf7314bc6ffa12"},
    {file = "ruamel.yaml.clib-0.2.12-cp39-cp39-win_amd64.whl", hash = "sha256:040ae85536960525ea62868b642bdb0c2cc6021c9f9d507810c0c604e66f5a7b"},
    {file = "ruamel.yaml.clib-0.2.12.tar.gz", hash = "sha256:6c8fbb13ec503f99a91901ab46e0b07ae7941cd527393187039aec586fdfd36f"},
@@ -5536,4 +5483,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.1"
python-versions = ">=3.11,<3.13"
content-hash = "db1beb68c9757678759b79a515ff19a21b1201502c1e7c24f579ccc47aef8644"
content-hash = "f3ede96fb76c14d02da37c1132b41a14fa4045787807446fac2cd1750be193e0"

@@ -7,7 +7,7 @@ authors = [{name = "Prowler Engineering", email = "engineering@prowler.com"}]
dependencies = [
    "celery[pytest] (>=5.4.0,<6.0.0)",
    "dj-rest-auth[with_social,jwt] (==7.0.1)",
    "django==5.1.8",
    "django==5.1.7",
    "django-allauth==65.4.1",
    "django-celery-beat (>=2.7.0,<3.0.0)",
    "django-celery-results (>=2.5.1,<3.0.0)",
@@ -23,7 +23,7 @@ dependencies = [
    "drf-spectacular==0.27.2",
    "drf-spectacular-jsonapi==0.5.1",
    "gunicorn==23.0.0",
    "prowler @ git+https://github.com/prowler-cloud/prowler.git@v5.7",
    "prowler @ git+https://github.com/prowler-cloud/prowler.git@master",
    "psycopg2-binary==2.9.9",
    "pytest-celery[redis] (>=1.0.1,<2.0.0)",
    "sentry-sdk[django] (>=2.20.0,<3.0.0)",
@@ -35,7 +35,7 @@ name = "prowler-api"
package-mode = false
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
version = "1.8.1"
version = "1.6.0"

[project.scripts]
celery = "src.backend.config.settings.celery"

@@ -1,38 +1,12 @@
from types import MappingProxyType

from api.models import Provider
from prowler.config.config import get_available_compliance_frameworks
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.check.models import CheckMetadata

from api.models import Provider

PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE = {}
PROWLER_CHECKS = {}
AVAILABLE_COMPLIANCE_FRAMEWORKS = {}


def get_compliance_frameworks(provider_type: Provider.ProviderChoices) -> list[str]:
    """
    Retrieve and cache the list of available compliance frameworks for a specific cloud provider.

    This function lazily loads and caches the available compliance frameworks (e.g., CIS, MITRE, ISO)
    for each provider type (AWS, Azure, GCP, etc.) on first access. Subsequent calls for the same
    provider will return the cached result.

    Args:
        provider_type (Provider.ProviderChoices): The cloud provider type for which to retrieve
            available compliance frameworks (e.g., "aws", "azure", "gcp", "m365").

    Returns:
        list[str]: A list of framework identifiers (e.g., "cis_1.4_aws", "mitre_attack_azure") available
            for the given provider.
    """
    global AVAILABLE_COMPLIANCE_FRAMEWORKS
    if provider_type not in AVAILABLE_COMPLIANCE_FRAMEWORKS:
        AVAILABLE_COMPLIANCE_FRAMEWORKS[provider_type] = (
            get_available_compliance_frameworks(provider_type)
        )

    return AVAILABLE_COMPLIANCE_FRAMEWORKS[provider_type]

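A short usage sketch of the lazy cache above, assuming provider choice values are plain strings such as `"m365"`: the first call loads and stores the list, repeated calls return the same cached object.

```python
frameworks = get_compliance_frameworks("m365")        # loads via the Prowler SDK, then caches
frameworks_again = get_compliance_frameworks("m365")  # served from AVAILABLE_COMPLIANCE_FRAMEWORKS
assert frameworks is frameworks_again
```
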

def get_prowler_provider_checks(provider_type: Provider.ProviderChoices):

@@ -227,77 +227,6 @@ def register_enum(apps, schema_editor, enum_class):  # noqa: F841
    register_adapter(enum_class, enum_adapter)


def create_index_on_partitions(
    apps,  # noqa: F841
    schema_editor,
    parent_table: str,
    index_name: str,
    columns: str,
    method: str = "BTREE",
    where: str = "",
):
    """
    Create an index on every existing partition of `parent_table`.

    Args:
        parent_table: The name of the root table (e.g. "findings").
        index_name: A short name for the index (will be prefixed per-partition).
        columns: The parenthesized column list, e.g. "tenant_id, scan_id, status".
        method: The index method—BTREE, GIN, etc. Defaults to BTREE.
        where: Optional WHERE clause (without the leading "WHERE"), e.g. "status = 'FAIL'".
    """
    with connection.cursor() as cursor:
        cursor.execute(
            """
            SELECT inhrelid::regclass::text
            FROM pg_inherits
            WHERE inhparent = %s::regclass
            """,
            [parent_table],
        )
        partitions = [row[0] for row in cursor.fetchall()]

    where_sql = f" WHERE {where}" if where else ""
    for partition in partitions:
        idx_name = f"{partition.replace('.', '_')}_{index_name}"
        sql = (
            f"CREATE INDEX CONCURRENTLY IF NOT EXISTS {idx_name} "
            f"ON {partition} USING {method} ({columns})"
            f"{where_sql};"
        )
        schema_editor.execute(sql)


def drop_index_on_partitions(
    apps,  # noqa: F841
    schema_editor,
    parent_table: str,
    index_name: str,
):
    """
    Drop the per-partition indexes that were created by create_index_on_partitions.

    Args:
        parent_table: The name of the root table (e.g. "findings").
        index_name: The same short name used when creating them.
    """
    with connection.cursor() as cursor:
        cursor.execute(
            """
            SELECT inhrelid::regclass::text
            FROM pg_inherits
            WHERE inhparent = %s::regclass
            """,
            [parent_table],
        )
        partitions = [row[0] for row in cursor.fetchall()]

    for partition in partitions:
        idx_name = f"{partition.replace('.', '_')}_{index_name}"
        sql = f"DROP INDEX CONCURRENTLY IF EXISTS {idx_name};"
        schema_editor.execute(sql)
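The migrations later in this diff wire these helpers through `functools.partial` so that `RunPython` receives the `(apps, schema_editor)` signature it expects. A sketch with an illustrative index; the index name, column, and WHERE clause here are examples, not taken from the diff:

```python
from functools import partial

from django.db import migrations

from api.db_utils import create_index_on_partitions, drop_index_on_partitions

operations = [
    migrations.RunPython(
        partial(
            create_index_on_partitions,
            parent_table="findings",
            index_name="status_fail_idx",  # illustrative name
            columns="status",
            where="status = 'FAIL'",       # demonstrates the optional WHERE clause
        ),
        reverse_code=partial(
            drop_index_on_partitions,
            parent_table="findings",
            index_name="status_fail_idx",
        ),
    ),
]
```
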

# Postgres enum definition for member role

@@ -81,114 +81,6 @@ class ChoiceInFilter(BaseInFilter, ChoiceFilter):
    pass


class CommonFindingFilters(FilterSet):
    # We filter providers from the scan in findings
    provider = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
    provider__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
    provider_type = ChoiceFilter(
        choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
    )
    provider_type__in = ChoiceInFilter(
        choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
    )
    provider_uid = CharFilter(field_name="scan__provider__uid", lookup_expr="exact")
    provider_uid__in = CharInFilter(field_name="scan__provider__uid", lookup_expr="in")
    provider_uid__icontains = CharFilter(
        field_name="scan__provider__uid", lookup_expr="icontains"
    )
    provider_alias = CharFilter(field_name="scan__provider__alias", lookup_expr="exact")
    provider_alias__in = CharInFilter(
        field_name="scan__provider__alias", lookup_expr="in"
    )
    provider_alias__icontains = CharFilter(
        field_name="scan__provider__alias", lookup_expr="icontains"
    )

    updated_at = DateFilter(field_name="updated_at", lookup_expr="date")

    uid = CharFilter(field_name="uid")
    delta = ChoiceFilter(choices=Finding.DeltaChoices.choices)
    status = ChoiceFilter(choices=StatusChoices.choices)
    severity = ChoiceFilter(choices=SeverityChoices)
    impact = ChoiceFilter(choices=SeverityChoices)
    muted = BooleanFilter(
        help_text="If this filter is not provided, muted and non-muted findings will be returned."
    )

    resources = UUIDInFilter(field_name="resource__id", lookup_expr="in")

    region = CharFilter(method="filter_resource_region")
    region__in = CharInFilter(field_name="resource_regions", lookup_expr="overlap")
    region__icontains = CharFilter(
        field_name="resource_regions", lookup_expr="icontains"
    )

    service = CharFilter(method="filter_resource_service")
    service__in = CharInFilter(field_name="resource_services", lookup_expr="overlap")
    service__icontains = CharFilter(
        field_name="resource_services", lookup_expr="icontains"
    )

    resource_uid = CharFilter(field_name="resources__uid")
    resource_uid__in = CharInFilter(field_name="resources__uid", lookup_expr="in")
    resource_uid__icontains = CharFilter(
        field_name="resources__uid", lookup_expr="icontains"
    )

    resource_name = CharFilter(field_name="resources__name")
    resource_name__in = CharInFilter(field_name="resources__name", lookup_expr="in")
    resource_name__icontains = CharFilter(
        field_name="resources__name", lookup_expr="icontains"
    )

    resource_type = CharFilter(method="filter_resource_type")
    resource_type__in = CharInFilter(field_name="resource_types", lookup_expr="overlap")
    resource_type__icontains = CharFilter(
        field_name="resources__type", lookup_expr="icontains"
    )

    # Temporarily disabled until we implement tag filtering in the UI
    # resource_tag_key = CharFilter(field_name="resources__tags__key")
    # resource_tag_key__in = CharInFilter(
    #     field_name="resources__tags__key", lookup_expr="in"
    # )
    # resource_tag_key__icontains = CharFilter(
    #     field_name="resources__tags__key", lookup_expr="icontains"
    # )
    # resource_tag_value = CharFilter(field_name="resources__tags__value")
    # resource_tag_value__in = CharInFilter(
    #     field_name="resources__tags__value", lookup_expr="in"
    # )
    # resource_tag_value__icontains = CharFilter(
    #     field_name="resources__tags__value", lookup_expr="icontains"
    # )
    # resource_tags = CharInFilter(
    #     method="filter_resource_tag",
    #     lookup_expr="in",
    #     help_text="Filter by resource tags `key:value` pairs.\nMultiple values may be "
    #     "separated by commas.",
    # )

    def filter_resource_service(self, queryset, name, value):
        return queryset.filter(resource_services__contains=[value])

    def filter_resource_region(self, queryset, name, value):
        return queryset.filter(resource_regions__contains=[value])

    def filter_resource_type(self, queryset, name, value):
        return queryset.filter(resource_types__contains=[value])

    def filter_resource_tag(self, queryset, name, value):
        overall_query = Q()
        for key_value_pair in value:
            tag_key, tag_value = key_value_pair.split(":", 1)
            overall_query |= Q(
                resources__tags__key__icontains=tag_key,
                resources__tags__value__icontains=tag_value,
            )
        return queryset.filter(overall_query).distinct()
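An illustrative request exercising the filters above; the filter names follow the OpenAPI excerpt at the end of this diff, while the host and token are placeholders:

```python
import requests

requests.get(
    "https://prowler-api.example.com/api/v1/findings",  # placeholder host
    headers={"Authorization": "Bearer <token>"},        # placeholder token
    params={
        "filter[region__in]": "eu-west-1,us-east-1",    # overlap lookup on resource_regions
        "filter[service]": "s3",                        # filter_resource_service -> array contains
        "filter[resource_type__icontains]": "bucket",
    },
)
```
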

class TenantFilter(FilterSet):
    inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
    updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
@@ -365,7 +257,94 @@ class ResourceFilter(ProviderRelationshipFilterSet):
        return queryset.filter(tags__text_search=value)


class FindingFilter(CommonFindingFilters):
class FindingFilter(FilterSet):
    # We filter providers from the scan in findings
    provider = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
    provider__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
    provider_type = ChoiceFilter(
        choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
    )
    provider_type__in = ChoiceInFilter(
        choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
    )
    provider_uid = CharFilter(field_name="scan__provider__uid", lookup_expr="exact")
    provider_uid__in = CharInFilter(field_name="scan__provider__uid", lookup_expr="in")
    provider_uid__icontains = CharFilter(
        field_name="scan__provider__uid", lookup_expr="icontains"
    )
    provider_alias = CharFilter(field_name="scan__provider__alias", lookup_expr="exact")
    provider_alias__in = CharInFilter(
        field_name="scan__provider__alias", lookup_expr="in"
    )
    provider_alias__icontains = CharFilter(
        field_name="scan__provider__alias", lookup_expr="icontains"
    )

    updated_at = DateFilter(field_name="updated_at", lookup_expr="date")

    uid = CharFilter(field_name="uid")
    delta = ChoiceFilter(choices=Finding.DeltaChoices.choices)
    status = ChoiceFilter(choices=StatusChoices.choices)
    severity = ChoiceFilter(choices=SeverityChoices)
    impact = ChoiceFilter(choices=SeverityChoices)
    muted = BooleanFilter(
        help_text="If this filter is not provided, muted and non-muted findings will be returned."
    )

    resources = UUIDInFilter(field_name="resource__id", lookup_expr="in")

    region = CharFilter(field_name="resources__region")
    region__in = CharInFilter(field_name="resources__region", lookup_expr="in")
    region__icontains = CharFilter(
        field_name="resources__region", lookup_expr="icontains"
    )

    service = CharFilter(field_name="resources__service")
    service__in = CharInFilter(field_name="resources__service", lookup_expr="in")
    service__icontains = CharFilter(
        field_name="resources__service", lookup_expr="icontains"
    )

    resource_uid = CharFilter(field_name="resources__uid")
    resource_uid__in = CharInFilter(field_name="resources__uid", lookup_expr="in")
    resource_uid__icontains = CharFilter(
        field_name="resources__uid", lookup_expr="icontains"
    )

    resource_name = CharFilter(field_name="resources__name")
    resource_name__in = CharInFilter(field_name="resources__name", lookup_expr="in")
    resource_name__icontains = CharFilter(
        field_name="resources__name", lookup_expr="icontains"
    )

    resource_type = CharFilter(field_name="resources__type")
    resource_type__in = CharInFilter(field_name="resources__type", lookup_expr="in")
    resource_type__icontains = CharFilter(
        field_name="resources__type", lookup_expr="icontains"
    )

    # Temporarily disabled until we implement tag filtering in the UI
    # resource_tag_key = CharFilter(field_name="resources__tags__key")
    # resource_tag_key__in = CharInFilter(
    #     field_name="resources__tags__key", lookup_expr="in"
    # )
    # resource_tag_key__icontains = CharFilter(
    #     field_name="resources__tags__key", lookup_expr="icontains"
    # )
    # resource_tag_value = CharFilter(field_name="resources__tags__value")
    # resource_tag_value__in = CharInFilter(
    #     field_name="resources__tags__value", lookup_expr="in"
    # )
    # resource_tag_value__icontains = CharFilter(
    #     field_name="resources__tags__value", lookup_expr="icontains"
    # )
    # resource_tags = CharInFilter(
    #     method="filter_resource_tag",
    #     lookup_expr="in",
    #     help_text="Filter by resource tags `key:value` pairs.\nMultiple values may be "
    #     "separated by commas.",
    # )

    scan = UUIDFilter(method="filter_scan_id")
    scan__in = UUIDInFilter(method="filter_scan_id_in")

@@ -406,15 +385,6 @@ class FindingFilter(CommonFindingFilters):
            },
        }

    def filter_resource_type(self, queryset, name, value):
        return queryset.filter(resource_types__contains=[value])

    def filter_resource_region(self, queryset, name, value):
        return queryset.filter(resource_regions__contains=[value])

    def filter_resource_service(self, queryset, name, value):
        return queryset.filter(resource_services__contains=[value])

    def filter_queryset(self, queryset):
        if not (self.data.get("scan") or self.data.get("scan__in")) and not (
            self.data.get("inserted_at")
@@ -533,6 +503,16 @@ class FindingFilter(CommonFindingFilters):

        return queryset.filter(id__lt=end)

    def filter_resource_tag(self, queryset, name, value):
        overall_query = Q()
        for key_value_pair in value:
            tag_key, tag_value = key_value_pair.split(":", 1)
            overall_query |= Q(
                resources__tags__key__icontains=tag_key,
                resources__tags__value__icontains=tag_value,
            )
        return queryset.filter(overall_query).distinct()

    @staticmethod
    def maybe_date_to_datetime(value):
        dt = value
@@ -541,31 +521,6 @@ class FindingFilter(CommonFindingFilters):
        return dt


class LatestFindingFilter(CommonFindingFilters):
    class Meta:
        model = Finding
        fields = {
            "id": ["exact", "in"],
            "uid": ["exact", "in"],
            "delta": ["exact", "in"],
            "status": ["exact", "in"],
            "severity": ["exact", "in"],
            "impact": ["exact", "in"],
            "check_id": ["exact", "in", "icontains"],
        }
        filter_overrides = {
            FindingDeltaEnumField: {
                "filter_class": CharFilter,
            },
            StatusEnumField: {
                "filter_class": CharFilter,
            },
            SeverityEnumField: {
                "filter_class": CharFilter,
            },
        }


class ProviderSecretFilter(FilterSet):
    inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
    updated_at = DateFilter(field_name="updated_at", lookup_expr="date")

@@ -12,7 +12,6 @@ from api.models import (
    Provider,
    Resource,
    ResourceFindingMapping,
    ResourceScanSummary,
    Scan,
    StatusChoices,
)
@@ -134,7 +133,6 @@ class Command(BaseCommand):
                        region=random.choice(possible_regions),
                        service=random.choice(possible_services),
                        type=random.choice(possible_types),
                        inserted_at="2024-10-01T00:00:00Z",
                    )
                )

@@ -183,10 +181,6 @@ class Command(BaseCommand):
                            "servicename": assigned_resource.service,
                            "resourcetype": assigned_resource.type,
                        },
                        resource_types=[assigned_resource.type],
                        resource_regions=[assigned_resource.region],
                        resource_services=[assigned_resource.service],
                        inserted_at="2024-10-01T00:00:00Z",
                    )
                )

@@ -203,22 +197,12 @@ class Command(BaseCommand):

            # Create ResourceFindingMapping
            mappings = []
            scan_resource_cache: set[tuple] = set()
            for index, finding_instance in enumerate(findings):
                resource_instance = resources[findings_resources_mapping[index]]
            for index, f in enumerate(findings):
                mappings.append(
                    ResourceFindingMapping(
                        tenant_id=tenant_id,
                        resource=resource_instance,
                        finding=finding_instance,
                    )
                )
                scan_resource_cache.add(
                    (
                        str(resource_instance.id),
                        resource_instance.service,
                        resource_instance.region,
                        resource_instance.type,
                        resource=resources[findings_resources_mapping[index]],
                        finding=f,
                    )
                )

@@ -236,38 +220,6 @@ class Command(BaseCommand):
                    "Resource-finding mappings created successfully.\n\n"
                )
            )

            with rls_transaction(tenant_id):
                scan.progress = 99
                scan.save()

            self.stdout.write(self.style.WARNING("Creating finding filter values..."))
            resource_scan_summaries = [
                ResourceScanSummary(
                    tenant_id=tenant_id,
                    scan_id=str(scan.id),
                    resource_id=resource_id,
                    service=service,
                    region=region,
                    resource_type=resource_type,
                )
                for resource_id, service, region, resource_type in scan_resource_cache
            ]
            num_batches = ceil(len(resource_scan_summaries) / batch_size)
            with rls_transaction(tenant_id):
                for i in tqdm(
                    range(0, len(resource_scan_summaries), batch_size),
                    total=num_batches,
                ):
                    with rls_transaction(tenant_id):
                        ResourceScanSummary.objects.bulk_create(
                            resource_scan_summaries[i : i + batch_size],
                            ignore_conflicts=True,
                        )

            self.stdout.write(
                self.style.SUCCESS("Finding filter values created successfully.\n\n")
            )
        except Exception as e:
            self.stdout.write(self.style.ERROR(f"Failed to populate test data: {e}"))
            scan_state = "failed"
@@ -25,8 +25,4 @@ class Migration(migrations.Migration):
                default="aws",
            ),
        ),
        migrations.RunSQL(
            "ALTER TYPE provider ADD VALUE IF NOT EXISTS 'm365';",
            reverse_sql=migrations.RunSQL.noop,
        ),
    ]
@@ -1,81 +0,0 @@
# Generated by Django 5.1.7 on 2025-05-05 10:01

import uuid

import django.db.models.deletion
import uuid6
from django.db import migrations, models

import api.rls


class Migration(migrations.Migration):
    dependencies = [
        ("api", "0017_m365_provider"),
    ]

    operations = [
        migrations.CreateModel(
            name="ResourceScanSummary",
            fields=[
                (
                    "id",
                    models.BigAutoField(
                        auto_created=True,
                        primary_key=True,
                        serialize=False,
                        verbose_name="ID",
                    ),
                ),
                ("scan_id", models.UUIDField(db_index=True, default=uuid6.uuid7)),
                ("resource_id", models.UUIDField(db_index=True, default=uuid.uuid4)),
                ("service", models.CharField(max_length=100)),
                ("region", models.CharField(max_length=100)),
                ("resource_type", models.CharField(max_length=100)),
                (
                    "tenant",
                    models.ForeignKey(
                        on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
                    ),
                ),
            ],
            options={
                "db_table": "resource_scan_summaries",
                "indexes": [
                    models.Index(
                        fields=["tenant_id", "scan_id", "service"],
                        name="rss_tenant_scan_svc_idx",
                    ),
                    models.Index(
                        fields=["tenant_id", "scan_id", "region"],
                        name="rss_tenant_scan_reg_idx",
                    ),
                    models.Index(
                        fields=["tenant_id", "scan_id", "resource_type"],
                        name="rss_tenant_scan_type_idx",
                    ),
                    models.Index(
                        fields=["tenant_id", "scan_id", "region", "service"],
                        name="rss_tenant_scan_reg_svc_idx",
                    ),
                    models.Index(
                        fields=["tenant_id", "scan_id", "service", "resource_type"],
                        name="rss_tenant_scan_svc_type_idx",
                    ),
                    models.Index(
                        fields=["tenant_id", "scan_id", "region", "resource_type"],
                        name="rss_tenant_scan_reg_type_idx",
                    ),
                ],
                "unique_together": {("tenant_id", "scan_id", "resource_id")},
            },
        ),
        migrations.AddConstraint(
            model_name="resourcescansummary",
            constraint=api.rls.RowLevelSecurityConstraint(
                "tenant_id",
                name="rls_on_resourcescansummary",
                statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
            ),
        ),
    ]
@@ -1,42 +0,0 @@
import django.contrib.postgres.fields
import django.contrib.postgres.indexes
from django.db import migrations, models


class Migration(migrations.Migration):
    dependencies = [
        ("api", "0018_resource_scan_summaries"),
    ]

    operations = [
        migrations.AddField(
            model_name="finding",
            name="resource_regions",
            field=django.contrib.postgres.fields.ArrayField(
                base_field=models.CharField(max_length=100),
                blank=True,
                null=True,
                size=None,
            ),
        ),
        migrations.AddField(
            model_name="finding",
            name="resource_services",
            field=django.contrib.postgres.fields.ArrayField(
                base_field=models.CharField(max_length=100),
                blank=True,
                null=True,
                size=None,
            ),
        ),
        migrations.AddField(
            model_name="finding",
            name="resource_types",
            field=django.contrib.postgres.fields.ArrayField(
                base_field=models.CharField(max_length=100),
                blank=True,
                null=True,
                size=None,
            ),
        ),
    ]
@@ -1,86 +0,0 @@
from functools import partial

from django.db import migrations

from api.db_utils import create_index_on_partitions, drop_index_on_partitions


class Migration(migrations.Migration):
    atomic = False

    dependencies = [
        ("api", "0019_finding_denormalize_resource_fields"),
    ]

    operations = [
        migrations.RunPython(
            partial(
                create_index_on_partitions,
                parent_table="findings",
                index_name="gin_find_service_idx",
                columns="resource_services",
                method="GIN",
            ),
            reverse_code=partial(
                drop_index_on_partitions,
                parent_table="findings",
                index_name="gin_find_service_idx",
            ),
        ),
        migrations.RunPython(
            partial(
                create_index_on_partitions,
                parent_table="findings",
                index_name="gin_find_region_idx",
                columns="resource_regions",
                method="GIN",
            ),
            reverse_code=partial(
                drop_index_on_partitions,
                parent_table="findings",
                index_name="gin_find_region_idx",
            ),
        ),
        migrations.RunPython(
            partial(
                create_index_on_partitions,
                parent_table="findings",
                index_name="gin_find_rtype_idx",
                columns="resource_types",
                method="GIN",
            ),
            reverse_code=partial(
                drop_index_on_partitions,
                parent_table="findings",
                index_name="gin_find_rtype_idx",
            ),
        ),
        migrations.RunPython(
            partial(
                drop_index_on_partitions,
                parent_table="findings",
                index_name="findings_uid_idx",
            ),
            reverse_code=partial(
                create_index_on_partitions,
                parent_table="findings",
                index_name="findings_uid_idx",
                columns="uid",
                method="BTREE",
            ),
        ),
        migrations.RunPython(
            partial(
                drop_index_on_partitions,
                parent_table="findings",
                index_name="findings_filter_idx",
            ),
            reverse_code=partial(
                create_index_on_partitions,
                parent_table="findings",
                index_name="findings_filter_idx",
                columns="scan_id, impact, severity, status, check_id, delta",
                method="BTREE",
            ),
        ),
    ]
@@ -1,37 +0,0 @@
import django.contrib.postgres.indexes
from django.db import migrations


class Migration(migrations.Migration):
    dependencies = [
        ("api", "0020_findings_new_performance_indexes_partitions"),
    ]

    operations = [
        migrations.AddIndex(
            model_name="finding",
            index=django.contrib.postgres.indexes.GinIndex(
                fields=["resource_services"], name="gin_find_service_idx"
            ),
        ),
        migrations.AddIndex(
            model_name="finding",
            index=django.contrib.postgres.indexes.GinIndex(
                fields=["resource_regions"], name="gin_find_region_idx"
            ),
        ),
        migrations.AddIndex(
            model_name="finding",
            index=django.contrib.postgres.indexes.GinIndex(
                fields=["resource_types"], name="gin_find_rtype_idx"
            ),
        ),
        migrations.RemoveIndex(
            model_name="finding",
            name="findings_uid_idx",
        ),
        migrations.RemoveIndex(
            model_name="finding",
            name="findings_filter_idx",
        ),
    ]
@@ -1,38 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-12 10:04

from django.contrib.postgres.operations import AddIndexConcurrently
from django.db import migrations, models


class Migration(migrations.Migration):
    atomic = False

    dependencies = [
        ("api", "0021_findings_new_performance_indexes_parent"),
        ("django_celery_beat", "0019_alter_periodictasks_options"),
    ]

    operations = [
        AddIndexConcurrently(
            model_name="scan",
            index=models.Index(
                condition=models.Q(("state", "completed")),
                fields=["tenant_id", "provider_id", "state", "-inserted_at"],
                name="scans_prov_state_ins_desc_idx",
            ),
        ),
        AddIndexConcurrently(
            model_name="scansummary",
            index=models.Index(
                fields=["tenant_id", "scan_id", "service"],
                name="ss_tenant_scan_service_idx",
            ),
        ),
        AddIndexConcurrently(
            model_name="scansummary",
            index=models.Index(
                fields=["tenant_id", "scan_id", "severity"],
                name="ss_tenant_scan_severity_idx",
            ),
        ),
    ]
@@ -1,28 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-12 10:18

from django.contrib.postgres.operations import AddIndexConcurrently
from django.db import migrations, models


class Migration(migrations.Migration):
    atomic = False

    dependencies = [
        ("api", "0022_scan_summaries_performance_indexes"),
    ]

    operations = [
        AddIndexConcurrently(
            model_name="resource",
            index=models.Index(
                fields=["tenant_id", "id"], name="resources_tenant_id_idx"
            ),
        ),
        AddIndexConcurrently(
            model_name="resource",
            index=models.Index(
                fields=["tenant_id", "provider_id"],
                name="resources_tenant_provider_idx",
            ),
        ),
    ]
@@ -1,29 +0,0 @@
from functools import partial

from django.db import migrations

from api.db_utils import create_index_on_partitions, drop_index_on_partitions


class Migration(migrations.Migration):
    atomic = False

    dependencies = [
        ("api", "0023_resources_lookup_optimization"),
    ]

    operations = [
        migrations.RunPython(
            partial(
                create_index_on_partitions,
                parent_table="findings",
                index_name="find_tenant_uid_inserted_idx",
                columns="tenant_id, uid, inserted_at DESC",
            ),
            reverse_code=partial(
                drop_index_on_partitions,
                parent_table="findings",
                index_name="find_tenant_uid_inserted_idx",
            ),
        )
    ]
@@ -1,17 +0,0 @@
from django.db import migrations, models


class Migration(migrations.Migration):
    dependencies = [
        ("api", "0024_findings_uid_index_partitions"),
    ]

    operations = [
        migrations.AddIndex(
            model_name="finding",
            index=models.Index(
                fields=["tenant_id", "uid", "-inserted_at"],
                name="find_tenant_uid_inserted_idx",
            ),
        ),
    ]
@@ -5,7 +5,6 @@ from uuid import UUID, uuid4
from cryptography.fernet import Fernet
from django.conf import settings
from django.contrib.auth.models import AbstractBaseUser
from django.contrib.postgres.fields import ArrayField
from django.contrib.postgres.indexes import GinIndex
from django.contrib.postgres.search import SearchVector, SearchVectorField
from django.core.validators import MinLengthValidator
@@ -218,13 +217,9 @@ class Provider(RowLevelSecurityProtectedModel):

    @staticmethod
    def validate_m365_uid(value):
        if not re.match(
            r"""^(?!-)[A-Za-z0-9](?:[A-Za-z0-9-]{0,61}[A-Za-z0-9])?(?:\.(?!-)[A-Za-z0-9]"""
            r"""(?:[A-Za-z0-9-]{0,61}[A-Za-z0-9])?)*\.[A-Za-z]{2,}$""",
            value,
        ):
        if not re.match(r"^[a-zA-Z0-9-]+\.onmicrosoft\.com$", value):
            raise ModelValidationError(
                detail="M365 domain ID must be a valid domain.",
                detail="M365 tenant ID must be a valid domain.",
                code="m365-uid",
                pointer="/data/attributes/uid",
            )
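A quick check of what the two validators in this hunk accept: one side allows any well-formed domain, while the other restricted the UID to `*.onmicrosoft.com` domains. The sample values below are illustrative.

```python
import re

GENERIC_DOMAIN = re.compile(
    r"^(?!-)[A-Za-z0-9](?:[A-Za-z0-9-]{0,61}[A-Za-z0-9])?"
    r"(?:\.(?!-)[A-Za-z0-9](?:[A-Za-z0-9-]{0,61}[A-Za-z0-9])?)*\.[A-Za-z]{2,}$"
)
ONMICROSOFT_ONLY = re.compile(r"^[a-zA-Z0-9-]+\.onmicrosoft\.com$")

assert GENERIC_DOMAIN.match("contoso.onmicrosoft.com")  # accepted by both rules
assert GENERIC_DOMAIN.match("contoso.com")              # accepted by the broader rule only
assert not ONMICROSOFT_ONLY.match("contoso.com")
assert not GENERIC_DOMAIN.match("-bad-.com")            # leading hyphen rejected
```
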
@@ -430,7 +425,6 @@ class Scan(RowLevelSecurityProtectedModel):
        PeriodicTask, on_delete=models.CASCADE, null=True, blank=True
    )
    output_location = models.CharField(blank=True, null=True, max_length=200)

    # TODO: mutelist foreign key

    class Meta(RowLevelSecurityProtectedModel.Meta):
@@ -453,11 +447,6 @@ class Scan(RowLevelSecurityProtectedModel):
                fields=["tenant_id", "provider_id", "state", "inserted_at"],
                name="scans_prov_state_insert_idx",
            ),
            models.Index(
                fields=["tenant_id", "provider_id", "state", "-inserted_at"],
                condition=Q(state=StateChoices.COMPLETED),
                name="scans_prov_state_ins_desc_idx",
            ),
        ]

    class JSONAPIMeta:
@@ -584,11 +573,6 @@ class Resource(RowLevelSecurityProtectedModel):
                name="resource_tenant_metadata_idx",
            ),
            GinIndex(fields=["text_search"], name="gin_resources_search_idx"),
            models.Index(fields=["tenant_id", "id"], name="resources_tenant_id_idx"),
            models.Index(
                fields=["tenant_id", "provider_id"],
                name="resources_tenant_provider_idx",
            ),
        ]

        constraints = [
@@ -689,21 +673,6 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
    muted = models.BooleanField(default=False, null=False)
    compliance = models.JSONField(default=dict, null=True, blank=True)

    # Denormalize resource data for performance
    resource_regions = ArrayField(
        models.CharField(max_length=100), blank=True, null=True
    )
    resource_services = ArrayField(
        models.CharField(max_length=100),
        blank=True,
        null=True,
    )
    resource_types = ArrayField(
        models.CharField(max_length=100),
        blank=True,
        null=True,
    )

    # Relationships
    scan = models.ForeignKey(to=Scan, related_name="findings", on_delete=models.CASCADE)

@@ -744,6 +713,18 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
        ]

        indexes = [
            models.Index(fields=["uid"], name="findings_uid_idx"),
            models.Index(
                fields=[
                    "scan_id",
                    "impact",
                    "severity",
                    "status",
                    "check_id",
                    "delta",
                ],
                name="findings_filter_idx",
            ),
            models.Index(fields=["tenant_id", "id"], name="findings_tenant_and_id_idx"),
            GinIndex(fields=["text_search"], name="gin_findings_search_idx"),
            models.Index(fields=["tenant_id", "scan_id"], name="find_tenant_scan_idx"),
@@ -755,42 +736,19 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
                condition=Q(delta="new"),
                name="find_delta_new_idx",
            ),
            models.Index(
                fields=["tenant_id", "uid", "-inserted_at"],
                name="find_tenant_uid_inserted_idx",
            ),
            GinIndex(fields=["resource_services"], name="gin_find_service_idx"),
            GinIndex(fields=["resource_regions"], name="gin_find_region_idx"),
            GinIndex(fields=["resource_types"], name="gin_find_rtype_idx"),
        ]

    class JSONAPIMeta:
        resource_name = "findings"

    def add_resources(self, resources: list[Resource] | None):
        if not resources:
            return

        self.resource_regions = self.resource_regions or []
        self.resource_services = self.resource_services or []
        self.resource_types = self.resource_types or []

        # Deduplication
        regions = set(self.resource_regions)
        services = set(self.resource_services)
        types = set(self.resource_types)

        # Add new relationships with the tenant_id field
        for resource in resources:
            ResourceFindingMapping.objects.update_or_create(
                resource=resource, finding=self, tenant_id=self.tenant_id
            )
            regions.add(resource.region)
            services.add(resource.service)
            types.add(resource.type)

        self.resource_regions = list(regions)
        self.resource_services = list(services)
        self.resource_types = list(types)
        # Save the instance
        self.save()


@@ -1192,15 +1150,7 @@ class ScanSummary(RowLevelSecurityProtectedModel):
            models.Index(
                fields=["tenant_id", "scan_id"],
                name="scan_summaries_tenant_scan_idx",
            ),
            models.Index(
                fields=["tenant_id", "scan_id", "service"],
                name="ss_tenant_scan_service_idx",
            ),
            models.Index(
                fields=["tenant_id", "scan_id", "severity"],
                name="ss_tenant_scan_severity_idx",
            ),
        )
        ]

    class JSONAPIMeta:
@@ -1282,52 +1232,3 @@ class IntegrationProviderRelationship(RowLevelSecurityProtectedModel):
                statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
            ),
        ]


class ResourceScanSummary(RowLevelSecurityProtectedModel):
    scan_id = models.UUIDField(default=uuid7, db_index=True)
    resource_id = models.UUIDField(default=uuid4, db_index=True)
    service = models.CharField(max_length=100)
    region = models.CharField(max_length=100)
    resource_type = models.CharField(max_length=100)

    class Meta:
        db_table = "resource_scan_summaries"
        unique_together = (("tenant_id", "scan_id", "resource_id"),)

        indexes = [
            # Single-dimension lookups:
            models.Index(
                fields=["tenant_id", "scan_id", "service"],
                name="rss_tenant_scan_svc_idx",
            ),
            models.Index(
                fields=["tenant_id", "scan_id", "region"],
                name="rss_tenant_scan_reg_idx",
            ),
            models.Index(
                fields=["tenant_id", "scan_id", "resource_type"],
                name="rss_tenant_scan_type_idx",
            ),
            # Two-dimension cross-filters:
            models.Index(
                fields=["tenant_id", "scan_id", "region", "service"],
                name="rss_tenant_scan_reg_svc_idx",
            ),
            models.Index(
                fields=["tenant_id", "scan_id", "service", "resource_type"],
                name="rss_tenant_scan_svc_type_idx",
            ),
            models.Index(
                fields=["tenant_id", "scan_id", "region", "resource_type"],
                name="rss_tenant_scan_reg_type_idx",
            ),
        ]

        constraints = [
            RowLevelSecurityConstraint(
                field="tenant_id",
                name="rls_on_%(class)s",
                statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
            ),
        ]
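The composite indexes on `ResourceScanSummary` above line up with cross-filter lookups such as the sketch below (the filter values are illustrative); a region-plus-type query like this one can be served by `rss_tenant_scan_reg_type_idx`.

```python
resource_ids = ResourceScanSummary.objects.filter(
    tenant_id=tenant_id,   # illustrative values
    scan_id=scan_id,
    region="eu-west-1",
    resource_type="bucket",
).values_list("resource_id", flat=True)
```
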
@@ -1,7 +1,7 @@
|
||||
openapi: 3.0.3
|
||||
info:
|
||||
title: Prowler API
|
||||
version: 1.8.1
|
||||
version: 1.6.0
|
||||
description: |-
|
||||
Prowler API specification.
|
||||
|
||||
@@ -1238,416 +1238,6 @@ paths:
              schema:
                $ref: '#/components/schemas/FindingDynamicFilterResponse'
          description: ''
  /api/v1/findings/latest:
    get:
      operationId: findings_latest_retrieve
      description: Retrieve a list of the latest findings from the latest scans for
        each provider with options for filtering by various criteria.
      summary: List the latest findings
      parameters:
      - in: query
        name: fields[findings]
        schema:
          type: array
          items:
            type: string
            enum:
            - uid
            - delta
            - status
            - status_extended
            - severity
            - check_id
            - check_metadata
            - raw_result
            - inserted_at
            - updated_at
            - first_seen_at
            - muted
            - url
            - scan
            - resources
        description: endpoint return only specific fields in the response on a per-type
          basis by including a fields[TYPE] query parameter.
        explode: false
      - in: query
        name: filter[check_id]
        schema:
          type: string
      - in: query
        name: filter[check_id__icontains]
        schema:
          type: string
      - in: query
        name: filter[check_id__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[delta]
        schema:
          type: string
          nullable: true
          enum:
          - changed
          - new
        description: |-
          * `new` - New
          * `changed` - Changed
      - in: query
        name: filter[delta__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[id]
        schema:
          type: string
          format: uuid
      - in: query
        name: filter[id__in]
        schema:
          type: array
          items:
            type: string
            format: uuid
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[impact]
        schema:
          type: string
          enum:
          - critical
          - high
          - informational
          - low
          - medium
        description: |-
          * `critical` - Critical
          * `high` - High
          * `medium` - Medium
          * `low` - Low
          * `informational` - Informational
      - in: query
        name: filter[impact__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[muted]
        schema:
          type: boolean
        description: If this filter is not provided, muted and non-muted findings
          will be returned.
      - in: query
        name: filter[provider]
        schema:
          type: string
          format: uuid
      - in: query
        name: filter[provider__in]
        schema:
          type: array
          items:
            type: string
            format: uuid
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[provider_alias]
        schema:
          type: string
      - in: query
        name: filter[provider_alias__icontains]
        schema:
          type: string
      - in: query
        name: filter[provider_alias__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[provider_type]
        schema:
          type: string
          enum:
          - aws
          - azure
          - gcp
          - kubernetes
          - m365
        description: |-
          * `aws` - AWS
          * `azure` - Azure
          * `gcp` - GCP
          * `kubernetes` - Kubernetes
          * `m365` - M365
      - in: query
        name: filter[provider_type__in]
        schema:
          type: array
          items:
            type: string
            enum:
            - aws
            - azure
            - gcp
            - kubernetes
            - m365
        description: |-
          Multiple values may be separated by commas.

          * `aws` - AWS
          * `azure` - Azure
          * `gcp` - GCP
          * `kubernetes` - Kubernetes
          * `m365` - M365
        explode: false
        style: form
      - in: query
        name: filter[provider_uid]
        schema:
          type: string
      - in: query
        name: filter[provider_uid__icontains]
        schema:
          type: string
      - in: query
        name: filter[provider_uid__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[region]
        schema:
          type: string
      - in: query
        name: filter[region__icontains]
        schema:
          type: string
      - in: query
        name: filter[region__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[resource_name]
        schema:
          type: string
      - in: query
        name: filter[resource_name__icontains]
        schema:
          type: string
      - in: query
        name: filter[resource_name__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[resource_type]
        schema:
          type: string
      - in: query
        name: filter[resource_type__icontains]
        schema:
          type: string
      - in: query
        name: filter[resource_type__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[resource_uid]
        schema:
          type: string
      - in: query
        name: filter[resource_uid__icontains]
        schema:
          type: string
      - in: query
        name: filter[resource_uid__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[resources]
        schema:
          type: array
          items:
            type: string
            format: uuid
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - name: filter[search]
        required: false
        in: query
        description: A search term.
        schema:
          type: string
      - in: query
        name: filter[service]
        schema:
          type: string
      - in: query
        name: filter[service__icontains]
        schema:
          type: string
      - in: query
        name: filter[service__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[severity]
        schema:
          type: string
          enum:
          - critical
          - high
          - informational
          - low
          - medium
        description: |-
          * `critical` - Critical
          * `high` - High
          * `medium` - Medium
          * `low` - Low
          * `informational` - Informational
      - in: query
        name: filter[severity__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[status]
        schema:
          type: string
          enum:
          - FAIL
          - MANUAL
          - PASS
        description: |-
          * `FAIL` - Fail
          * `PASS` - Pass
          * `MANUAL` - Manual
      - in: query
        name: filter[status__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[uid]
        schema:
          type: string
      - in: query
        name: filter[uid__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[updated_at]
        schema:
          type: string
          format: date
      - in: query
        name: include
        schema:
          type: array
          items:
            type: string
            enum:
            - scan
            - resources
        description: include query parameter to allow the client to customize which
          related resources should be returned.
        explode: false
      - name: sort
        required: false
        in: query
        description: '[list of fields to sort by](https://jsonapi.org/format/#fetching-sorting)'
        schema:
          type: array
          items:
            type: string
            enum:
            - status
            - -status
            - severity
            - -severity
            - check_id
            - -check_id
            - inserted_at
            - -inserted_at
            - updated_at
            - -updated_at
        explode: false
      tags:
      - Finding
      security:
      - jwtAuth: []
      responses:
        '200':
          content:
            application/vnd.api+json:
              schema:
                $ref: '#/components/schemas/FindingResponse'
          description: ''
  /api/v1/findings/metadata:
    get:
      operationId: findings_metadata_retrieve
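A minimal client-side sketch of calling the endpoint above; the base URL and token are placeholders, while the query parameters and media type come straight from the spec (note `explode: false`, so `__in` filters are comma-separated):

```python
# Sketch only: fetch the latest critical/high FAIL findings, newest first.
import requests

BASE_URL = "https://api.example.com"  # placeholder
TOKEN = "<jwt>"  # jwtAuth bearer token, placeholder

response = requests.get(
    f"{BASE_URL}/api/v1/findings/latest",
    headers={"Authorization": f"Bearer {TOKEN}", "Accept": "application/vnd.api+json"},
    params={
        "filter[severity__in]": "critical,high",  # comma-separated (explode: false)
        "filter[status]": "FAIL",
        "sort": "-inserted_at",
    },
)
response.raise_for_status()
for finding in response.json()["data"]:
    print(finding["id"], finding["attributes"]["status"])
```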
@@ -2084,392 +1674,6 @@ paths:
              schema:
                $ref: '#/components/schemas/FindingMetadataResponse'
          description: ''
  /api/v1/findings/metadata/latest:
    get:
      operationId: findings_metadata_latest_retrieve
      description: Fetch unique metadata values from a set of findings from the latest
        scans for each provider. This is useful for dynamic filtering.
      summary: Retrieve metadata values from the latest findings
      parameters:
      - in: query
        name: fields[findings-metadata]
        schema:
          type: array
          items:
            type: string
            enum:
            - services
            - regions
            - resource_types
        description: endpoint return only specific fields in the response on a per-type
          basis by including a fields[TYPE] query parameter.
        explode: false
      - in: query
        name: filter[check_id]
        schema:
          type: string
      - in: query
        name: filter[check_id__icontains]
        schema:
          type: string
      - in: query
        name: filter[check_id__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[delta]
        schema:
          type: string
          nullable: true
          enum:
          - changed
          - new
        description: |-
          * `new` - New
          * `changed` - Changed
      - in: query
        name: filter[delta__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[id]
        schema:
          type: string
          format: uuid
      - in: query
        name: filter[id__in]
        schema:
          type: array
          items:
            type: string
            format: uuid
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[impact]
        schema:
          type: string
          enum:
          - critical
          - high
          - informational
          - low
          - medium
        description: |-
          * `critical` - Critical
          * `high` - High
          * `medium` - Medium
          * `low` - Low
          * `informational` - Informational
      - in: query
        name: filter[impact__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[muted]
        schema:
          type: boolean
        description: If this filter is not provided, muted and non-muted findings
          will be returned.
      - in: query
        name: filter[provider]
        schema:
          type: string
          format: uuid
      - in: query
        name: filter[provider__in]
        schema:
          type: array
          items:
            type: string
            format: uuid
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[provider_alias]
        schema:
          type: string
      - in: query
        name: filter[provider_alias__icontains]
        schema:
          type: string
      - in: query
        name: filter[provider_alias__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[provider_type]
        schema:
          type: string
          enum:
          - aws
          - azure
          - gcp
          - kubernetes
          - m365
        description: |-
          * `aws` - AWS
          * `azure` - Azure
          * `gcp` - GCP
          * `kubernetes` - Kubernetes
          * `m365` - M365
      - in: query
        name: filter[provider_type__in]
        schema:
          type: array
          items:
            type: string
            enum:
            - aws
            - azure
            - gcp
            - kubernetes
            - m365
        description: |-
          Multiple values may be separated by commas.

          * `aws` - AWS
          * `azure` - Azure
          * `gcp` - GCP
          * `kubernetes` - Kubernetes
          * `m365` - M365
        explode: false
        style: form
      - in: query
        name: filter[provider_uid]
        schema:
          type: string
      - in: query
        name: filter[provider_uid__icontains]
        schema:
          type: string
      - in: query
        name: filter[provider_uid__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[region]
        schema:
          type: string
      - in: query
        name: filter[region__icontains]
        schema:
          type: string
      - in: query
        name: filter[region__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[resource_name]
        schema:
          type: string
      - in: query
        name: filter[resource_name__icontains]
        schema:
          type: string
      - in: query
        name: filter[resource_name__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[resource_type]
        schema:
          type: string
      - in: query
        name: filter[resource_type__icontains]
        schema:
          type: string
      - in: query
        name: filter[resource_type__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[resource_uid]
        schema:
          type: string
      - in: query
        name: filter[resource_uid__icontains]
        schema:
          type: string
      - in: query
        name: filter[resource_uid__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[resources]
        schema:
          type: array
          items:
            type: string
            format: uuid
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - name: filter[search]
        required: false
        in: query
        description: A search term.
        schema:
          type: string
      - in: query
        name: filter[service]
        schema:
          type: string
      - in: query
        name: filter[service__icontains]
        schema:
          type: string
      - in: query
        name: filter[service__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[severity]
        schema:
          type: string
          enum:
          - critical
          - high
          - informational
          - low
          - medium
        description: |-
          * `critical` - Critical
          * `high` - High
          * `medium` - Medium
          * `low` - Low
          * `informational` - Informational
      - in: query
        name: filter[severity__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[status]
        schema:
          type: string
          enum:
          - FAIL
          - MANUAL
          - PASS
        description: |-
          * `FAIL` - Fail
          * `PASS` - Pass
          * `MANUAL` - Manual
      - in: query
        name: filter[status__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[uid]
        schema:
          type: string
      - in: query
        name: filter[uid__in]
        schema:
          type: array
          items:
            type: string
        description: Multiple values may be separated by commas.
        explode: false
        style: form
      - in: query
        name: filter[updated_at]
        schema:
          type: string
          format: date
      - name: sort
        required: false
        in: query
        description: '[list of fields to sort by](https://jsonapi.org/format/#fetching-sorting)'
        schema:
          type: array
          items:
            type: string
            enum:
            - status
            - -status
            - severity
            - -severity
            - check_id
            - -check_id
            - inserted_at
            - -inserted_at
            - updated_at
            - -updated_at
        explode: false
      tags:
      - Finding
      security:
      - jwtAuth: []
      responses:
        '200':
          content:
            application/vnd.api+json:
              schema:
                $ref: '#/components/schemas/FindingMetadataResponse'
          description: ''
  /api/v1/integrations:
    get:
      operationId: integrations_list
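A sketch of consuming the metadata endpoint above; per the `FindingMetadataResponse` schema (and the `test_findings_metadata_latest` test further down in this diff), the single resource object carries the deduplicated filter values. `BASE_URL` and `TOKEN` are placeholders as before:

```python
# Sketch only: pull the distinct filter values the UI can offer.
import requests

resp = requests.get(
    f"{BASE_URL}/api/v1/findings/metadata/latest",
    headers={"Authorization": f"Bearer {TOKEN}", "Accept": "application/vnd.api+json"},
)
resp.raise_for_status()
attributes = resp.json()["data"]["attributes"]
services = attributes["services"]              # e.g. ["ec2", "s3"]
regions = attributes["regions"]                # e.g. ["eu-west-1"]
resource_types = attributes["resource_types"]  # e.g. ["AwsS3Bucket"]
```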
@@ -5299,47 +4503,6 @@ paths:
              schema:
                $ref: '#/components/schemas/ScanUpdateResponse'
          description: ''
  /api/v1/scans/{id}/compliance/{name}:
    get:
      operationId: scans_compliance_retrieve
      description: Download a specific compliance report (e.g., 'cis_1.4_aws') as
        a CSV file.
      summary: Retrieve compliance report as CSV
      parameters:
      - in: query
        name: fields[scan-reports]
        schema:
          type: array
          items:
            type: string
            enum:
            - id
            - name
        description: endpoint return only specific fields in the response on a per-type
          basis by including a fields[TYPE] query parameter.
        explode: false
      - in: path
        name: id
        schema:
          type: string
          format: uuid
        description: A UUID string identifying this scan.
        required: true
      - in: path
        name: name
        schema:
          type: string
        description: The compliance report name, like 'cis_1.4_aws'
        required: true
      tags:
      - Scan
      security:
      - jwtAuth: []
      responses:
        '200':
          description: CSV file containing the compliance report
        '404':
          description: Compliance report not found
  /api/v1/scans/{id}/report:
    get:
      operationId: scans_report_retrieve

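Downloading a compliance CSV through the endpoint above is a plain file fetch; a 200 carries the CSV body, while a 404 means the framework name is unknown or the scan has no reports. `BASE_URL`, `TOKEN` and `scan_id` are placeholders:

```python
# Sketch only: save the CIS 1.4 AWS compliance report for one scan.
import requests

resp = requests.get(
    f"{BASE_URL}/api/v1/scans/{scan_id}/compliance/cis_1.4_aws",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
if resp.status_code == 200:
    with open("cis_1.4_aws.csv", "wb") as f:
        f.write(resp.content)
```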
@@ -2,20 +2,17 @@ import glob
import io
import json
import os
import tempfile
from datetime import datetime, timedelta, timezone
from pathlib import Path
from unittest.mock import ANY, MagicMock, Mock, patch
from unittest.mock import ANY, Mock, patch

import jwt
import pytest
from botocore.exceptions import ClientError, NoCredentialsError
from botocore.exceptions import NoCredentialsError
from conftest import API_JSON_CONTENT_TYPE, TEST_PASSWORD, TEST_USER
from django.conf import settings
from django.urls import reverse
from rest_framework import status

from api.compliance import get_compliance_frameworks
from api.models import (
    ComplianceOverview,
    Integration,
@@ -918,26 +915,6 @@ class TestProviderViewSet:
                    "uid": "8851db6b-42e5-4533-aa9e-30a32d67e875",
                    "alias": "test",
                },
                {
                    "provider": "m365",
                    "uid": "TestingPro.onMirosoft.com",
                    "alias": "test",
                },
                {
                    "provider": "m365",
                    "uid": "subdomain.domain.es",
                    "alias": "test",
                },
                {
                    "provider": "m365",
                    "uid": "microsoft.net",
                    "alias": "test",
                },
                {
                    "provider": "m365",
                    "uid": "subdomain1.subdomain2.subdomain3.subdomain4.domain.net",
                    "alias": "test",
                },
            ]
        ),
    )
@@ -1006,51 +983,6 @@ class TestProviderViewSet:
                "invalid_choice",
                "provider",
            ),
            (
                {
                    "provider": "m365",
                    "uid": "https://test.com",
                    "alias": "test",
                },
                "m365-uid",
                "uid",
            ),
            (
                {
                    "provider": "m365",
                    "uid": "thisisnotadomain",
                    "alias": "test",
                },
                "m365-uid",
                "uid",
            ),
            (
                {
                    "provider": "m365",
                    "uid": "http://test.com",
                    "alias": "test",
                },
                "m365-uid",
                "uid",
            ),
            (
                {
                    "provider": "m365",
                    "uid": f"{'a' * 64}.domain.com",
                    "alias": "test",
                },
                "m365-uid",
                "uid",
            ),
            (
                {
                    "provider": "m365",
                    "uid": f"subdomain.{'a' * 64}.com",
                    "alias": "test",
                },
                "m365-uid",
                "uid",
            ),
        ]
    ),
)
@@ -2345,8 +2277,7 @@ class TestScanViewSet:
        scan.save()

        monkeypatch.setattr(
            "api.v1.views.env",
            type("env", (), {"str": lambda self, *args, **kwargs: "test-bucket"})(),
            "api.v1.views.env", type("env", (), {"str": lambda self, key: bucket})()
        )

        class FakeS3Client:
@@ -2385,296 +2316,35 @@ class TestScanViewSet:
        assert response.status_code == 404
        assert response.json()["errors"]["detail"] == "The scan has no reports."

    def test_report_local_file(self, authenticated_client, scans_fixture, monkeypatch):
        scan = scans_fixture[0]
        with tempfile.TemporaryDirectory() as tmp:
            tmp_path = Path(tmp)
            base_tmp = tmp_path / "report_local_file"
            base_tmp.mkdir(parents=True, exist_ok=True)

            file_content = b"local zip file content"
            file_path = base_tmp / "report.zip"
            file_path.write_bytes(file_content)

            scan.output_location = str(file_path)
            scan.state = StateChoices.COMPLETED
            scan.save()

            monkeypatch.setattr(
                glob,
                "glob",
                lambda pattern: [str(file_path)] if pattern == str(file_path) else [],
            )

            url = reverse("scan-report", kwargs={"pk": scan.id})
            response = authenticated_client.get(url)
            assert response.status_code == 200
            assert response.content == file_content
            content_disposition = response.get("Content-Disposition")
            assert content_disposition.startswith('attachment; filename="')
            assert f'filename="{file_path.name}"' in content_disposition

    def test_compliance_invalid_framework(self, authenticated_client, scans_fixture):
        scan = scans_fixture[0]
        scan.state = StateChoices.COMPLETED
        scan.output_location = "dummy"
        scan.save()

        url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": "invalid"})
        resp = authenticated_client.get(url)
        assert resp.status_code == status.HTTP_404_NOT_FOUND
        assert resp.json()["errors"]["detail"] == "Compliance 'invalid' not found."

    def test_compliance_executing(
        self, authenticated_client, scans_fixture, monkeypatch
    def test_report_local_file(
        self, authenticated_client, scans_fixture, tmp_path, monkeypatch
    ):
        """
        When output_location is a local file path, the view should read the file from disk
        and return it with proper headers.
        """
        scan = scans_fixture[0]
        scan.state = StateChoices.EXECUTING
        scan.save()
        task = Task.objects.create(tenant_id=scan.tenant_id)
        scan.task = task
        scan.save()
        dummy = {"id": str(task.id), "state": StateChoices.EXECUTING}
        file_content = b"local zip file content"
        file_path = tmp_path / "report.zip"
        file_path.write_bytes(file_content)

        monkeypatch.setattr(
            "api.v1.views.TaskSerializer",
            lambda *args, **kwargs: type("S", (), {"data": dummy}),
        )

        framework = get_compliance_frameworks(scan.provider.provider)[0]
        url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
        resp = authenticated_client.get(url)
        assert resp.status_code == status.HTTP_202_ACCEPTED
        assert "Content-Location" in resp
        assert dummy["id"] in resp["Content-Location"]

    def test_compliance_no_output(self, authenticated_client, scans_fixture):
        scan = scans_fixture[0]
        scan.state = StateChoices.COMPLETED
        scan.output_location = ""
        scan.save()

        framework = get_compliance_frameworks(scan.provider.provider)[0]
        url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
        resp = authenticated_client.get(url)
        assert resp.status_code == status.HTTP_404_NOT_FOUND
        assert resp.json()["errors"]["detail"] == "The scan has no reports."

    def test_compliance_s3_no_credentials(
        self, authenticated_client, scans_fixture, monkeypatch
    ):
        scan = scans_fixture[0]
        bucket = "bucket"
        key = "file.zip"
        scan.output_location = f"s3://{bucket}/{key}"
        scan.output_location = str(file_path)
        scan.state = StateChoices.COMPLETED
        scan.save()

        monkeypatch.setattr(
            "api.v1.views.get_s3_client",
            lambda: (_ for _ in ()).throw(NoCredentialsError()),
            glob,
            "glob",
            lambda pattern: [str(file_path)] if pattern == str(file_path) else [],
        )

        framework = get_compliance_frameworks(scan.provider.provider)[0]
        url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
        resp = authenticated_client.get(url)
        assert resp.status_code == status.HTTP_403_FORBIDDEN
        assert resp.json()["errors"]["detail"] == "There is a problem with credentials."

    def test_compliance_s3_success(
        self, authenticated_client, scans_fixture, monkeypatch
    ):
        scan = scans_fixture[0]
        bucket = "bucket"
        prefix = "path/scan.zip"
        scan.output_location = f"s3://{bucket}/{prefix}"
        scan.state = StateChoices.COMPLETED
        scan.save()

        monkeypatch.setattr(
            "api.v1.views.env",
            type("env", (), {"str": lambda self, *args, **kwargs: "test-bucket"})(),
        )

        match_key = "path/compliance/mitre_attack_aws.csv"

        class FakeS3Client:
            def list_objects_v2(self, Bucket, Prefix):
                return {"Contents": [{"Key": match_key}]}

            def get_object(self, Bucket, Key):
                return {"Body": io.BytesIO(b"ignored")}

        monkeypatch.setattr("api.v1.views.get_s3_client", lambda: FakeS3Client())

        framework = match_key.split("/")[-1].split(".")[0]
        url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
        resp = authenticated_client.get(url)
        assert resp.status_code == status.HTTP_200_OK
        cd = resp["Content-Disposition"]
        assert cd.startswith('attachment; filename="')
        assert cd.endswith('filename="mitre_attack_aws.csv"')

    def test_compliance_s3_not_found(
        self, authenticated_client, scans_fixture, monkeypatch
    ):
        scan = scans_fixture[0]
        bucket = "bucket"
        scan.output_location = f"s3://{bucket}/x/scan.zip"
        scan.state = StateChoices.COMPLETED
        scan.save()

        monkeypatch.setattr(
            "api.v1.views.env",
            type("env", (), {"str": lambda self, *args, **kwargs: "test-bucket"})(),
        )

        class FakeS3Client:
            def list_objects_v2(self, Bucket, Prefix):
                return {"Contents": []}

            def get_object(self, Bucket, Key):
                return {"Body": io.BytesIO(b"ignored")}

        monkeypatch.setattr("api.v1.views.get_s3_client", lambda: FakeS3Client())

        url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": "cis_1.4_aws"})
        resp = authenticated_client.get(url)
        assert resp.status_code == status.HTTP_404_NOT_FOUND
        assert (
            resp.json()["errors"]["detail"]
            == "No compliance file found for name 'cis_1.4_aws'."
        )

    def test_compliance_local_file(
        self, authenticated_client, scans_fixture, monkeypatch
    ):
        scan = scans_fixture[0]
        scan.state = StateChoices.COMPLETED

        with tempfile.TemporaryDirectory() as tmp:
            tmp_path = Path(tmp)
            base = tmp_path / "reports"
            comp_dir = base / "compliance"
            comp_dir.mkdir(parents=True, exist_ok=True)
            fname = comp_dir / "scan_cis.csv"
            fname.write_bytes(b"ignored")

            scan.output_location = str(base / "scan.zip")
            scan.save()

            monkeypatch.setattr(
                glob,
                "glob",
                lambda p: [str(fname)] if p.endswith("*_cis_1.4_aws.csv") else [],
            )

            url = reverse(
                "scan-compliance", kwargs={"pk": scan.id, "name": "cis_1.4_aws"}
            )
            resp = authenticated_client.get(url)
            assert resp.status_code == status.HTTP_200_OK
            cd = resp["Content-Disposition"]
            assert cd.startswith('attachment; filename="')
            assert cd.endswith(f'filename="{fname.name}"')

    @patch("api.v1.views.Task.objects.get")
    @patch("api.v1.views.TaskSerializer")
    def test__get_task_status_returns_none_if_task_not_executing(
        self, mock_task_serializer, mock_task_get, authenticated_client, scans_fixture
    ):
        scan = scans_fixture[0]
        scan.state = StateChoices.COMPLETED
        scan.output_location = "dummy"
        scan.save()

        task = Task.objects.create(tenant_id=scan.tenant_id)
        mock_task_get.return_value = task
        mock_task_serializer.return_value.data = {
            "id": str(task.id),
            "state": StateChoices.COMPLETED,
        }

        url = reverse("scan-report", kwargs={"pk": scan.id})
        response = authenticated_client.get(url)

        assert response.status_code == status.HTTP_404_NOT_FOUND

    @patch("api.v1.views.get_s3_client")
    @patch("api.v1.views.sentry_sdk.capture_exception")
    def test_compliance_list_objects_client_error(
        self,
        mock_sentry_capture,
        mock_get_s3_client,
        authenticated_client,
        scans_fixture,
    ):
        scan = scans_fixture[0]
        scan.output_location = "s3://test-bucket/path/to/scan.zip"
        scan.state = StateChoices.COMPLETED
        scan.save()

        fake_client = MagicMock()
        fake_client.list_objects_v2.side_effect = ClientError(
            {"Error": {"Code": "InternalError"}}, "ListObjectsV2"
        )
        mock_get_s3_client.return_value = fake_client

        framework = get_compliance_frameworks(scan.provider.provider)[0]
        url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
        response = authenticated_client.get(url)

        assert response.status_code == status.HTTP_502_BAD_GATEWAY
        assert (
            response.json()["errors"]["detail"]
            == "Unable to list compliance files in S3: encountered an AWS error."
        )
        mock_sentry_capture.assert_called()

    @patch("api.v1.views.get_s3_client")
    def test_report_s3_nosuchkey(
        self, mock_get_s3_client, authenticated_client, scans_fixture
    ):
        scan = scans_fixture[0]
        scan.output_location = "s3://test-bucket/report.zip"
        scan.state = StateChoices.COMPLETED
        scan.save()

        fake_client = MagicMock()
        fake_client.get_object.side_effect = ClientError(
            {"Error": {"Code": "NoSuchKey"}}, "GetObject"
        )
        mock_get_s3_client.return_value = fake_client

        url = reverse("scan-report", kwargs={"pk": scan.id})
        response = authenticated_client.get(url)

        assert response.status_code == status.HTTP_404_NOT_FOUND
        assert response.json()["errors"]["detail"] == "The scan has no reports."

    @patch("api.v1.views.get_s3_client")
    def test_report_s3_client_error_other(
        self, mock_get_s3_client, authenticated_client, scans_fixture
    ):
        scan = scans_fixture[0]
        scan.output_location = "s3://test-bucket/report.zip"
        scan.state = StateChoices.COMPLETED
        scan.save()

        fake_client = MagicMock()
        fake_client.get_object.side_effect = ClientError(
            {"Error": {"Code": "AccessDenied"}}, "GetObject"
        )
        mock_get_s3_client.return_value = fake_client

        url = reverse("scan-report", kwargs={"pk": scan.id})
        response = authenticated_client.get(url)

        assert response.status_code == status.HTTP_403_FORBIDDEN
        assert (
            response.json()["errors"]["detail"]
            == "There is a problem with credentials."
        )
        assert response.status_code == 200
        assert response.content == file_content
        content_disposition = response.get("Content-Disposition")
        assert content_disposition.startswith('attachment; filename="')
        assert f'filename="{file_path.name}"' in content_disposition


@pytest.mark.django_db
@@ -3167,9 +2837,7 @@ class TestFindingViewSet:
        )
        assert response.status_code == status.HTTP_404_NOT_FOUND

    def test_findings_metadata_retrieve(
        self, authenticated_client, findings_fixture, backfill_scan_metadata_fixture
    ):
    def test_findings_metadata_retrieve(self, authenticated_client, findings_fixture):
        finding_1, *_ = findings_fixture
        response = authenticated_client.get(
            reverse("finding-metadata"),
@@ -3192,14 +2860,14 @@ class TestFindingViewSet:
        )
        # assert data["data"]["attributes"]["tags"] == expected_tags

    def test_findings_metadata_resource_filter_retrieve(
        self, authenticated_client, findings_fixture, backfill_scan_metadata_fixture
    def test_findings_metadata_severity_retrieve(
        self, authenticated_client, findings_fixture
    ):
        finding_1, *_ = findings_fixture
        response = authenticated_client.get(
            reverse("finding-metadata"),
            {
                "filter[region]": "eu-west-1",
                "filter[severity__in]": ["low", "medium"],
                "filter[inserted_at]": finding_1.inserted_at.strftime("%Y-%m-%d"),
            },
        )
@@ -3250,29 +2918,6 @@ class TestFindingViewSet:
            ]
        }

    def test_findings_latest(self, authenticated_client, latest_scan_finding):
        response = authenticated_client.get(
            reverse("finding-latest"),
        )
        assert response.status_code == status.HTTP_200_OK
        # The latest scan only has one finding, in comparison with `GET /findings`
        assert len(response.json()["data"]) == 1
        assert (
            response.json()["data"][0]["attributes"]["status"]
            == latest_scan_finding.status
        )

    def test_findings_metadata_latest(self, authenticated_client, latest_scan_finding):
        response = authenticated_client.get(
            reverse("finding-metadata_latest"),
        )
        assert response.status_code == status.HTTP_200_OK
        attributes = response.json()["data"]["attributes"]

        assert attributes["services"] == latest_scan_finding.resource_services
        assert attributes["regions"] == latest_scan_finding.resource_regions
        assert attributes["resource_types"] == latest_scan_finding.resource_types


@pytest.mark.django_db
class TestJWTFields:
@@ -4910,8 +4555,9 @@ class TestOverviewViewSet:
        assert response.json()["data"][0]["attributes"]["findings"]["pass"] == 2
        assert response.json()["data"][0]["attributes"]["findings"]["fail"] == 1
        assert response.json()["data"][0]["attributes"]["findings"]["muted"] == 1
        # Since we rely on completed scans, there are only 2 resources now
        assert response.json()["data"][0]["attributes"]["resources"]["total"] == 2
        assert response.json()["data"][0]["attributes"]["resources"]["total"] == len(
            resources_fixture
        )

    def test_overview_services_list_no_required_filters(
        self, authenticated_client, scan_summaries_fixture

@@ -1,14 +1,11 @@
from datetime import datetime, timezone

from allauth.socialaccount.providers.oauth2.client import OAuth2Client
from django.contrib.postgres.aggregates import ArrayAgg
from django.db.models import Subquery
from rest_framework.exceptions import NotFound, ValidationError

from api.db_router import MainRouter
from api.exceptions import InvitationTokenExpiredException
from api.models import Invitation, Provider, Resource
from api.v1.serializers import FindingMetadataSerializer
from api.models import Invitation, Provider
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.common.models import Connection
@@ -208,33 +205,3 @@ def validate_invitation(
    )

    return invitation


# ToRemove after removing the fallback mechanism in /findings/metadata
def get_findings_metadata_no_aggregations(tenant_id: str, filtered_queryset):
    filtered_ids = filtered_queryset.order_by().values("id")

    relevant_resources = Resource.all_objects.filter(
        tenant_id=tenant_id, findings__id__in=Subquery(filtered_ids)
    ).only("service", "region", "type")

    aggregation = relevant_resources.aggregate(
        services=ArrayAgg("service", flat=True),
        regions=ArrayAgg("region", flat=True),
        resource_types=ArrayAgg("type", flat=True),
    )

    services = sorted(set(aggregation["services"] or []))
    regions = sorted({region for region in aggregation["regions"] or [] if region})
    resource_types = sorted(set(aggregation["resource_types"] or []))

    result = {
        "services": services,
        "regions": regions,
        "resource_types": resource_types,
    }

    serializer = FindingMetadataSerializer(data=result)
    serializer.is_valid(raise_exception=True)

    return serializer.data

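The fallback helper above returns a plain serialized payload; a hypothetical call, where `tenant_id` and `filtered_queryset` are assumed to exist in the calling view:

```python
# Hypothetical usage of the fallback helper defined above.
metadata = get_findings_metadata_no_aggregations(tenant_id, filtered_queryset)
# Expected shape, per FindingMetadataSerializer:
# {"services": [...], "regions": [...], "resource_types": [...]}
```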
@@ -1,33 +0,0 @@
from rest_framework.response import Response


class PaginateByPkMixin:
    """
    Mixin to paginate on a list of PKs (cheaper than heavy JOINs),
    re-fetch the full objects with the desired select/prefetch,
    re-sort them to preserve DB ordering, then serialize + return.
    """

    def paginate_by_pk(
        self,
        request,  # noqa: F841
        base_queryset,
        manager,
        select_related: list[str] | None = None,
        prefetch_related: list[str] | None = None,
    ) -> Response:
        pk_list = base_queryset.values_list("id", flat=True)
        page = self.paginate_queryset(pk_list)
        if page is None:
            return Response(self.get_serializer(base_queryset, many=True).data)

        queryset = manager.filter(id__in=page)
        if select_related:
            queryset = queryset.select_related(*select_related)
        if prefetch_related:
            queryset = queryset.prefetch_related(*prefetch_related)

        queryset = sorted(queryset, key=lambda obj: page.index(obj.id))

        serialized = self.get_serializer(queryset, many=True).data
        return self.get_paginated_response(serialized)
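A hypothetical viewset wired to the mixin above; this mirrors how `FindingViewSet.list` calls `paginate_by_pk` later in this diff (paginate over bare PKs, then re-fetch the page with exactly the relations the serializer needs):

```python
# Sketch only: the class name is illustrative; Finding, BaseRLSViewSet and
# PaginateByPkMixin are the names used elsewhere in this diff.
class ExampleFindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
    def list(self, request, *args, **kwargs):
        return self.paginate_by_pk(
            request,
            self.filter_queryset(self.get_queryset()),
            manager=Finding.all_objects,
            select_related=["scan"],
            prefetch_related=["resources"],
        )
```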
@@ -960,15 +960,6 @@ class ScanReportSerializer(serializers.Serializer):
        fields = ["id"]


class ScanComplianceReportSerializer(serializers.Serializer):
    id = serializers.CharField(source="scan")
    name = serializers.CharField()

    class Meta:
        resource_name = "scan-reports"
        fields = ["id", "name"]


class ResourceTagSerializer(RLSSerializer):
    """
    Serializer for the ResourceTag model

@@ -1,6 +1,5 @@
import glob
import os
from datetime import datetime, timedelta, timezone

import sentry_sdk
from allauth.socialaccount.providers.github.views import GitHubOAuth2Adapter
@@ -21,7 +20,6 @@ from django.db.models import Count, Exists, F, OuterRef, Prefetch, Q, Subquery,
from django.db.models.functions import Coalesce
from django.http import HttpResponse
from django.urls import reverse
from django.utils.dateparse import parse_date
from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_control
from django_celery_beat.models import PeriodicTask
@@ -50,7 +48,6 @@ from rest_framework_simplejwt.exceptions import InvalidToken, TokenError
from tasks.beat import schedule_provider_scan
from tasks.jobs.export import get_s3_client
from tasks.tasks import (
    backfill_scan_resource_summaries_task,
    check_provider_connection_task,
    delete_provider_task,
    delete_tenant_task,
@@ -58,14 +55,12 @@ from tasks.tasks import (
)

from api.base_views import BaseRLSViewSet, BaseTenantViewset, BaseUserViewset
from api.compliance import get_compliance_frameworks
from api.db_router import MainRouter
from api.filters import (
    ComplianceOverviewFilter,
    FindingFilter,
    IntegrationFilter,
    InvitationFilter,
    LatestFindingFilter,
    MembershipFilter,
    ProviderFilter,
    ProviderGroupFilter,
@@ -91,7 +86,6 @@ from api.models import (
    ProviderSecret,
    Resource,
    ResourceFindingMapping,
    ResourceScanSummary,
    Role,
    RoleProviderGroupRelationship,
    Scan,
@@ -105,13 +99,7 @@ from api.models import (
from api.pagination import ComplianceOverviewPagination
from api.rbac.permissions import Permissions, get_providers, get_role
from api.rls import Tenant
from api.utils import (
    CustomOAuth2Client,
    get_findings_metadata_no_aggregations,
    validate_invitation,
)
from api.uuid_utils import datetime_to_uuid7, uuid7_start
from api.v1.mixins import PaginateByPkMixin
from api.utils import CustomOAuth2Client, validate_invitation
from api.v1.serializers import (
    ComplianceOverviewFullSerializer,
    ComplianceOverviewMetadataSerializer,
@@ -146,7 +134,6 @@ from api.v1.serializers import (
    RoleProviderGroupRelationshipSerializer,
    RoleSerializer,
    RoleUpdateSerializer,
    ScanComplianceReportSerializer,
    ScanCreateSerializer,
    ScanReportSerializer,
    ScanSerializer,
@@ -260,7 +247,7 @@ class SchemaView(SpectacularAPIView):

    def get(self, request, *args, **kwargs):
        spectacular_settings.TITLE = "Prowler API"
        spectacular_settings.VERSION = "1.8.1"
        spectacular_settings.VERSION = "1.6.0"
        spectacular_settings.DESCRIPTION = (
            "Prowler API specification.\n\nThis file is auto-generated."
        )
@@ -1163,27 +1150,6 @@ class ProviderViewSet(BaseRLSViewSet):
            404: OpenApiResponse(description="The scan has no reports"),
        },
    ),
    compliance=extend_schema(
        tags=["Scan"],
        summary="Retrieve compliance report as CSV",
        description="Download a specific compliance report (e.g., 'cis_1.4_aws') as a CSV file.",
        parameters=[
            OpenApiParameter(
                name="name",
                type=str,
                location=OpenApiParameter.PATH,
                required=True,
                description="The compliance report name, like 'cis_1.4_aws'",
            ),
        ],
        responses={
            200: OpenApiResponse(
                description="CSV file containing the compliance report"
            ),
            404: OpenApiResponse(description="Compliance report not found"),
        },
        request=None,
    ),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
@@ -1236,10 +1202,6 @@ class ScanViewSet(BaseRLSViewSet):
            if hasattr(self, "response_serializer_class"):
                return self.response_serializer_class
            return ScanReportSerializer
        elif self.action == "compliance":
            if hasattr(self, "response_serializer_class"):
                return self.response_serializer_class
            return ScanComplianceReportSerializer
        return super().get_serializer_class()

    def partial_update(self, request, *args, **kwargs):
@@ -1257,111 +1219,70 @@ class ScanViewSet(BaseRLSViewSet):
        )
        return Response(data=read_serializer.data, status=status.HTTP_200_OK)

    def _get_task_status(self, scan_instance):
        """
        Returns task status if the scan or its associated report-generation task is still executing.
    @action(detail=True, methods=["get"], url_name="report")
    def report(self, request, pk=None):
        scan_instance = self.get_object()

        If the scan is in an EXECUTING state or if a background task related to report generation
        is found and also executing, this method returns a 202 Accepted response with the task
        metadata and a `Content-Location` header pointing to the task detail endpoint.
        if scan_instance.state == StateChoices.EXECUTING:
            # If the scan is still running, return the task
            prowler_task = Task.objects.get(id=scan_instance.task.id)
            self.response_serializer_class = TaskSerializer
            output_serializer = self.get_serializer(prowler_task)
            return Response(
                data=output_serializer.data,
                status=status.HTTP_202_ACCEPTED,
                headers={
                    "Content-Location": reverse(
                        "task-detail", kwargs={"pk": output_serializer.data["id"]}
                    )
                },
            )

        Args:
            scan_instance (Scan): The scan instance for which the task status is being checked.

        Returns:
            Response or None:
                - A `Response` with HTTP 202 status and serialized task data if the task is executing.
                - `None` if no running task is found or if the task has already completed.
        """
        task = None

        if scan_instance.state == StateChoices.EXECUTING and scan_instance.task:
            task = scan_instance.task
        else:
            try:
                task = Task.objects.get(
                    task_runner_task__task_name="scan-report",
                    task_runner_task__task_args__contains=str(scan_instance.id),
        try:
            output_celery_task = Task.objects.get(
                task_runner_task__task_name="scan-report",
                task_runner_task__task_args__contains=pk,
            )
            self.response_serializer_class = TaskSerializer
            output_serializer = self.get_serializer(output_celery_task)
            if output_serializer.data["state"] == StateChoices.EXECUTING:
                # If the task is still running, return the task
                return Response(
                    data=output_serializer.data,
                    status=status.HTTP_202_ACCEPTED,
                    headers={
                        "Content-Location": reverse(
                            "task-detail", kwargs={"pk": output_serializer.data["id"]}
                        )
                    },
                )
            except Task.DoesNotExist:
                return None
        except Task.DoesNotExist:
            # If the task does not exist, it means that the task is removed from the database
            pass

        self.response_serializer_class = TaskSerializer
        serializer = self.get_serializer(task)
        output_location = scan_instance.output_location
        if not output_location:
            return Response(
                {"detail": "The scan has no reports."},
                status=status.HTTP_404_NOT_FOUND,
            )

        if serializer.data.get("state") != StateChoices.EXECUTING:
            return None

        return Response(
            data=serializer.data,
            status=status.HTTP_202_ACCEPTED,
            headers={
                "Content-Location": reverse(
                    "task-detail", kwargs={"pk": serializer.data["id"]}
                )
            },
        )

    def _load_file(self, path_pattern, s3=False, bucket=None, list_objects=False):
        """
        Loads a binary file (e.g., ZIP or CSV) and returns its content and filename.

        Depending on the input parameters, this method supports loading:
        - From S3 using a direct key.
        - From S3 by listing objects under a prefix and matching suffix.
        - From the local filesystem using glob pattern matching.

        Args:
            path_pattern (str): The key or glob pattern representing the file location.
            s3 (bool, optional): Whether the file is stored in S3. Defaults to False.
            bucket (str, optional): The name of the S3 bucket, required if `s3=True`. Defaults to None.
            list_objects (bool, optional): If True and `s3=True`, list objects by prefix to find the file. Defaults to False.

        Returns:
            tuple[bytes, str]: A tuple containing the file content as bytes and the filename if successful.
            Response: A DRF `Response` object with an appropriate status and error detail if an error occurs.
        """
        if s3:
        if scan_instance.output_location.startswith("s3://"):
            try:
                client = get_s3_client()
                s3_client = get_s3_client()
            except (ClientError, NoCredentialsError, ParamValidationError):
                return Response(
                    {"detail": "There is a problem with credentials."},
                    status=status.HTTP_403_FORBIDDEN,
                )
            if list_objects:
                # list keys under prefix then match suffix
                prefix = os.path.dirname(path_pattern)
                suffix = os.path.basename(path_pattern)
                try:
                    resp = client.list_objects_v2(Bucket=bucket, Prefix=prefix)
                except ClientError as e:
                    sentry_sdk.capture_exception(e)
                    return Response(
                        {
                            "detail": "Unable to list compliance files in S3: encountered an AWS error."
                        },
                        status=status.HTTP_502_BAD_GATEWAY,
                    )
                contents = resp.get("Contents", [])
                keys = [obj["Key"] for obj in contents if obj["Key"].endswith(suffix)]
                if not keys:
                    return Response(
                        {
                            "detail": f"No compliance file found for name '{os.path.splitext(suffix)[0]}'."
                        },
                        status=status.HTTP_404_NOT_FOUND,
                    )
                # path_pattern here is prefix, but in compliance we build correct suffix check before
                key = keys[0]
            else:
                # path_pattern is exact key
                key = path_pattern

            bucket_name = env.str("DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET")
            key = output_location[len(f"s3://{bucket_name}/") :]
            try:
                s3_obj = client.get_object(Bucket=bucket, Key=key)
                s3_object = s3_client.get_object(Bucket=bucket_name, Key=key)
            except ClientError as e:
                code = e.response.get("Error", {}).get("Code")
                if code == "NoSuchKey":
                error_code = e.response.get("Error", {}).get("Code")
                if error_code == "NoSuchKey":
                    return Response(
                        {"detail": "The scan has no reports."},
                        status=status.HTTP_404_NOT_FOUND,
@@ -1370,97 +1291,28 @@ class ScanViewSet(BaseRLSViewSet):
                    {"detail": "There is a problem with credentials."},
                    status=status.HTTP_403_FORBIDDEN,
                )
            content = s3_obj["Body"].read()
            filename = os.path.basename(key)
            file_content = s3_object["Body"].read()
            filename = os.path.basename(output_location.split("/")[-1])
        else:
            files = glob.glob(path_pattern)
            if not files:
            zip_files = glob.glob(output_location)
            try:
                file_path = zip_files[0]
            except IndexError as e:
                sentry_sdk.capture_exception(e)
                return Response(
                    {"detail": "The scan has no reports."},
                    status=status.HTTP_404_NOT_FOUND,
                )
            filepath = files[0]
            with open(filepath, "rb") as f:
                content = f.read()
            filename = os.path.basename(filepath)
            with open(file_path, "rb") as f:
                file_content = f.read()
            filename = os.path.basename(file_path)

        return content, filename

    def _serve_file(self, content, filename, content_type):
        response = HttpResponse(content, content_type=content_type)
        response = HttpResponse(
            file_content, content_type="application/x-zip-compressed"
        )
        response["Content-Disposition"] = f'attachment; filename="{filename}"'

        return response

    @action(detail=True, methods=["get"], url_name="report")
    def report(self, request, pk=None):
        scan = self.get_object()
        # Check for executing tasks
        running_resp = self._get_task_status(scan)
        if running_resp:
            return running_resp

        if not scan.output_location:
            return Response(
                {"detail": "The scan has no reports."}, status=status.HTTP_404_NOT_FOUND
            )

        if scan.output_location.startswith("s3://"):
            bucket = env.str("DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET", "")
            key_prefix = scan.output_location.removeprefix(f"s3://{bucket}/")
            loader = self._load_file(
                key_prefix, s3=True, bucket=bucket, list_objects=False
            )
        else:
            loader = self._load_file(scan.output_location, s3=False)

        if isinstance(loader, Response):
            return loader

        content, filename = loader
        return self._serve_file(content, filename, "application/x-zip-compressed")

    @action(
        detail=True,
        methods=["get"],
        url_path="compliance/(?P<name>[^/]+)",
        url_name="compliance",
    )
    def compliance(self, request, pk=None, name=None):
        scan = self.get_object()
        if name not in get_compliance_frameworks(scan.provider.provider):
            return Response(
                {"detail": f"Compliance '{name}' not found."},
                status=status.HTTP_404_NOT_FOUND,
            )

        running_resp = self._get_task_status(scan)
        if running_resp:
            return running_resp

        if not scan.output_location:
            return Response(
                {"detail": "The scan has no reports."}, status=status.HTTP_404_NOT_FOUND
            )

        if scan.output_location.startswith("s3://"):
            bucket = env.str("DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET", "")
            key_prefix = scan.output_location.removeprefix(f"s3://{bucket}/")
            prefix = os.path.join(
                os.path.dirname(key_prefix), "compliance", f"{name}.csv"
            )
            loader = self._load_file(prefix, s3=True, bucket=bucket, list_objects=True)
        else:
            base = os.path.dirname(scan.output_location)
            pattern = os.path.join(base, "compliance", f"*_{name}.csv")
            loader = self._load_file(pattern, s3=False)

        if isinstance(loader, Response):
            return loader

        content, filename = loader
        return self._serve_file(content, filename, "text/csv")

    def create(self, request, *args, **kwargs):
        input_serializer = self.get_serializer(data=request.data)
        input_serializer.is_valid(raise_exception=True)
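The `_get_task_status` docstring above describes the 202/`Content-Location` contract these actions share: while the scan or its report task is still executing, the endpoint answers 202 with a pointer to the task. A client-side sketch of that polling loop; `BASE_URL`, `TOKEN`, `scan_id` and the sleep interval are placeholders:

```python
# Sketch only: poll the report endpoint until the ZIP is ready.
import time

import requests

headers = {"Authorization": f"Bearer {TOKEN}"}
resp = requests.get(f"{BASE_URL}/api/v1/scans/{scan_id}/report", headers=headers)
while resp.status_code == 202:
    # 202 responses carry a Content-Location header pointing at the task
    # detail endpoint; here we simply wait and retry the report itself.
    time.sleep(5)  # arbitrary backoff
    resp = requests.get(f"{BASE_URL}/api/v1/scans/{scan_id}/report", headers=headers)
if resp.status_code == 200:
    with open("report.zip", "wb") as f:
        f.write(resp.content)
```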
@@ -1673,24 +1525,10 @@ class ResourceViewSet(BaseRLSViewSet):
        ],
        filters=True,
    ),
    latest=extend_schema(
        tags=["Finding"],
        summary="List the latest findings",
        description="Retrieve a list of the latest findings from the latest scans for each provider with options for "
        "filtering by various criteria.",
        filters=True,
    ),
    metadata_latest=extend_schema(
        tags=["Finding"],
        summary="Retrieve metadata values from the latest findings",
        description="Fetch unique metadata values from a set of findings from the latest scans for each provider. "
        "This is useful for dynamic filtering.",
        filters=True,
    ),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
class FindingViewSet(BaseRLSViewSet):
    queryset = Finding.all_objects.all()
    serializer_class = FindingSerializer
    filterset_class = FindingFilter
@@ -1722,16 +1560,11 @@ class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
    def get_serializer_class(self):
        if self.action == "findings_services_regions":
            return FindingDynamicFilterSerializer
        elif self.action in ["metadata", "metadata_latest"]:
        elif self.action == "metadata":
            return FindingMetadataSerializer

        return super().get_serializer_class()

    def get_filterset_class(self):
        if self.action in ["latest", "metadata_latest"]:
            return LatestFindingFilter
        return FindingFilter

    def get_queryset(self):
        tenant_id = self.request.tenant_id
        user_roles = get_role(self.request.user)
@@ -1771,14 +1604,21 @@ class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
        return super().filter_queryset(queryset)
    def list(self, request, *args, **kwargs):
        filtered_queryset = self.filter_queryset(self.get_queryset())
        return self.paginate_by_pk(
            request,
            filtered_queryset,
            manager=Finding.all_objects,
            select_related=["scan"],
            prefetch_related=["resources"],
        )
        base_qs = self.filter_queryset(self.get_queryset())
        paginated_ids = self.paginate_queryset(base_qs.values_list("id", flat=True))
        if paginated_ids is not None:
            ids = list(paginated_ids)
            findings = (
                Finding.all_objects.filter(tenant_id=self.request.tenant_id, id__in=ids)
                .select_related("scan")
                .prefetch_related("resources")
            )
            # Re-sort in Python to preserve ordering:
            findings = sorted(findings, key=lambda x: ids.index(x.id))
            serializer = self.get_serializer(findings, many=True)
            return self.get_paginated_response(serializer.data)
        serializer = self.get_serializer(base_qs, many=True)
        return Response(serializer.data)
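A compact sketch of the two-phase "paginate by primary key" idea both versions above share: paginate over bare ids first (cheap), re-fetch only the current page with its relations, then re-sort in Python because id__in does not preserve page order. Names here are illustrative, assuming a DRF paginator instance:

    def paginate_by_pk(paginator, request, queryset, manager, select_related, prefetch_related):
        # Phase 1: paginate a values_list of ids only.
        page_ids = paginator.paginate_queryset(queryset.values_list("id", flat=True), request)
        ids = list(page_ids)
        # Phase 2: fetch just the page rows with their relations in one pass.
        rows = (
            manager.filter(id__in=ids)
            .select_related(*select_related)
            .prefetch_related(*prefetch_related)
        )
        # Restore the paginator's ordering, which id__in discards.
        return sorted(rows, key=lambda obj: ids.index(obj.id))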
@action(detail=False, methods=["get"], url_name="findings_services_regions")
|
||||
def findings_services_regions(self, request):
|
||||
@@ -1803,104 +1643,26 @@ class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
|
||||
|
||||
@action(detail=False, methods=["get"], url_name="metadata")
|
||||
def metadata(self, request):
|
||||
# Force filter validation
|
||||
filtered_queryset = self.filter_queryset(self.get_queryset())
|
||||
tenant_id = self.request.tenant_id
|
||||
queryset = self.get_queryset()
|
||||
filtered_queryset = self.filter_queryset(queryset)
|
||||
|
||||
tenant_id = request.tenant_id
|
||||
query_params = request.query_params
|
||||
filtered_ids = filtered_queryset.order_by().values("id")
|
||||
|
||||
queryset = ResourceScanSummary.objects.filter(tenant_id=tenant_id)
|
||||
scan_based_filters = {}
|
||||
relevant_resources = Resource.all_objects.filter(
|
||||
tenant_id=tenant_id, findings__id__in=Subquery(filtered_ids)
|
||||
).only("service", "region", "type")
|
||||
|
||||
if scans := query_params.get("filter[scan__in]") or query_params.get(
|
||||
"filter[scan]"
|
||||
):
|
||||
queryset = queryset.filter(scan_id__in=scans.split(","))
|
||||
scan_based_filters = {"id__in": scans.split(",")}
|
||||
else:
|
||||
exact = query_params.get("filter[inserted_at]")
|
||||
gte = query_params.get("filter[inserted_at__gte]")
|
||||
lte = query_params.get("filter[inserted_at__lte]")
|
||||
|
||||
date_filters = {}
|
||||
if exact:
|
||||
date = parse_date(exact)
|
||||
datetime_start = datetime.combine(
|
||||
date, datetime.min.time(), tzinfo=timezone.utc
|
||||
)
|
||||
datetime_end = datetime_start + timedelta(days=1)
|
||||
date_filters["scan_id__gte"] = uuid7_start(
|
||||
datetime_to_uuid7(datetime_start)
|
||||
)
|
||||
date_filters["scan_id__lt"] = uuid7_start(
|
||||
datetime_to_uuid7(datetime_end)
|
||||
)
|
||||
else:
|
||||
if gte:
|
||||
date_start = parse_date(gte)
|
||||
datetime_start = datetime.combine(
|
||||
date_start, datetime.min.time(), tzinfo=timezone.utc
|
||||
)
|
||||
date_filters["scan_id__gte"] = uuid7_start(
|
||||
datetime_to_uuid7(datetime_start)
|
||||
)
|
||||
if lte:
|
||||
date_end = parse_date(lte)
|
||||
datetime_end = datetime.combine(
|
||||
date_end + timedelta(days=1),
|
||||
datetime.min.time(),
|
||||
tzinfo=timezone.utc,
|
||||
)
|
||||
date_filters["scan_id__lt"] = uuid7_start(
|
||||
datetime_to_uuid7(datetime_end)
|
||||
)
|
||||
|
||||
if date_filters:
|
||||
queryset = queryset.filter(**date_filters)
|
||||
scan_based_filters = {
|
||||
key.lstrip("scan_"): value for key, value in date_filters.items()
|
||||
}
|
||||
|
||||
# ToRemove: Temporary fallback mechanism
|
||||
if not queryset.exists():
|
||||
scan_ids = Scan.objects.filter(
|
||||
tenant_id=tenant_id, **scan_based_filters
|
||||
).values_list("id", flat=True)
|
||||
for scan_id in scan_ids:
|
||||
backfill_scan_resource_summaries_task.apply_async(
|
||||
kwargs={"tenant_id": tenant_id, "scan_id": scan_id}
|
||||
)
|
||||
return Response(
|
||||
get_findings_metadata_no_aggregations(tenant_id, filtered_queryset)
|
||||
)
|
||||
|
||||
if service_filter := query_params.get("filter[service]") or query_params.get(
|
||||
"filter[service__in]"
|
||||
):
|
||||
queryset = queryset.filter(service__in=service_filter.split(","))
|
||||
if region_filter := query_params.get("filter[region]") or query_params.get(
|
||||
"filter[region__in]"
|
||||
):
|
||||
queryset = queryset.filter(region__in=region_filter.split(","))
|
||||
if resource_type_filter := query_params.get(
|
||||
"filter[resource_type]"
|
||||
) or query_params.get("filter[resource_type__in]"):
|
||||
queryset = queryset.filter(
|
||||
resource_type__in=resource_type_filter.split(",")
|
||||
)
|
||||
|
||||
services = list(
|
||||
queryset.values_list("service", flat=True).distinct().order_by("service")
|
||||
)
|
||||
regions = list(
|
||||
queryset.values_list("region", flat=True).distinct().order_by("region")
|
||||
)
|
||||
resource_types = list(
|
||||
queryset.values_list("resource_type", flat=True)
|
||||
.distinct()
|
||||
.order_by("resource_type")
|
||||
aggregation = relevant_resources.aggregate(
|
||||
services=ArrayAgg("service", flat=True),
|
||||
regions=ArrayAgg("region", flat=True),
|
||||
resource_types=ArrayAgg("type", flat=True),
|
||||
)
|
||||
|
||||
services = sorted(set(aggregation["services"] or []))
|
||||
regions = sorted({region for region in aggregation["regions"] or [] if region})
|
||||
resource_types = sorted(set(aggregation["resource_types"] or []))
|
||||
|
||||
result = {
|
||||
"services": services,
|
||||
"regions": regions,
|
||||
@@ -1909,108 +1671,7 @@ class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
|
||||
|
||||
serializer = self.get_serializer(data=result)
|
||||
serializer.is_valid(raise_exception=True)
|
||||
return Response(serializer.data)
|
||||
|
||||
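The date filters above rely on the scan primary key being a UUIDv7, whose leading bits encode a millisecond timestamp. A hedged sketch of the boundary math, assuming the project helpers `datetime_to_uuid7` and `uuid7_start` used above:

    from datetime import date, datetime, timedelta, timezone

    def scan_id_filters_for_day(day: date) -> dict:
        # A calendar day becomes a half-open id range [start, next-day-start),
        # so an index on the UUIDv7 primary key can serve the time filter.
        start = datetime.combine(day, datetime.min.time(), tzinfo=timezone.utc)
        end = start + timedelta(days=1)
        return {
            "scan_id__gte": uuid7_start(datetime_to_uuid7(start)),  # project helper
            "scan_id__lt": uuid7_start(datetime_to_uuid7(end)),     # project helper
        }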
@action(detail=False, methods=["get"], url_name="latest")
|
||||
def latest(self, request):
|
||||
tenant_id = request.tenant_id
|
||||
filtered_queryset = self.filter_queryset(self.get_queryset())
|
||||
|
||||
latest_scan_ids = (
|
||||
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
|
||||
.order_by("provider_id", "-inserted_at")
|
||||
.distinct("provider_id")
|
||||
.values_list("id", flat=True)
|
||||
)
|
||||
filtered_queryset = filtered_queryset.filter(
|
||||
tenant_id=tenant_id, scan_id__in=latest_scan_ids
|
||||
)
|
||||
|
||||
return self.paginate_by_pk(
|
||||
request,
|
||||
filtered_queryset,
|
||||
manager=Finding.all_objects,
|
||||
select_related=["scan"],
|
||||
prefetch_related=["resources"],
|
||||
)
|
||||
|
||||
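The "newest completed scan per provider" queryset above is the PostgreSQL DISTINCT ON idiom; restated on its own for reference (uses the models from this diff, and requires Postgres):

    def latest_completed_scan_ids(tenant_id):
        # Ordering by (provider_id, -inserted_at) and then taking
        # distinct("provider_id") keeps exactly the first row per provider,
        # i.e. its most recently inserted completed scan.
        return (
            Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
            .order_by("provider_id", "-inserted_at")
            .distinct("provider_id")
            .values_list("id", flat=True)
        )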
    @action(
        detail=False,
        methods=["get"],
        url_name="metadata_latest",
        url_path="metadata/latest",
    )
    def metadata_latest(self, request):
        tenant_id = request.tenant_id
        query_params = request.query_params

        latest_scans_queryset = (
            Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
            .order_by("provider_id", "-inserted_at")
            .distinct("provider_id")
        )
        latest_scans_ids = list(latest_scans_queryset.values_list("id", flat=True))

        queryset = ResourceScanSummary.objects.filter(
            tenant_id=tenant_id,
            scan_id__in=latest_scans_queryset.values_list("id", flat=True),
        )
        # ToRemove: Temporary fallback mechanism
        present_ids = set(
            ResourceScanSummary.objects.filter(
                tenant_id=tenant_id, scan_id__in=latest_scans_ids
            )
            .values_list("scan_id", flat=True)
            .distinct()
        )
        missing_scan_ids = [sid for sid in latest_scans_ids if sid not in present_ids]
        if missing_scan_ids:
            for scan_id in missing_scan_ids:
                backfill_scan_resource_summaries_task.apply_async(
                    kwargs={"tenant_id": tenant_id, "scan_id": scan_id}
                )
            return Response(
                get_findings_metadata_no_aggregations(
                    tenant_id, self.filter_queryset(self.get_queryset())
                )
            )

        if service_filter := query_params.get("filter[service]") or query_params.get(
            "filter[service__in]"
        ):
            queryset = queryset.filter(service__in=service_filter.split(","))
        if region_filter := query_params.get("filter[region]") or query_params.get(
            "filter[region__in]"
        ):
            queryset = queryset.filter(region__in=region_filter.split(","))
        if resource_type_filter := query_params.get(
            "filter[resource_type]"
        ) or query_params.get("filter[resource_type__in]"):
            queryset = queryset.filter(
                resource_type__in=resource_type_filter.split(",")
            )

        services = list(
            queryset.values_list("service", flat=True).distinct().order_by("service")
        )
        regions = list(
            queryset.values_list("region", flat=True).distinct().order_by("region")
        )
        resource_types = list(
            queryset.values_list("resource_type", flat=True)
            .distinct()
            .order_by("resource_type")
        )

        result = {
            "services": services,
            "regions": regions,
            "resource_types": resource_types,
        }

        serializer = self.get_serializer(data=result)
        serializer.is_valid(raise_exception=True)
        return Response(serializer.data)
        return Response(serializer.data, status=status.HTTP_200_OK)

@extend_schema_view(
@@ -2577,8 +2238,8 @@ class OverviewViewSet(BaseRLSViewSet):

        def _get_filtered_queryset(model):
            if role.unlimited_visibility:
                return model.all_objects.filter(tenant_id=self.request.tenant_id)
            return model.all_objects.filter(
                return model.objects.filter(tenant_id=self.request.tenant_id)
            return model.objects.filter(
                tenant_id=self.request.tenant_id, scan__provider__in=providers
            )

@@ -2622,38 +2283,51 @@ class OverviewViewSet(BaseRLSViewSet):
        tenant_id = self.request.tenant_id

        latest_scan_ids = (
            Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
            Scan.objects.filter(
                tenant_id=tenant_id,
                state=StateChoices.COMPLETED,
            )
            .order_by("provider_id", "-inserted_at")
            .distinct("provider_id")
            .values_list("id", flat=True)
        )

        resource_count_queryset = (
            Resource.all_objects.filter(
                tenant_id=tenant_id,
                provider_id=OuterRef("scan__provider_id"),
            )
            .order_by()
            .values("provider_id")
            .annotate(cnt=Count("id"))
            .values("cnt")
        )

        overview_queryset = (
            ScanSummary.all_objects.filter(
                tenant_id=tenant_id, scan_id__in=latest_scan_ids
            )
            .values(provider=F("scan__provider__provider"))
        findings_aggregated = (
            ScanSummary.objects.filter(tenant_id=tenant_id, scan_id__in=latest_scan_ids)
            .values("scan__provider__provider")
            .annotate(
                findings_passed=Coalesce(Sum("_pass"), 0),
                findings_failed=Coalesce(Sum("fail"), 0),
                findings_muted=Coalesce(Sum("muted"), 0),
                total_findings=Coalesce(Sum("total"), 0),
                total_resources=Coalesce(Subquery(resource_count_queryset), 0),
            )
        )

        serializer = OverviewProviderSerializer(overview_queryset, many=True)
        resources_aggregated = (
            Resource.objects.filter(tenant_id=tenant_id)
            .values("provider__provider")
            .annotate(total_resources=Count("id"))
        )
        resources_dict = {
            row["provider__provider"]: row["total_resources"]
            for row in resources_aggregated
        }

        overview = []
        for row in findings_aggregated:
            provider_type = row["scan__provider__provider"]
            overview.append(
                {
                    "provider": provider_type,
                    "total_resources": resources_dict.get(provider_type, 0),
                    "total_findings": row["total_findings"],
                    "findings_passed": row["findings_passed"],
                    "findings_failed": row["findings_failed"],
                    "findings_muted": row["findings_muted"],
                }
            )

        serializer = OverviewProviderSerializer(overview, many=True)
        return Response(serializer.data, status=status.HTTP_200_OK)
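A minimal restatement of the per-provider rollup introduced above, using the models from this diff: Coalesce converts NULL sums from empty groups into 0, and a separate single-pass Count aggregation replaces the correlated subquery so each provider type yields one dictionary row:

    from django.db.models import Count, Sum
    from django.db.models.functions import Coalesce

    findings_rollup = (
        ScanSummary.objects.filter(tenant_id=tenant_id, scan_id__in=latest_scan_ids)
        .values("scan__provider__provider")
        .annotate(
            findings_passed=Coalesce(Sum("_pass"), 0),
            findings_failed=Coalesce(Sum("fail"), 0),
        )
    )
    resource_totals = {
        row["provider__provider"]: row["total"]
        for row in Resource.objects.filter(tenant_id=tenant_id)
        .values("provider__provider")
        .annotate(total=Count("id"))
    }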
@action(detail=False, methods=["get"], url_name="findings")
|
||||
@@ -2662,16 +2336,22 @@ class OverviewViewSet(BaseRLSViewSet):
|
||||
queryset = self.get_queryset()
|
||||
filtered_queryset = self.filter_queryset(queryset)
|
||||
|
||||
latest_scan_ids = (
|
||||
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
|
||||
.order_by("provider_id", "-inserted_at")
|
||||
.distinct("provider_id")
|
||||
.values_list("id", flat=True)
|
||||
latest_scan_subquery = (
|
||||
Scan.objects.filter(
|
||||
tenant_id=tenant_id,
|
||||
state=StateChoices.COMPLETED,
|
||||
provider_id=OuterRef("scan__provider_id"),
|
||||
)
|
||||
.order_by("-inserted_at")
|
||||
.values("id")[:1]
|
||||
)
|
||||
filtered_queryset = filtered_queryset.filter(
|
||||
tenant_id=tenant_id, scan_id__in=latest_scan_ids
|
||||
|
||||
annotated_queryset = filtered_queryset.annotate(
|
||||
latest_scan_id=Subquery(latest_scan_subquery)
|
||||
)
|
||||
|
||||
filtered_queryset = annotated_queryset.filter(scan_id=F("latest_scan_id"))
|
||||
|
||||
aggregated_totals = filtered_queryset.aggregate(
|
||||
_pass=Sum("_pass") or 0,
|
||||
fail=Sum("fail") or 0,
|
||||
@@ -2701,16 +2381,22 @@ class OverviewViewSet(BaseRLSViewSet):
|
||||
queryset = self.get_queryset()
|
||||
filtered_queryset = self.filter_queryset(queryset)
|
||||
|
||||
latest_scan_ids = (
|
||||
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
|
||||
.order_by("provider_id", "-inserted_at")
|
||||
.distinct("provider_id")
|
||||
.values_list("id", flat=True)
|
||||
latest_scan_subquery = (
|
||||
Scan.objects.filter(
|
||||
tenant_id=tenant_id,
|
||||
state=StateChoices.COMPLETED,
|
||||
provider_id=OuterRef("scan__provider_id"),
|
||||
)
|
||||
.order_by("-inserted_at")
|
||||
.values("id")[:1]
|
||||
)
|
||||
filtered_queryset = filtered_queryset.filter(
|
||||
tenant_id=tenant_id, scan_id__in=latest_scan_ids
|
||||
|
||||
annotated_queryset = filtered_queryset.annotate(
|
||||
latest_scan_id=Subquery(latest_scan_subquery)
|
||||
)
|
||||
|
||||
filtered_queryset = annotated_queryset.filter(scan_id=F("latest_scan_id"))
|
||||
|
||||
severity_counts = (
|
||||
filtered_queryset.values("severity")
|
||||
.annotate(count=Sum("total"))
|
||||
@@ -2731,16 +2417,22 @@ class OverviewViewSet(BaseRLSViewSet):
|
||||
queryset = self.get_queryset()
|
||||
filtered_queryset = self.filter_queryset(queryset)
|
||||
|
||||
latest_scan_ids = (
|
||||
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
|
||||
.order_by("provider_id", "-inserted_at")
|
||||
.distinct("provider_id")
|
||||
.values_list("id", flat=True)
|
||||
latest_scan_subquery = (
|
||||
Scan.objects.filter(
|
||||
tenant_id=tenant_id,
|
||||
state=StateChoices.COMPLETED,
|
||||
provider_id=OuterRef("scan__provider_id"),
|
||||
)
|
||||
.order_by("-inserted_at")
|
||||
.values("id")[:1]
|
||||
)
|
||||
filtered_queryset = filtered_queryset.filter(
|
||||
tenant_id=tenant_id, scan_id__in=latest_scan_ids
|
||||
|
||||
annotated_queryset = filtered_queryset.annotate(
|
||||
latest_scan_id=Subquery(latest_scan_subquery)
|
||||
)
|
||||
|
||||
filtered_queryset = annotated_queryset.filter(scan_id=F("latest_scan_id"))
|
||||
|
||||
services_data = (
|
||||
filtered_queryset.values("service")
|
||||
.annotate(_pass=Sum("_pass"))
|
||||
|
||||
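All three overview hunks above swap a DISTINCT ON id list for the same correlated-subquery idiom; a compact restatement using the models from this diff:

    from django.db.models import F, OuterRef, Subquery

    # Annotate every summary row with its provider's newest completed scan id,
    # then keep only rows belonging to that scan. values("id")[:1] limits the
    # correlated subquery to a single value per outer row.
    latest_scan_subquery = (
        Scan.objects.filter(
            state=StateChoices.COMPLETED,
            provider_id=OuterRef("scan__provider_id"),
        )
        .order_by("-inserted_at")
        .values("id")[:1]
    )
    rows = queryset.annotate(latest_scan_id=Subquery(latest_scan_subquery)).filter(
        scan_id=F("latest_scan_id")
    )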
@@ -111,7 +111,6 @@ SPECTACULAR_SETTINGS = {
    "PREPROCESSING_HOOKS": [
        "drf_spectacular_jsonapi.hooks.fix_nested_path_parameters",
    ],
    "TITLE": "API Reference - Prowler",
}

WSGI_APPLICATION = "config.wsgi.application"
@@ -238,8 +237,9 @@ DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY = env.str(
DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN = env.str("DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN", "")
DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION = env.str("DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION", "")

DJANGO_DELETION_BATCH_SIZE = env.int("DJANGO_DELETION_BATCH_SIZE", 5000)
# HTTP Security Headers
SECURE_CONTENT_TYPE_NOSNIFF = True
X_FRAME_OPTIONS = "DENY"
SECURE_REFERRER_POLICY = "strict-origin-when-cross-origin"

DJANGO_DELETION_BATCH_SIZE = env.int("DJANGO_DELETION_BATCH_SIZE", 5000)

@@ -39,9 +39,6 @@ IGNORED_EXCEPTIONS = [
    "RequestExpired",
    "ConnectionClosedError",
    "MaxRetryError",
    "AWSAccessKeyIDInvalidError",
    "AWSSessionTokenExpiredError",
    "EndpointConnectionError",  # AWS Service is not available in a region
    "Pool is closed",  # The following comes from urllib3: eu-west-1 -- HTTPClientError[126]: An HTTP Client raised an unhandled exception: AWSHTTPSConnectionPool(host='hostname.s3.eu-west-1.amazonaws.com', port=443): Pool is closed.
    # Authentication Errors from GCP
    "ClientAuthenticationError",
@@ -66,6 +63,8 @@ IGNORED_EXCEPTIONS = [
    "AzureClientIdAndClientSecretNotBelongingToTenantIdError",
    "AzureHTTPResponseError",
    "Error with credentials provided",
    # AWS Service is not available in a region
    "EndpointConnectionError",
]


@@ -97,6 +96,4 @@ sentry_sdk.init(
        # possible.
        "continuous_profiling_auto_start": True,
    },
    attach_stacktrace=True,
    ignore_errors=IGNORED_EXCEPTIONS,
)

@@ -10,7 +10,6 @@ from django.urls import reverse
from django_celery_results.models import TaskResult
from rest_framework import status
from rest_framework.test import APIClient
from tasks.jobs.backfill import backfill_resource_scan_summaries

from api.db_utils import rls_transaction
from api.models import (
@@ -921,55 +920,6 @@ def integrations_fixture(providers_fixture):
    return integration1, integration2


@pytest.fixture
def backfill_scan_metadata_fixture(scans_fixture, findings_fixture):
    for scan_instance in scans_fixture:
        tenant_id = scan_instance.tenant_id
        scan_id = scan_instance.id
        backfill_resource_scan_summaries(tenant_id=tenant_id, scan_id=scan_id)


@pytest.fixture(scope="function")
def latest_scan_finding(authenticated_client, providers_fixture, resources_fixture):
    provider = providers_fixture[0]
    tenant_id = str(providers_fixture[0].tenant_id)
    resource = resources_fixture[0]
    scan = Scan.objects.create(
        name="latest completed scan",
        provider=provider,
        trigger=Scan.TriggerChoices.MANUAL,
        state=StateChoices.COMPLETED,
        tenant_id=tenant_id,
    )
    finding = Finding.objects.create(
        tenant_id=tenant_id,
        uid="test_finding_uid_1",
        scan=scan,
        delta="new",
        status=Status.FAIL,
        status_extended="test status extended ",
        impact=Severity.critical,
        impact_extended="test impact extended one",
        severity=Severity.critical,
        raw_result={
            "status": Status.FAIL,
            "impact": Severity.critical,
            "severity": Severity.critical,
        },
        tags={"test": "dev-qa"},
        check_id="test_check_id",
        check_metadata={
            "CheckId": "test_check_id",
            "Description": "test description apple sauce",
        },
        first_seen_at="2024-01-02T00:00:00Z",
    )

    finding.add_resources([resource])
    backfill_resource_scan_summaries(tenant_id, str(scan.id))
    return finding


def get_authorization_header(access_token: str) -> dict:
    return {"Authorization": f"Bearer {access_token}"}

@@ -1,61 +0,0 @@
from api.db_utils import rls_transaction
from api.models import (
    Resource,
    ResourceFindingMapping,
    ResourceScanSummary,
    Scan,
    StateChoices,
)


def backfill_resource_scan_summaries(tenant_id: str, scan_id: str):
    with rls_transaction(tenant_id):
        if ResourceScanSummary.objects.filter(
            tenant_id=tenant_id, scan_id=scan_id
        ).exists():
            return {"status": "already backfilled"}

    with rls_transaction(tenant_id):
        if not Scan.objects.filter(
            tenant_id=tenant_id,
            id=scan_id,
            state__in=(StateChoices.COMPLETED, StateChoices.FAILED),
        ).exists():
            return {"status": "scan is not completed"}

        resource_ids_qs = (
            ResourceFindingMapping.objects.filter(
                tenant_id=tenant_id, finding__scan_id=scan_id
            )
            .values_list("resource_id", flat=True)
            .distinct()
        )

        resource_ids = list(resource_ids_qs)

        if not resource_ids:
            return {"status": "no resources to backfill"}

        resources_qs = Resource.objects.filter(
            tenant_id=tenant_id, id__in=resource_ids
        ).only("id", "service", "region", "type")

        summaries = []
        for resource in resources_qs.iterator():
            summaries.append(
                ResourceScanSummary(
                    tenant_id=tenant_id,
                    scan_id=scan_id,
                    resource_id=str(resource.id),
                    service=resource.service,
                    region=resource.region,
                    resource_type=resource.type,
                )
            )

        for i in range(0, len(summaries), 500):
            ResourceScanSummary.objects.bulk_create(
                summaries[i : i + 500], ignore_conflicts=True
            )

        return {"status": "backfilled", "inserted": len(summaries)}
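The insert loop at the end of the backfill generalizes to a small helper; a sketch (names are illustrative):

    def bulk_create_in_batches(model, objs, batch_size=500):
        # Slice the pending objects so no single INSERT grows unbounded;
        # ignore_conflicts skips rows a concurrent backfill already wrote.
        for i in range(0, len(objs), batch_size):
            model.objects.bulk_create(objs[i : i + batch_size], ignore_conflicts=True)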
@@ -13,41 +13,6 @@ from prowler.config.config import (
    json_ocsf_file_suffix,
    output_file_timestamp,
)
from prowler.lib.outputs.compliance.aws_well_architected.aws_well_architected import (
    AWSWellArchitected,
)
from prowler.lib.outputs.compliance.cis.cis_aws import AWSCIS
from prowler.lib.outputs.compliance.cis.cis_azure import AzureCIS
from prowler.lib.outputs.compliance.cis.cis_gcp import GCPCIS
from prowler.lib.outputs.compliance.cis.cis_kubernetes import KubernetesCIS
from prowler.lib.outputs.compliance.cis.cis_m365 import M365CIS
from prowler.lib.outputs.compliance.ens.ens_aws import AWSENS
from prowler.lib.outputs.compliance.ens.ens_azure import AzureENS
from prowler.lib.outputs.compliance.ens.ens_gcp import GCPENS
from prowler.lib.outputs.compliance.iso27001.iso27001_aws import AWSISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_azure import AzureISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_gcp import GCPISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_kubernetes import (
    KubernetesISO27001,
)
from prowler.lib.outputs.compliance.kisa_ismsp.kisa_ismsp_aws import AWSKISAISMSP
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_aws import AWSMitreAttack
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_azure import (
    AzureMitreAttack,
)
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_gcp import GCPMitreAttack
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_aws import (
    ProwlerThreatScoreAWS,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_azure import (
    ProwlerThreatScoreAzure,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_gcp import (
    ProwlerThreatScoreGCP,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_m365 import (
    ProwlerThreatScoreM365,
)
from prowler.lib.outputs.csv.csv import CSV
from prowler.lib.outputs.html.html import HTML
from prowler.lib.outputs.ocsf.ocsf import OCSF
@@ -55,44 +20,6 @@ from prowler.lib.outputs.ocsf.ocsf import OCSF
logger = get_task_logger(__name__)


COMPLIANCE_CLASS_MAP = {
    "aws": [
        (lambda name: name.startswith("cis_"), AWSCIS),
        (lambda name: name == "mitre_attack_aws", AWSMitreAttack),
        (lambda name: name.startswith("ens_"), AWSENS),
        (
            lambda name: name.startswith("aws_well_architected_framework"),
            AWSWellArchitected,
        ),
        (lambda name: name.startswith("iso27001_"), AWSISO27001),
        (lambda name: name.startswith("kisa"), AWSKISAISMSP),
        (lambda name: name == "prowler_threatscore_aws", ProwlerThreatScoreAWS),
    ],
    "azure": [
        (lambda name: name.startswith("cis_"), AzureCIS),
        (lambda name: name == "mitre_attack_azure", AzureMitreAttack),
        (lambda name: name.startswith("ens_"), AzureENS),
        (lambda name: name.startswith("iso27001_"), AzureISO27001),
        (lambda name: name == "prowler_threatscore_azure", ProwlerThreatScoreAzure),
    ],
    "gcp": [
        (lambda name: name.startswith("cis_"), GCPCIS),
        (lambda name: name == "mitre_attack_gcp", GCPMitreAttack),
        (lambda name: name.startswith("ens_"), GCPENS),
        (lambda name: name.startswith("iso27001_"), GCPISO27001),
        (lambda name: name == "prowler_threatscore_gcp", ProwlerThreatScoreGCP),
    ],
    "kubernetes": [
        (lambda name: name.startswith("cis_"), KubernetesCIS),
        (lambda name: name.startswith("iso27001_"), KubernetesISO27001),
    ],
    "m365": [
        (lambda name: name.startswith("cis_"), M365CIS),
        (lambda name: name == "prowler_threatscore_m365", ProwlerThreatScoreM365),
    ],
}


# Predefined mapping for output formats and their configurations
OUTPUT_FORMATS_MAPPING = {
    "csv": {
@@ -116,17 +43,13 @@ def _compress_output_files(output_directory: str) -> str:
        str: The full path to the newly created ZIP archive.
    """
    zip_path = f"{output_directory}.zip"
    parent_dir = os.path.dirname(output_directory)
    zip_path_abs = os.path.abspath(zip_path)

    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zipf:
        for foldername, _, filenames in os.walk(parent_dir):
            for filename in filenames:
                file_path = os.path.join(foldername, filename)
                if os.path.abspath(file_path) == zip_path_abs:
                    continue
                arcname = os.path.relpath(file_path, start=parent_dir)
                zipf.write(file_path, arcname)
        for suffix in [config["suffix"] for config in OUTPUT_FORMATS_MAPPING.values()]:
            zipf.write(
                f"{output_directory}{suffix}",
                f"output/{output_directory.split('/')[-1]}{suffix}",
            )

    return zip_path
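The new walk-based variant above generalizes to a self-contained helper; a runnable sketch:

    import os
    import zipfile

    def zip_directory(directory: str) -> str:
        # Walk the parent directory, skip the archive itself, and store paths
        # relative to the parent so the ZIP unpacks into the same layout.
        zip_path = f"{directory}.zip"
        parent = os.path.dirname(directory)
        with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zipf:
            for folder, _, filenames in os.walk(parent):
                for filename in filenames:
                    file_path = os.path.join(folder, filename)
                    if os.path.abspath(file_path) == os.path.abspath(zip_path):
                        continue
                    zipf.write(file_path, os.path.relpath(file_path, start=parent))
        return zip_path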
@@ -179,38 +102,25 @@ def _upload_to_s3(tenant_id: str, zip_path: str, scan_id: str) -> str:
    Raises:
        botocore.exceptions.ClientError: If the upload attempt to S3 fails for any reason.
    """
    bucket = base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET
    if not bucket:
        return None
    if not base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET:
        return

    try:
        s3 = get_s3_client()

        # Upload the ZIP file (outputs) to the S3 bucket
        zip_key = f"{tenant_id}/{scan_id}/{os.path.basename(zip_path)}"
        s3_key = f"{tenant_id}/{scan_id}/{os.path.basename(zip_path)}"
        s3.upload_file(
            Filename=zip_path,
            Bucket=bucket,
            Key=zip_key,
            Bucket=base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET,
            Key=s3_key,
        )

        # Upload the compliance directory to the S3 bucket
        compliance_dir = os.path.join(os.path.dirname(zip_path), "compliance")
        for filename in os.listdir(compliance_dir):
            local_path = os.path.join(compliance_dir, filename)
            if not os.path.isfile(local_path):
                continue
            file_key = f"{tenant_id}/{scan_id}/compliance/{filename}"
            s3.upload_file(Filename=local_path, Bucket=bucket, Key=file_key)

        return f"s3://{base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET}/{zip_key}"
        return f"s3://{base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET}/{s3_key}"
    except (ClientError, NoCredentialsError, ParamValidationError, ValueError) as e:
        logger.error(f"S3 upload failed: {str(e)}")


def _generate_output_directory(
    output_directory, prowler_provider: object, tenant_id: str, scan_id: str
) -> tuple[str, str]:
) -> str:
    """
    Generate a file system path for the output directory of a prowler scan.

@@ -235,8 +145,7 @@ def _generate_output_directory(

    Example:
        >>> _generate_output_directory("/tmp", "aws", "tenant-1234", "scan-5678")
        '/tmp/tenant-1234/aws/scan-5678/prowler-output-2023-02-15T12:34:56',
        '/tmp/tenant-1234/aws/scan-5678/compliance/prowler-output-2023-02-15T12:34:56'
        '/tmp/tenant-1234/aws/scan-5678/prowler-output-2023-02-15T12:34:56'
    """
    path = (
        f"{output_directory}/{tenant_id}/{scan_id}/prowler-output-"
@@ -244,10 +153,4 @@ def _generate_output_directory(
    )
    os.makedirs("/".join(path.split("/")[:-1]), exist_ok=True)

    compliance_path = (
        f"{output_directory}/{tenant_id}/{scan_id}/compliance/prowler-output-"
        f"{prowler_provider}-{output_file_timestamp}"
    )
    os.makedirs("/".join(compliance_path.split("/")[:-1]), exist_ok=True)

    return path, compliance_path
    return path
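For orientation, the object-key layout the upload path above produces (a purely illustrative summary; `<filename>` stands for each compliance CSV):

    import os

    def s3_keys_for_scan(tenant_id: str, scan_id: str, zip_path: str) -> dict:
        # ZIP archive at the scan root; per-framework CSVs under compliance/.
        return {
            "zip": f"{tenant_id}/{scan_id}/{os.path.basename(zip_path)}",
            "compliance": f"{tenant_id}/{scan_id}/compliance/<filename>.csv",
        }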
@@ -19,7 +19,6 @@ from api.models import (
    Finding,
    Provider,
    Resource,
    ResourceScanSummary,
    ResourceTag,
    Scan,
    ScanSummary,
@@ -122,7 +121,6 @@ def perform_prowler_scan(
    check_status_by_region = {}
    exception = None
    unique_resources = set()
    scan_resource_cache: set[tuple[str, str, str, str]] = set()
    start_time = time.time()

    with rls_transaction(tenant_id):
@@ -297,16 +295,6 @@ def perform_prowler_scan(
                        continue
                    region_dict[finding.check_id] = finding.status.value

                    # Update scan resource summaries
                    scan_resource_cache.add(
                        (
                            str(resource_instance.id),
                            resource_instance.service,
                            resource_instance.region,
                            resource_instance.type,
                        )
                    )

            # Update scan progress
            with rls_transaction(tenant_id):
                scan_instance.progress = progress
@@ -326,90 +314,66 @@ def perform_prowler_scan(
        scan_instance.unique_resource_count = len(unique_resources)
        scan_instance.save()

    if exception is None:
        try:
            regions = prowler_provider.get_regions()
        except AttributeError:
            regions = set()

        compliance_template = PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE[
            provider_instance.provider
        ]
        compliance_overview_by_region = {
            region: deepcopy(compliance_template) for region in regions
        }

        for region, check_status in check_status_by_region.items():
            compliance_data = compliance_overview_by_region.setdefault(
                region, deepcopy(compliance_template)
            )
            for check_name, status in check_status.items():
                generate_scan_compliance(
                    compliance_data,
                    provider_instance.provider,
                    check_name,
                    status,
                )

        # Prepare compliance overview objects
        compliance_overview_objects = []
        for region, compliance_data in compliance_overview_by_region.items():
            for compliance_id, compliance in compliance_data.items():
                compliance_overview_objects.append(
                    ComplianceOverview(
                        tenant_id=tenant_id,
                        scan=scan_instance,
                        region=region,
                        compliance_id=compliance_id,
                        framework=compliance["framework"],
                        version=compliance["version"],
                        description=compliance["description"],
                        requirements=compliance["requirements"],
                        requirements_passed=compliance["requirements_status"]["passed"],
                        requirements_failed=compliance["requirements_status"]["failed"],
                        requirements_manual=compliance["requirements_status"]["manual"],
                        total_requirements=compliance["total_requirements"],
                    )
                )
        try:
            with rls_transaction(tenant_id):
                ComplianceOverview.objects.bulk_create(
                    compliance_overview_objects, batch_size=100
                )
        except Exception as overview_exception:
            import sentry_sdk

            sentry_sdk.capture_exception(overview_exception)
            logger.error(
                f"Error storing compliance overview for scan {scan_id}: {overview_exception}"
            )
    if exception is not None:
        raise exception

    try:
        regions = prowler_provider.get_regions()
    except AttributeError:
        regions = set()

    compliance_template = PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE[
        provider_instance.provider
    ]
    compliance_overview_by_region = {
        region: deepcopy(compliance_template) for region in regions
    }

    for region, check_status in check_status_by_region.items():
        compliance_data = compliance_overview_by_region.setdefault(
            region, deepcopy(compliance_template)
        )
        for check_name, status in check_status.items():
            generate_scan_compliance(
                compliance_data,
                provider_instance.provider,
                check_name,
                status,
            )

    # Prepare compliance overview objects
    compliance_overview_objects = []
    for region, compliance_data in compliance_overview_by_region.items():
        for compliance_id, compliance in compliance_data.items():
            compliance_overview_objects.append(
                ComplianceOverview(
                    tenant_id=tenant_id,
                    scan=scan_instance,
                    region=region,
                    compliance_id=compliance_id,
                    framework=compliance["framework"],
                    version=compliance["version"],
                    description=compliance["description"],
                    requirements=compliance["requirements"],
                    requirements_passed=compliance["requirements_status"]["passed"],
                    requirements_failed=compliance["requirements_status"]["failed"],
                    requirements_manual=compliance["requirements_status"]["manual"],
                    total_requirements=compliance["total_requirements"],
                )
            )
    try:
        with rls_transaction(tenant_id):
            ComplianceOverview.objects.bulk_create(
                compliance_overview_objects, batch_size=500
            )
    except Exception as overview_exception:
        import sentry_sdk

        sentry_sdk.capture_exception(overview_exception)
        logger.error(
            f"Error storing compliance overview for scan {scan_id}: {overview_exception}"
        )

    try:
        resource_scan_summaries = [
            ResourceScanSummary(
                tenant_id=tenant_id,
                scan_id=scan_id,
                resource_id=resource_id,
                service=service,
                region=region,
                resource_type=resource_type,
            )
            for resource_id, service, region, resource_type in scan_resource_cache
        ]
        with rls_transaction(tenant_id):
            ResourceScanSummary.objects.bulk_create(
                resource_scan_summaries, batch_size=500, ignore_conflicts=True
            )
    except Exception as filter_exception:
        import sentry_sdk

        sentry_sdk.capture_exception(filter_exception)
        logger.error(
            f"Error storing filter values for scan {scan_id}: {filter_exception}"
        )

    serializer = ScanTaskSerializer(instance=scan_instance)
    return serializer.data

|
||||
from config.celery import RLSTask
|
||||
from config.django.base import DJANGO_FINDINGS_BATCH_SIZE, DJANGO_TMP_OUTPUT_DIRECTORY
|
||||
from django_celery_beat.models import PeriodicTask
|
||||
from tasks.jobs.backfill import backfill_resource_scan_summaries
|
||||
from tasks.jobs.connection import check_provider_connection
|
||||
from tasks.jobs.deletion import delete_provider, delete_tenant
|
||||
from tasks.jobs.export import (
|
||||
COMPLIANCE_CLASS_MAP,
|
||||
OUTPUT_FORMATS_MAPPING,
|
||||
_compress_output_files,
|
||||
_generate_output_directory,
|
||||
@@ -20,14 +18,11 @@ from tasks.jobs.export import (
|
||||
from tasks.jobs.scan import aggregate_findings, perform_prowler_scan
|
||||
from tasks.utils import batched, get_next_execution_datetime
|
||||
|
||||
from api.compliance import get_compliance_frameworks
|
||||
from api.db_utils import rls_transaction
|
||||
from api.decorators import set_tenant
|
||||
from api.models import Finding, Provider, Scan, ScanSummary, StateChoices
|
||||
from api.utils import initialize_prowler_provider
|
||||
from api.v1.serializers import ScanTaskSerializer
|
||||
from prowler.lib.check.compliance_models import Compliance
|
||||
from prowler.lib.outputs.compliance.generic.generic import GenericCompliance
|
||||
from prowler.lib.outputs.finding import Finding as FindingOutput
|
||||
|
||||
logger = get_task_logger(__name__)
|
||||
@@ -256,118 +251,84 @@ def generate_outputs(scan_id: str, provider_id: str, tenant_id: str):
|
||||
logger.info(f"No findings found for scan {scan_id}")
|
||||
return {"upload": False}
|
||||
|
||||
provider_obj = Provider.objects.get(id=provider_id)
|
||||
prowler_provider = initialize_prowler_provider(provider_obj)
|
||||
provider_uid = provider_obj.uid
|
||||
provider_type = provider_obj.provider
|
||||
# Initialize the prowler provider
|
||||
prowler_provider = initialize_prowler_provider(Provider.objects.get(id=provider_id))
|
||||
|
||||
frameworks_bulk = Compliance.get_bulk(provider_type)
|
||||
frameworks_avail = get_compliance_frameworks(provider_type)
|
||||
out_dir, comp_dir = _generate_output_directory(
|
||||
# Get the provider UID
|
||||
provider_uid = Provider.objects.get(id=provider_id).uid
|
||||
|
||||
# Generate and ensure the output directory exists
|
||||
output_directory = _generate_output_directory(
|
||||
DJANGO_TMP_OUTPUT_DIRECTORY, provider_uid, tenant_id, scan_id
|
||||
)
|
||||
|
||||
def get_writer(writer_map, name, factory, is_last):
|
||||
"""
|
||||
Return existing writer_map[name] or create via factory().
|
||||
In both cases set `.close_file = is_last`.
|
||||
"""
|
||||
initialization = False
|
||||
if name not in writer_map:
|
||||
writer_map[name] = factory()
|
||||
initialization = True
|
||||
w = writer_map[name]
|
||||
w.close_file = is_last
|
||||
|
||||
return w, initialization
|
||||
|
||||
# Define auxiliary variables
|
||||
output_writers = {}
|
||||
compliance_writers = {}
|
||||
|
||||
scan_summary = FindingOutput._transform_findings_stats(
|
||||
ScanSummary.objects.filter(scan_id=scan_id)
|
||||
)
|
||||
|
||||
qs = Finding.all_objects.filter(scan_id=scan_id).order_by("uid").iterator()
|
||||
for batch, is_last in batched(qs, DJANGO_FINDINGS_BATCH_SIZE):
|
||||
fos = [FindingOutput.transform_api_finding(f, prowler_provider) for f in batch]
|
||||
# Retrieve findings queryset
|
||||
findings_qs = Finding.all_objects.filter(scan_id=scan_id).order_by("uid")
|
||||
|
||||
# Outputs
|
||||
for mode, cfg in OUTPUT_FORMATS_MAPPING.items():
|
||||
cls = cfg["class"]
|
||||
suffix = cfg["suffix"]
|
||||
extra = cfg.get("kwargs", {}).copy()
|
||||
# Process findings in batches
|
||||
for batch, is_last_batch in batched(
|
||||
findings_qs.iterator(), DJANGO_FINDINGS_BATCH_SIZE
|
||||
):
|
||||
finding_outputs = [
|
||||
FindingOutput.transform_api_finding(finding, prowler_provider)
|
||||
for finding in batch
|
||||
]
|
||||
|
||||
# Generate output files
|
||||
for mode, config in OUTPUT_FORMATS_MAPPING.items():
|
||||
kwargs = dict(config.get("kwargs", {}))
|
||||
if mode == "html":
|
||||
extra.update(provider=prowler_provider, stats=scan_summary)
|
||||
kwargs["provider"] = prowler_provider
|
||||
kwargs["stats"] = scan_summary
|
||||
|
||||
writer, initialization = get_writer(
|
||||
output_writers,
|
||||
cls,
|
||||
lambda cls=cls, fos=fos, suffix=suffix: cls(
|
||||
findings=fos,
|
||||
file_path=out_dir,
|
||||
file_extension=suffix,
|
||||
writer_class = config["class"]
|
||||
if writer_class in output_writers:
|
||||
writer = output_writers[writer_class]
|
||||
writer.transform(finding_outputs)
|
||||
writer.close_file = is_last_batch
|
||||
else:
|
||||
writer = writer_class(
|
||||
findings=finding_outputs,
|
||||
file_path=output_directory,
|
||||
file_extension=config["suffix"],
|
||||
from_cli=False,
|
||||
),
|
||||
is_last,
|
||||
)
|
||||
if not initialization:
|
||||
writer.transform(fos)
|
||||
writer.batch_write_data_to_file(**extra)
|
||||
writer._data.clear()
|
||||
)
|
||||
writer.close_file = is_last_batch
|
||||
output_writers[writer_class] = writer
|
||||
|
||||
# Compliance CSVs
|
||||
for name in frameworks_avail:
|
||||
compliance_obj = frameworks_bulk[name]
|
||||
# Write the current batch using the writer
|
||||
writer.batch_write_data_to_file(**kwargs)
|
||||
|
||||
klass = GenericCompliance
|
||||
for condition, cls in COMPLIANCE_CLASS_MAP.get(provider_type, []):
|
||||
if condition(name):
|
||||
klass = cls
|
||||
break
|
||||
# TODO: Refactor the output classes to avoid this manual reset
|
||||
writer._data = []
|
||||
|
||||
filename = f"{comp_dir}_{name}.csv"
|
||||
# Compress output files
|
||||
output_directory = _compress_output_files(output_directory)
|
||||
|
||||
writer, initialization = get_writer(
|
||||
compliance_writers,
|
||||
name,
|
||||
lambda klass=klass, fos=fos: klass(
|
||||
findings=fos,
|
||||
compliance=compliance_obj,
|
||||
file_path=filename,
|
||||
from_cli=False,
|
||||
),
|
||||
is_last,
|
||||
)
|
||||
if not initialization:
|
||||
writer.transform(fos, compliance_obj, name)
|
||||
writer.batch_write_data_to_file()
|
||||
writer._data.clear()
|
||||
# Save to configured storage
|
||||
uploaded = _upload_to_s3(tenant_id, output_directory, scan_id)
|
||||
|
||||
compressed = _compress_output_files(out_dir)
|
||||
upload_uri = _upload_to_s3(tenant_id, compressed, scan_id)
|
||||
|
||||
if upload_uri:
|
||||
if uploaded:
|
||||
# Remove the local files after upload
|
||||
try:
|
||||
rmtree(Path(compressed).parent, ignore_errors=True)
|
||||
except Exception as e:
|
||||
rmtree(Path(output_directory).parent, ignore_errors=True)
|
||||
except FileNotFoundError as e:
|
||||
logger.error(f"Error deleting output files: {e}")
|
||||
final_location, did_upload = upload_uri, True
|
||||
|
||||
output_directory = uploaded
|
||||
uploaded = True
|
||||
else:
|
||||
final_location, did_upload = compressed, False
|
||||
uploaded = False
|
||||
|
||||
Scan.all_objects.filter(id=scan_id).update(output_location=final_location)
|
||||
logger.info(f"Scan outputs at {final_location}")
|
||||
return {"upload": did_upload}
|
||||
# Update the scan instance with the output path
|
||||
Scan.all_objects.filter(id=scan_id).update(output_location=output_directory)
|
||||
|
||||
logger.info(f"Scan output files generated, output location: {output_directory}")
|
||||
|
||||
@shared_task(name="backfill-scan-resource-summaries", queue="backfill")
|
||||
def backfill_scan_resource_summaries_task(tenant_id: str, scan_id: str):
|
||||
"""
|
||||
Tries to backfill the resource scan summaries table for a given scan.
|
||||
|
||||
Args:
|
||||
tenant_id (str): The tenant identifier.
|
||||
scan_id (str): The scan identifier.
|
||||
"""
|
||||
return backfill_resource_scan_summaries(tenant_id=tenant_id, scan_id=scan_id)
|
||||
return {"upload": uploaded}
|
||||
|
||||
@@ -1,79 +0,0 @@
|
||||
from uuid import uuid4
|
||||
|
||||
import pytest
|
||||
from tasks.jobs.backfill import backfill_resource_scan_summaries
|
||||
|
||||
from api.models import ResourceScanSummary, Scan, StateChoices
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
class TestBackfillResourceScanSummaries:
|
||||
@pytest.fixture(scope="function")
|
||||
def resource_scan_summary_data(self, scans_fixture):
|
||||
scan = scans_fixture[0]
|
||||
return ResourceScanSummary.objects.create(
|
||||
tenant_id=scan.tenant_id,
|
||||
scan_id=scan.id,
|
||||
resource_id=str(uuid4()),
|
||||
service="aws",
|
||||
region="us-east-1",
|
||||
resource_type="instance",
|
||||
)
|
||||
|
||||
@pytest.fixture(scope="function")
|
||||
def get_not_completed_scans(self, providers_fixture):
|
||||
provider_id = providers_fixture[0].id
|
||||
tenant_id = providers_fixture[0].tenant_id
|
||||
scan_1 = Scan.objects.create(
|
||||
tenant_id=tenant_id,
|
||||
trigger=Scan.TriggerChoices.MANUAL,
|
||||
state=StateChoices.EXECUTING,
|
||||
provider_id=provider_id,
|
||||
)
|
||||
scan_2 = Scan.objects.create(
|
||||
tenant_id=tenant_id,
|
||||
trigger=Scan.TriggerChoices.MANUAL,
|
||||
state=StateChoices.AVAILABLE,
|
||||
provider_id=provider_id,
|
||||
)
|
||||
return scan_1, scan_2
|
||||
|
||||
def test_already_backfilled(self, resource_scan_summary_data):
|
||||
tenant_id = resource_scan_summary_data.tenant_id
|
||||
scan_id = resource_scan_summary_data.scan_id
|
||||
|
||||
result = backfill_resource_scan_summaries(tenant_id, scan_id)
|
||||
|
||||
assert result == {"status": "already backfilled"}
|
||||
|
||||
def test_not_completed_scan(self, get_not_completed_scans):
|
||||
for scan_instance in get_not_completed_scans:
|
||||
tenant_id = scan_instance.tenant_id
|
||||
scan_id = scan_instance.id
|
||||
result = backfill_resource_scan_summaries(tenant_id, scan_id)
|
||||
|
||||
assert result == {"status": "scan is not completed"}
|
||||
|
||||
def test_successful_backfill_inserts_one_summary(
|
||||
self, resources_fixture, findings_fixture
|
||||
):
|
||||
tenant_id = findings_fixture[0].tenant_id
|
||||
scan_id = findings_fixture[0].scan_id
|
||||
|
||||
# This scan affects the first two resources
|
||||
resources = resources_fixture[:2]
|
||||
|
||||
result = backfill_resource_scan_summaries(tenant_id, scan_id)
|
||||
assert result == {"status": "backfilled", "inserted": len(resources)}
|
||||
|
||||
# Verify correct values
|
||||
summaries = ResourceScanSummary.objects.filter(
|
||||
tenant_id=tenant_id, scan_id=scan_id
|
||||
)
|
||||
assert summaries.count() == len(resources)
|
||||
for resource in resources:
|
||||
summary = summaries.get(resource_id=resource.id)
|
||||
assert summary.resource_id == resource.id
|
||||
assert summary.service == resource.service
|
||||
assert summary.region == resource.region
|
||||
assert summary.resource_type == resource.type
|
||||
@@ -1,147 +0,0 @@
import os
import zipfile
from pathlib import Path
from unittest.mock import MagicMock, patch

import pytest
from botocore.exceptions import ClientError
from tasks.jobs.export import (
    _compress_output_files,
    _generate_output_directory,
    _upload_to_s3,
    get_s3_client,
)


@pytest.mark.django_db
class TestOutputs:
    def test_compress_output_files_creates_zip(self, tmpdir):
        base_tmp = Path(str(tmpdir.mkdir("compress_output")))
        output_dir = base_tmp / "output"
        output_dir.mkdir()
        file_path = output_dir / "result.csv"
        file_path.write_text("data")

        zip_path = _compress_output_files(str(output_dir))

        assert zip_path.endswith(".zip")
        assert os.path.exists(zip_path)
        with zipfile.ZipFile(zip_path, "r") as zipf:
            assert "output/result.csv" in zipf.namelist()

    @patch("tasks.jobs.export.boto3.client")
    @patch("tasks.jobs.export.settings")
    def test_get_s3_client_success(self, mock_settings, mock_boto_client):
        mock_settings.DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID = "test"
        mock_settings.DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY = "test"
        mock_settings.DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN = "token"
        mock_settings.DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION = "eu-west-1"

        client_mock = MagicMock()
        mock_boto_client.return_value = client_mock

        client = get_s3_client()
        assert client is not None
        client_mock.list_buckets.assert_called()

    @patch("tasks.jobs.export.boto3.client")
    @patch("tasks.jobs.export.settings")
    def test_get_s3_client_fallback(self, mock_settings, mock_boto_client):
        mock_boto_client.side_effect = [
            ClientError({"Error": {"Code": "403"}}, "ListBuckets"),
            MagicMock(),
        ]
        client = get_s3_client()
        assert client is not None

    @patch("tasks.jobs.export.get_s3_client")
    @patch("tasks.jobs.export.base")
    def test_upload_to_s3_success(self, mock_base, mock_get_client, tmpdir):
        mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "test-bucket"

        base_tmp = Path(str(tmpdir.mkdir("upload_success")))
        zip_path = base_tmp / "outputs.zip"
        zip_path.write_bytes(b"dummy")

        compliance_dir = base_tmp / "compliance"
        compliance_dir.mkdir()
        (compliance_dir / "report.csv").write_text("ok")

        client_mock = MagicMock()
        mock_get_client.return_value = client_mock

        result = _upload_to_s3("tenant-id", str(zip_path), "scan-id")

        expected_uri = "s3://test-bucket/tenant-id/scan-id/outputs.zip"
        assert result == expected_uri
        assert client_mock.upload_file.call_count == 2

    @patch("tasks.jobs.export.get_s3_client")
    @patch("tasks.jobs.export.base")
    def test_upload_to_s3_missing_bucket(self, mock_base, mock_get_client):
        mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = ""
        result = _upload_to_s3("tenant", "/tmp/fake.zip", "scan")
        assert result is None

    @patch("tasks.jobs.export.get_s3_client")
    @patch("tasks.jobs.export.base")
    def test_upload_to_s3_skips_non_files(self, mock_base, mock_get_client, tmpdir):
        mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "test-bucket"
        base_tmp = Path(str(tmpdir.mkdir("upload_skips_non_files")))

        zip_path = base_tmp / "results.zip"
        zip_path.write_bytes(b"zip")

        compliance_dir = base_tmp / "compliance"
        compliance_dir.mkdir()
        (compliance_dir / "subdir").mkdir()

        client_mock = MagicMock()
        mock_get_client.return_value = client_mock

        result = _upload_to_s3("tenant", str(zip_path), "scan")

        expected_uri = "s3://test-bucket/tenant/scan/results.zip"
        assert result == expected_uri
        client_mock.upload_file.assert_called_once()

    @patch(
        "tasks.jobs.export.get_s3_client",
        side_effect=ClientError({"Error": {}}, "Upload"),
    )
    @patch("tasks.jobs.export.base")
    @patch("tasks.jobs.export.logger.error")
    def test_upload_to_s3_failure_logs_error(
        self, mock_logger, mock_base, mock_get_client, tmpdir
    ):
        mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "bucket"

        base_tmp = Path(str(tmpdir.mkdir("upload_failure_logs")))
        zip_path = base_tmp / "zipfile.zip"
        zip_path.write_bytes(b"zip")

        compliance_dir = base_tmp / "compliance"
        compliance_dir.mkdir()
        (compliance_dir / "report.csv").write_text("csv")

        _upload_to_s3("tenant", str(zip_path), "scan")
        mock_logger.assert_called()

    def test_generate_output_directory_creates_paths(self, tmpdir):
        from prowler.config.config import output_file_timestamp

        base_tmp = Path(str(tmpdir.mkdir("generate_output")))
        base_dir = str(base_tmp)
        tenant_id = "t1"
        scan_id = "s1"
        provider = "aws"

        path, compliance = _generate_output_directory(
            base_dir, provider, tenant_id, scan_id
        )

        assert os.path.isdir(os.path.dirname(path))
        assert os.path.isdir(os.path.dirname(compliance))

        assert path.endswith(f"{provider}-{output_file_timestamp}")
        assert compliance.endswith(f"{provider}-{output_file_timestamp}")
@@ -1,415 +0,0 @@
import uuid
from pathlib import Path
from unittest.mock import MagicMock, patch

import pytest
from tasks.tasks import generate_outputs


@pytest.mark.django_db
class TestGenerateOutputs:
    def setup_method(self):
        self.scan_id = str(uuid.uuid4())
        self.provider_id = str(uuid.uuid4())
        self.tenant_id = str(uuid.uuid4())

    def test_no_findings_returns_early(self):
        with patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter:
            mock_filter.return_value.exists.return_value = False

            result = generate_outputs(
                scan_id=self.scan_id,
                provider_id=self.provider_id,
                tenant_id=self.tenant_id,
            )

            assert result == {"upload": False}
            mock_filter.assert_called_once_with(scan_id=self.scan_id)

    @patch("tasks.tasks.rmtree")
    @patch("tasks.tasks._upload_to_s3")
    @patch("tasks.tasks._compress_output_files")
    @patch("tasks.tasks.get_compliance_frameworks")
    @patch("tasks.tasks.Compliance.get_bulk")
    @patch("tasks.tasks.initialize_prowler_provider")
    @patch("tasks.tasks.Provider.objects.get")
    @patch("tasks.tasks.ScanSummary.objects.filter")
    @patch("tasks.tasks.Finding.all_objects.filter")
    def test_generate_outputs_happy_path(
        self,
        mock_finding_filter,
        mock_scan_summary_filter,
        mock_provider_get,
        mock_initialize_provider,
        mock_compliance_get_bulk,
        mock_get_available_frameworks,
        mock_compress,
        mock_upload,
        mock_rmtree,
    ):
        mock_scan_summary_filter.return_value.exists.return_value = True

        mock_provider = MagicMock()
        mock_provider.uid = "provider-uid"
        mock_provider.provider = "aws"
        mock_provider_get.return_value = mock_provider

        prowler_provider = MagicMock()
        mock_initialize_provider.return_value = prowler_provider

        mock_compliance_get_bulk.return_value = {"cis": MagicMock()}
        mock_get_available_frameworks.return_value = ["cis"]

        dummy_finding = MagicMock(uid="f1")
        mock_finding_filter.return_value.order_by.return_value.iterator.return_value = [
            [dummy_finding],
            True,
        ]

        mock_transformed_stats = {"some": "stats"}
        with (
            patch(
                "tasks.tasks.FindingOutput._transform_findings_stats",
                return_value=mock_transformed_stats,
            ),
            patch(
                "tasks.tasks.FindingOutput.transform_api_finding",
                return_value={"transformed": "f1"},
            ),
            patch(
                "tasks.tasks.OUTPUT_FORMATS_MAPPING",
                {
                    "json": {
                        "class": MagicMock(name="JSONWriter"),
                        "suffix": ".json",
                        "kwargs": {},
                    }
                },
            ),
            patch(
                "tasks.tasks.COMPLIANCE_CLASS_MAP",
                {"aws": [(lambda x: True, MagicMock(name="CSVCompliance"))]},
            ),
            patch(
                "tasks.tasks._generate_output_directory",
                return_value=("out-dir", "comp-dir"),
            ),
            patch("tasks.tasks.Scan.all_objects.filter") as mock_scan_update,
        ):
            mock_compress.return_value = "/tmp/zipped.zip"
            mock_upload.return_value = "s3://bucket/zipped.zip"

            result = generate_outputs(
                scan_id=self.scan_id,
                provider_id=self.provider_id,
                tenant_id=self.tenant_id,
            )

            assert result == {"upload": True}
            mock_scan_update.return_value.update.assert_called_once_with(
                output_location="s3://bucket/zipped.zip"
            )
            mock_rmtree.assert_called_once_with(
                Path("/tmp/zipped.zip").parent, ignore_errors=True
            )

    def test_generate_outputs_fails_upload(self):
        with (
            patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
            patch("tasks.tasks.Provider.objects.get"),
            patch("tasks.tasks.initialize_prowler_provider"),
            patch("tasks.tasks.Compliance.get_bulk"),
            patch("tasks.tasks.get_compliance_frameworks"),
            patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
            patch(
                "tasks.tasks._generate_output_directory", return_value=("out", "comp")
            ),
            patch("tasks.tasks.FindingOutput._transform_findings_stats"),
            patch("tasks.tasks.FindingOutput.transform_api_finding"),
            patch(
                "tasks.tasks.OUTPUT_FORMATS_MAPPING",
                {
                    "json": {
                        "class": MagicMock(name="Writer"),
                        "suffix": ".json",
                        "kwargs": {},
                    }
                },
            ),
            patch(
                "tasks.tasks.COMPLIANCE_CLASS_MAP",
                {"aws": [(lambda x: True, MagicMock())]},
            ),
            patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
            patch("tasks.tasks._upload_to_s3", return_value=None),
            patch("tasks.tasks.Scan.all_objects.filter") as mock_scan_update,
        ):
            mock_filter.return_value.exists.return_value = True
            mock_findings.return_value.order_by.return_value.iterator.return_value = [
                [MagicMock()],
                True,
            ]

            result = generate_outputs(
                scan_id="scan",
                provider_id="provider",
                tenant_id=self.tenant_id,
            )

            assert result == {"upload": False}
            mock_scan_update.return_value.update.assert_called_once()

    def test_generate_outputs_triggers_html_extra_update(self):
        mock_finding_output = MagicMock()
        mock_finding_output.compliance = {"cis": ["requirement-1", "requirement-2"]}

        with (
            patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
            patch("tasks.tasks.Provider.objects.get"),
            patch("tasks.tasks.initialize_prowler_provider"),
            patch("tasks.tasks.Compliance.get_bulk", return_value={"cis": MagicMock()}),
            patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
            patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
            patch(
                "tasks.tasks._generate_output_directory", return_value=("out", "comp")
            ),
            patch(
                "tasks.tasks.FindingOutput._transform_findings_stats",
                return_value={"some": "stats"},
            ),
            patch(
                "tasks.tasks.FindingOutput.transform_api_finding",
                return_value=mock_finding_output,
            ),
            patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
            patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/f.zip"),
            patch("tasks.tasks.Scan.all_objects.filter"),
        ):
            mock_filter.return_value.exists.return_value = True
            mock_findings.return_value.order_by.return_value.iterator.return_value = [
                [MagicMock()],
                True,
            ]

            html_writer_mock = MagicMock()
            with (
                patch(
                    "tasks.tasks.OUTPUT_FORMATS_MAPPING",
                    {
                        "html": {
                            "class": lambda *args, **kwargs: html_writer_mock,
                            "suffix": ".html",
                            "kwargs": {},
                        }
                    },
                ),
                patch(
                    "tasks.tasks.COMPLIANCE_CLASS_MAP",
                    {"aws": [(lambda x: True, MagicMock())]},
                ),
            ):
                generate_outputs(
                    scan_id=self.scan_id,
                    provider_id=self.provider_id,
                    tenant_id=self.tenant_id,
                )
                html_writer_mock.batch_write_data_to_file.assert_called_once()

    def test_transform_called_only_on_second_batch(self):
        raw1 = MagicMock()
        raw2 = MagicMock()

        tf1 = MagicMock()
        tf1.compliance = {}
        tf2 = MagicMock()
        tf2.compliance = {}

        writer_instances = []

        class TrackingWriter:
            def __init__(self, findings, file_path, file_extension, from_cli):
                self.transform_called = 0
                self.batch_write_data_to_file = MagicMock()
                self._data = []
                self.close_file = False
                writer_instances.append(self)

            def transform(self, fos):
                self.transform_called += 1

        with (
            patch("tasks.tasks.ScanSummary.objects.filter") as mock_summary,
            patch("tasks.tasks.Provider.objects.get"),
            patch("tasks.tasks.initialize_prowler_provider"),
            patch("tasks.tasks.Compliance.get_bulk"),
            patch("tasks.tasks.get_compliance_frameworks", return_value=[]),
            patch("tasks.tasks.FindingOutput._transform_findings_stats"),
            patch(
                "tasks.tasks.FindingOutput.transform_api_finding",
                side_effect=[tf1, tf2],
            ),
            patch(
                "tasks.tasks._generate_output_directory",
                return_value=("outdir", "compdir"),
            ),
            patch("tasks.tasks._compress_output_files", return_value="outdir.zip"),
            patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/outdir.zip"),
            patch("tasks.tasks.rmtree"),
            patch("tasks.tasks.Scan.all_objects.filter"),
            patch(
                "tasks.tasks.batched",
                return_value=[
                    ([raw1], False),
                    ([raw2], True),
                ],
            ),
        ):
            mock_summary.return_value.exists.return_value = True

            with patch(
                "tasks.tasks.OUTPUT_FORMATS_MAPPING",
                {
                    "json": {
                        "class": TrackingWriter,
                        "suffix": ".json",
                        "kwargs": {},
                    }
                },
            ):
                result = generate_outputs(
                    scan_id=self.scan_id,
                    provider_id=self.provider_id,
                    tenant_id=self.tenant_id,
                )

        assert result == {"upload": True}
        assert len(writer_instances) == 1
        writer = writer_instances[0]
        assert writer.transform_called == 1

    def test_compliance_transform_called_on_second_batch(self):
        raw1 = MagicMock()
        raw2 = MagicMock()
        compliance_obj = MagicMock()
        writer_instances = []

        class TrackingComplianceWriter:
            def __init__(self, *args, **kwargs):
                self.transform_calls = []
                self._data = []
                writer_instances.append(self)

            def transform(self, fos, comp_obj, name):
                self.transform_calls.append((fos, comp_obj, name))

            def batch_write_data_to_file(self):
                pass

        two_batches = [
            ([raw1], False),
            ([raw2], True),
        ]

        with (
            patch("tasks.tasks.ScanSummary.objects.filter") as mock_summary,
            patch(
                "tasks.tasks.Provider.objects.get",
                return_value=MagicMock(uid="UID", provider="aws"),
            ),
            patch("tasks.tasks.initialize_prowler_provider"),
            patch(
                "tasks.tasks.Compliance.get_bulk", return_value={"cis": compliance_obj}
            ),
            patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
            patch(
                "tasks.tasks._generate_output_directory",
                return_value=("outdir", "compdir"),
            ),
            patch("tasks.tasks.FindingOutput._transform_findings_stats"),
            patch(
                "tasks.tasks.FindingOutput.transform_api_finding",
                side_effect=lambda f, prov: f,
            ),
            patch("tasks.tasks._compress_output_files", return_value="outdir.zip"),
            patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/outdir.zip"),
            patch("tasks.tasks.rmtree"),
            patch(
                "tasks.tasks.Scan.all_objects.filter",
                return_value=MagicMock(update=lambda **kw: None),
            ),
            patch("tasks.tasks.batched", return_value=two_batches),
            patch("tasks.tasks.OUTPUT_FORMATS_MAPPING", {}),
            patch(
                "tasks.tasks.COMPLIANCE_CLASS_MAP",
                {"aws": [(lambda name: True, TrackingComplianceWriter)]},
            ),
        ):
            mock_summary.return_value.exists.return_value = True

            result = generate_outputs(
                scan_id=self.scan_id,
                provider_id=self.provider_id,
                tenant_id=self.tenant_id,
            )

        assert len(writer_instances) == 1
        writer = writer_instances[0]
        assert writer.transform_calls == [([raw2], compliance_obj, "cis")]
        assert result == {"upload": True}

    def test_generate_outputs_logs_rmtree_exception(self, caplog):
        mock_finding_output = MagicMock()
        mock_finding_output.compliance = {"cis": ["requirement-1", "requirement-2"]}

        with (
            patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
            patch("tasks.tasks.Provider.objects.get"),
            patch("tasks.tasks.initialize_prowler_provider"),
            patch("tasks.tasks.Compliance.get_bulk", return_value={"cis": MagicMock()}),
            patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
            patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
            patch(
                "tasks.tasks._generate_output_directory", return_value=("out", "comp")
            ),
            patch(
                "tasks.tasks.FindingOutput._transform_findings_stats",
                return_value={"some": "stats"},
            ),
            patch(
                "tasks.tasks.FindingOutput.transform_api_finding",
                return_value=mock_finding_output,
            ),
            patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
            patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/file.zip"),
            patch("tasks.tasks.Scan.all_objects.filter"),
            patch("tasks.tasks.rmtree", side_effect=Exception("Test deletion error")),
        ):
            mock_filter.return_value.exists.return_value = True
            mock_findings.return_value.order_by.return_value.iterator.return_value = [
                [MagicMock()],
                True,
            ]

            with (
                patch(
                    "tasks.tasks.OUTPUT_FORMATS_MAPPING",
                    {
                        "json": {
                            "class": lambda *args, **kwargs: MagicMock(),
                            "suffix": ".json",
                            "kwargs": {},
                        }
                    },
                ),
                patch(
                    "tasks.tasks.COMPLIANCE_CLASS_MAP",
                    {"aws": [(lambda x: True, MagicMock())]},
                ),
            ):
                with caplog.at_level("ERROR"):
                    generate_outputs(
                        scan_id=self.scan_id,
                        provider_id=self.provider_id,
                        tenant_id=self.tenant_id,
                    )
                assert "Error deleting output files" in caplog.text
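
The tests above patch tasks.tasks.batched to yield (batch, is_last) pairs so per-batch writers can flush on the final batch. A minimal sketch with those semantics, assuming that contract holds (the real implementation in tasks.tasks is not shown in this diff):

from itertools import islice


def batched(iterable, batch_size):
    # Yield (items, is_last) pairs; is_last is True only for the final batch,
    # which is what lets a writer run its transform on the last batch alone.
    iterator = iter(iterable)
    batch = list(islice(iterator, batch_size))
    while batch:
        next_batch = list(islice(iterator, batch_size))
        yield batch, not next_batch
        batch = next_batch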
@@ -1,156 +0,0 @@
#!/usr/bin/env python3
import argparse
import os
import re
import subprocess
import sys
from pathlib import Path

import matplotlib.pyplot as plt
import pandas as pd

plt.style.use("ggplot")


def run_locust(
    locust_file: str,
    host: str,
    users: int,
    hatch_rate: int,
    run_time: str,
    csv_prefix: Path,
) -> Path:
    artifacts_dir = Path("artifacts")
    artifacts_dir.mkdir(parents=True, exist_ok=True)

    cmd = [
        "locust",
        "-f",
        f"scenarios/{locust_file}",
        "--headless",
        "-u",
        str(users),
        "-r",
        str(hatch_rate),
        "-t",
        run_time,
        "--host",
        host,
        "--csv",
        str(artifacts_dir / csv_prefix.name),
    ]
    print(f"Running Locust: {' '.join(cmd)}")
    process = subprocess.run(cmd)
    if process.returncode:
        sys.exit("Locust execution failed")

    stats_file = artifacts_dir / f"{csv_prefix.stem}_stats.csv"
    if not stats_file.exists():
        sys.exit(f"Stats CSV not found: {stats_file}")
    return stats_file


def load_percentiles(csv_path: Path) -> pd.DataFrame:
    df = pd.read_csv(csv_path)
    mapping = {"50%": "p50", "75%": "p75", "90%": "p90", "95%": "p95"}
    available = [col for col in mapping if col in df.columns]
    renamed = {col: mapping[col] for col in available}
    df = df.rename(columns=renamed).set_index("Name")[renamed.values()]
    return df.drop(index=["Aggregated"], errors="ignore")


def sanitize_label(label: str) -> str:
    text = re.sub(r"[^\w]+", "_", label.strip().lower())
    return text.strip("_")


def plot_multi_comparison(metrics: dict[str, pd.DataFrame]) -> None:
    common = sorted(set.intersection(*(set(df.index) for df in metrics.values())))
    percentiles = list(next(iter(metrics.values())).columns)
    groups = len(metrics)
    width = 0.8 / groups

    for endpoint in common:
        fig, ax = plt.subplots(figsize=(10, 5), dpi=100)
        for idx, (label, df) in enumerate(metrics.items()):
            series = df.loc[endpoint]
            positions = [
                i + (idx - groups / 2) * width + width / 2
                for i in range(len(percentiles))
            ]
            bars = ax.bar(positions, series.values, width, label=label)
            for bar in bars:
                height = bar.get_height()
                ax.annotate(
                    f"{int(height)}",
                    xy=(bar.get_x() + bar.get_width() / 2, height),
                    xytext=(0, 3),
                    textcoords="offset points",
                    ha="center",
                    va="bottom",
                    fontsize=8,
                )

        ax.set_xticks(range(len(percentiles)))
        ax.set_xticklabels(percentiles)
        ax.set_ylabel("Latency (ms)")
        ax.set_title(endpoint, fontsize=12)
        ax.grid(True, axis="y", linestyle="--", alpha=0.7)

        fig.tight_layout()
        fig.subplots_adjust(right=0.75)
        ax.legend(loc="center left", bbox_to_anchor=(1, 0.5), framealpha=0.9)

        output = Path("artifacts") / f"comparison_{sanitize_label(endpoint)}.png"
        plt.savefig(output)
        plt.close(fig)
        print(f"Saved chart: {output}")


def main() -> None:
    parser = argparse.ArgumentParser(description="Run Locust and compare metrics")
    parser.add_argument("--locustfile", required=True, help="Locust file in scenarios/")
    parser.add_argument("--host", required=True, help="Target host URL")
    parser.add_argument(
        "--users", type=int, default=10, help="Number of simulated users"
    )
    parser.add_argument("--rate", type=int, default=1, help="Hatch rate per second")
    parser.add_argument("--time", default="1m", help="Test duration (e.g. 30s, 1m)")
    parser.add_argument(
        "--metrics-dir", default="baselines", help="Directory with CSV baselines"
    )
    parser.add_argument("--version", default="current", help="Test version")
    args = parser.parse_args()

    metrics_dir = Path(args.metrics_dir)
    os.makedirs(metrics_dir, exist_ok=True)

    metrics_data: dict[str, pd.DataFrame] = {}
    for csv_file in sorted(metrics_dir.glob("*.csv")):
        metrics_data[csv_file.stem] = load_percentiles(csv_file)

    current_prefix = Path(args.version)
    current_csv = run_locust(
        locust_file=args.locustfile,
        host=args.host,
        users=args.users,
        hatch_rate=args.rate,
        run_time=args.time,
        csv_prefix=current_prefix,
    )
    metrics_data[args.version] = load_percentiles(current_csv)

    for endpoint in sorted(
        set.intersection(*(set(df.index) for df in metrics_data.values()))
    ):
        parts = [endpoint]
        for label, df in metrics_data.items():
            s = df.loc[endpoint]
            parts.append(f"{label}: p50 {s.p50}, p75 {s.p75}, p90 {s.p90}, p95 {s.p95}")
        print(" | ".join(parts))

    plot_multi_comparison(metrics_data)


if __name__ == "__main__":
    main()
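
For reference, the runner can also be driven from Python instead of the CLI; a minimal sketch, assuming the script is importable as run_benchmark (a hypothetical module name, the real filename is not shown in this diff) and that a scenarios/findings.py locustfile exists:

from pathlib import Path

from run_benchmark import load_percentiles, run_locust  # hypothetical module name

stats_csv = run_locust(
    locust_file="findings.py",  # assumed file under scenarios/
    host="http://localhost:8080",  # assumed target host
    users=10,
    hatch_rate=1,
    run_time="1m",
    csv_prefix=Path("current"),
)
print(load_percentiles(stats_csv))  # one row per endpoint: p50/p75/p90/p95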
@@ -1,3 +0,0 @@
locust==2.34.1
matplotlib==3.10.1
pandas==2.2.3
@@ -1,216 +0,0 @@
from locust import events, task
from utils.config import (
    FINDINGS_UI_SORT_VALUES,
    L_PROVIDER_NAME,
    M_PROVIDER_NAME,
    S_PROVIDER_NAME,
    TARGET_INSERTED_AT,
)
from utils.helpers import (
    APIUserBase,
    get_api_token,
    get_auth_headers,
    get_next_resource_filter,
    get_resource_filters_pairs,
    get_scan_id_from_provider_name,
    get_sort_value,
)

GLOBAL = {
    "token": None,
    "scan_ids": {},
    "resource_filters": None,
    "large_resource_filters": None,
}


@events.test_start.add_listener
def on_test_start(environment, **kwargs):
    GLOBAL["token"] = get_api_token(environment.host)

    GLOBAL["scan_ids"]["small"] = get_scan_id_from_provider_name(
        environment.host, GLOBAL["token"], S_PROVIDER_NAME
    )
    GLOBAL["scan_ids"]["medium"] = get_scan_id_from_provider_name(
        environment.host, GLOBAL["token"], M_PROVIDER_NAME
    )
    GLOBAL["scan_ids"]["large"] = get_scan_id_from_provider_name(
        environment.host, GLOBAL["token"], L_PROVIDER_NAME
    )

    GLOBAL["resource_filters"] = get_resource_filters_pairs(
        environment.host, GLOBAL["token"]
    )
    GLOBAL["large_resource_filters"] = get_resource_filters_pairs(
        environment.host, GLOBAL["token"], GLOBAL["scan_ids"]["large"]
    )


class APIUser(APIUserBase):
    def on_start(self):
        self.token = GLOBAL["token"]
        self.s_scan_id = GLOBAL["scan_ids"]["small"]
        self.m_scan_id = GLOBAL["scan_ids"]["medium"]
        self.l_scan_id = GLOBAL["scan_ids"]["large"]
        self.available_resource_filters = GLOBAL["resource_filters"]
        self.available_resource_filters_large_scan = GLOBAL["large_resource_filters"]

    @task
    def findings_default(self):
        name = "/findings"
        page_number = self._next_page(name)
        endpoint = (
            f"/findings?page[number]={page_number}"
            f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
            f"&filter[inserted_at]={TARGET_INSERTED_AT}"
        )
        self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)

    @task(3)
    def findings_default_include(self):
        name = "/findings?include"
        page = self._next_page(name)
        endpoint = (
            f"/findings?page[number]={page}"
            f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
            f"&filter[inserted_at]={TARGET_INSERTED_AT}"
            f"&include=scan.provider,resources"
        )
        self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)

    @task(3)
    def findings_metadata(self):
        endpoint = f"/findings/metadata?filter[inserted_at]={TARGET_INSERTED_AT}"
        self.client.get(
            endpoint, headers=get_auth_headers(self.token), name="/findings/metadata"
        )

    @task
    def findings_scan_small(self):
        name = "/findings?filter[scan_id] - 50k"
        page_number = self._next_page(name)
        endpoint = (
            f"/findings?page[number]={page_number}"
            f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
            f"&filter[scan]={self.s_scan_id}"
        )
        self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)

    @task
    def findings_metadata_scan_small(self):
        endpoint = f"/findings/metadata?filter[scan]={self.s_scan_id}"
        self.client.get(
            endpoint,
            headers=get_auth_headers(self.token),
            name="/findings/metadata?filter[scan_id] - 50k",
        )

    @task(2)
    def findings_scan_medium(self):
        name = "/findings?filter[scan_id] - 250k"
        page_number = self._next_page(name)
        endpoint = (
            f"/findings?page[number]={page_number}"
            f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
            f"&filter[scan]={self.m_scan_id}"
        )
        self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)

    @task
    def findings_metadata_scan_medium(self):
        endpoint = f"/findings/metadata?filter[scan]={self.m_scan_id}"
        self.client.get(
            endpoint,
            headers=get_auth_headers(self.token),
            name="/findings/metadata?filter[scan_id] - 250k",
        )

    @task
    def findings_scan_large(self):
        name = "/findings?filter[scan_id] - 500k"
        page_number = self._next_page(name)
        endpoint = (
            f"/findings?page[number]={page_number}"
            f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
            f"&filter[scan]={self.l_scan_id}"
        )
        self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)

    @task
    def findings_scan_large_include(self):
        name = "/findings?filter[scan_id]&include - 500k"
        page_number = self._next_page(name)
        endpoint = (
            f"/findings?page[number]={page_number}"
            f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
            f"&filter[scan]={self.l_scan_id}"
            f"&include=scan.provider,resources"
        )
        self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)

    @task
    def findings_metadata_scan_large(self):
        endpoint = f"/findings/metadata?filter[scan]={self.l_scan_id}"
        self.client.get(
            endpoint,
            headers=get_auth_headers(self.token),
            name="/findings/metadata?filter[scan_id] - 500k",
        )

    @task(2)
    def findings_resource_filter(self):
        name = "/findings?filter[resource_filter]&include"
        filter_name, filter_value = get_next_resource_filter(
            self.available_resource_filters
        )

        endpoint = (
            f"/findings?filter[{filter_name}]={filter_value}"
            f"&filter[inserted_at]={TARGET_INSERTED_AT}"
            f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
            f"&include=scan.provider,resources"
        )
        self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)

    @task(3)
    def findings_metadata_resource_filter(self):
        name = "/findings/metadata?filter[resource_filter]"
        filter_name, filter_value = get_next_resource_filter(
            self.available_resource_filters
        )

        endpoint = (
            f"/findings/metadata?filter[{filter_name}]={filter_value}"
            f"&filter[inserted_at]={TARGET_INSERTED_AT}"
            f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
        )
        self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)

    @task(3)
    def findings_metadata_resource_filter_scan_large(self):
        name = "/findings/metadata?filter[resource_filter]&filter[scan_id] - 500k"
        filter_name, filter_value = get_next_resource_filter(
            self.available_resource_filters
        )

        endpoint = (
            f"/findings/metadata?filter[{filter_name}]={filter_value}"
            f"&filter[scan]={self.l_scan_id}"
            f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
        )
        self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)

    @task(2)
    def findings_resource_filter_large_scan_include(self):
        name = "/findings?filter[resource_filter][scan]&include - 500k"
        filter_name, filter_value = get_next_resource_filter(
            self.available_resource_filters
        )

        endpoint = (
            f"/findings?filter[{filter_name}]={filter_value}"
            f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
            f"&filter[scan]={self.l_scan_id}"
            f"&include=scan.provider,resources"
        )
        self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
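
To make the task bodies concrete, the first call from findings_default expands to a request like the following (the page counter starts at 1; TARGET_INSERTED_AT shown with its default value):

# GET issued by findings_default on its first run (illustrative):
endpoint = (
    "/findings?page[number]=1"
    "&sort=severity,status,-inserted_at"
    "&filter[inserted_at]=2025-04-22"
)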
@@ -1,19 +0,0 @@
import os

USER_EMAIL = os.environ.get("USER_EMAIL")
USER_PASSWORD = os.environ.get("USER_PASSWORD")

BASE_HEADERS = {"Content-Type": "application/vnd.api+json"}

FINDINGS_UI_SORT_VALUES = ["severity", "status", "-inserted_at"]
TARGET_INSERTED_AT = os.environ.get("TARGET_INSERTED_AT", "2025-04-22")

FINDINGS_RESOURCE_METADATA = {
    "regions": "region",
    "resource_types": "resource_type",
    "services": "service",
}

S_PROVIDER_NAME = "provider-50k"
M_PROVIDER_NAME = "provider-250k"
L_PROVIDER_NAME = "provider-500k"
@@ -1,168 +0,0 @@
import random
from collections import defaultdict
from threading import Lock

import requests
from locust import HttpUser, between
from utils.config import (
    BASE_HEADERS,
    FINDINGS_RESOURCE_METADATA,
    TARGET_INSERTED_AT,
    USER_EMAIL,
    USER_PASSWORD,
)

_global_page_counters = defaultdict(int)
_page_lock = Lock()


class APIUserBase(HttpUser):
    """
    Base class for API user simulation in Locust performance tests.

    Attributes:
        abstract (bool): Indicates this is an abstract user class.
        wait_time: Time between task executions, randomized between 1 and 5 seconds.
    """

    abstract = True
    wait_time = between(1, 5)

    def _next_page(self, endpoint_name: str) -> int:
        """
        Returns the next page number for a given endpoint. Thread-safe.

        Args:
            endpoint_name (str): Name of the API endpoint being paginated.

        Returns:
            int: The next page number for the given endpoint.
        """
        with _page_lock:
            _global_page_counters[endpoint_name] += 1
            return _global_page_counters[endpoint_name]


def get_next_resource_filter(available_values: dict) -> tuple:
    """
    Randomly selects a filter type and value from available options.

    Args:
        available_values (dict): Dictionary with filter types as keys and list of possible values.

    Returns:
        tuple: A (filter_type, filter_value) pair randomly selected.
    """
    filter_type = random.choice(list(available_values.keys()))
    filter_value = random.choice(available_values[filter_type])
    return filter_type, filter_value


def get_auth_headers(token: str) -> dict:
    """
    Returns the headers for the API requests.

    Args:
        token (str): The token to be included in the headers.

    Returns:
        dict: The headers for the API requests.
    """
    return {
        "Authorization": f"Bearer {token}",
        **BASE_HEADERS,
    }


def get_api_token(host: str) -> str:
    """
    Authenticates with the API and retrieves a bearer token.

    Args:
        host (str): The host URL of the API.

    Returns:
        str: The access token for authenticated requests.

    Raises:
        AssertionError: If the request fails or does not return a 200 status code.
    """
    login_payload = {
        "data": {
            "type": "tokens",
            "attributes": {"email": USER_EMAIL, "password": USER_PASSWORD},
        }
    }
    response = requests.post(f"{host}/tokens", json=login_payload, headers=BASE_HEADERS)
    assert response.status_code == 200, f"Failed to get token: {response.text}"
    return response.json()["data"]["attributes"]["access"]


def get_scan_id_from_provider_name(host: str, token: str, provider_name: str) -> str:
    """
    Retrieves the scan ID associated with a specific provider name.

    Args:
        host (str): The host URL of the API.
        token (str): Bearer token for authentication.
        provider_name (str): Name of the provider to filter scans by.

    Returns:
        str: The ID of the scan.

    Raises:
        AssertionError: If the request fails or does not return a 200 status code.
    """
    response = requests.get(
        f"{host}/scans?fields[scans]=id&filter[provider_alias]={provider_name}",
        headers=get_auth_headers(token),
    )
    assert response.status_code == 200, f"Failed to get scan: {response.text}"
    return response.json()["data"][0]["id"]


def get_resource_filters_pairs(host: str, token: str, scan_id: str = "") -> dict:
    """
    Retrieves and maps resource metadata filter values from the findings endpoint.

    Args:
        host (str): The host URL of the API.
        token (str): Bearer token for authentication.
        scan_id (str, optional): Optional scan ID to filter metadata. Defaults to using inserted_at timestamp.

    Returns:
        dict: A dictionary of resource filter metadata.

    Raises:
        AssertionError: If the request fails or does not return a 200 status code.
    """
    metadata_filters = (
        f"filter[scan]={scan_id}"
        if scan_id
        else f"filter[inserted_at]={TARGET_INSERTED_AT}"
    )
    response = requests.get(
        f"{host}/findings/metadata?{metadata_filters}", headers=get_auth_headers(token)
    )
    assert (
        response.status_code == 200
    ), f"Failed to get resource filters values: {response.text}"
    attributes = response.json()["data"]["attributes"]
    return {
        FINDINGS_RESOURCE_METADATA[key]: values
        for key, values in attributes.items()
        if key in FINDINGS_RESOURCE_METADATA
    }


def get_sort_value(sort_values: list) -> str:
    """
    Constructs a sort query string from a list of sort keys.

    Args:
        sort_values (list): The list of sort values to include in the query.

    Returns:
        str: A formatted sort query string (e.g., "sort=created_at,-severity").
    """
    return f"sort={','.join(sort_values)}"
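
As a quick illustration of the two pure helpers above (the filter values are made up for the example):

filters = {"region": ["eu-west-1", "us-east-1"], "service": ["s3", "iam"]}
filter_name, filter_value = get_next_resource_filter(filters)  # e.g. ("service", "s3")
print(get_sort_value(["severity", "-inserted_at"]))  # sort=severity,-inserted_at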
@@ -1,9 +1,6 @@
#!/bin/bash
# Run Prowler against All AWS Accounts in an AWS Organization

# Activate Poetry Environment
eval "$(poetry env activate)"

# Show Prowler Version
prowler -v

@@ -16,6 +16,7 @@ spec:
      containers:
        - name: prowler
          image: {{ .Values.image.repository }}:{{ .Values.image.tag }}
          command: ["prowler"]
          args: ["kubernetes", "-z", "-b"]
          imagePullPolicy: {{ .Values.image.pullPolicy }}
          volumeMounts:

@@ -161,7 +161,7 @@ def update_nav_bar(pathname):
            html.Span(
                [
                    html.Img(src="assets/favicon.ico", className="w-5"),
                    "Subscribe to Prowler Cloud",
                    "Subscribe to prowler SaaS",
                ],
                className="flex items-center gap-x-3 text-white",
            ),

[image changed: Before 68 KiB]

dashboard/assets/styles/dist/output.css (vendored, 5 changes)
@@ -1361,9 +1361,6 @@ video {
.lg\:grid-cols-4 {
  grid-template-columns: repeat(4, minmax(0, 1fr));
}
.lg\:grid-cols-5 {
  grid-template-columns: repeat(5, minmax(0, 1fr));
}

.lg\:justify-normal {
  justify-content: normal;
@@ -1406,4 +1403,4 @@ video {
.\32xl\:w-\[9\%\] {
  width: 9%;
}
}
@@ -2228,356 +2228,13 @@ def get_section_containers_ens(data, section_1, section_2, section_3, section_4)
    return html.Div(section_containers, className="compliance-data-layout")


def get_section_containers_3_levels(data, section_1, section_2, section_3):
    data["STATUS"] = data["STATUS"].apply(map_status_to_icon)
    findings_counts_marco = (
        data.groupby([section_1, "STATUS"]).size().unstack(fill_value=0)
    )
    section_containers = []
    data[section_1] = data[section_1].astype(str)
    data[section_2] = data[section_2].astype(str)
    data[section_3] = data[section_3].astype(str)

    data.sort_values(
        by=section_3,
        key=lambda x: x.map(extract_numeric_values),
        ascending=True,
        inplace=True,
    )

    for marco in data[section_1].unique():
        success_marco = findings_counts_marco.loc[marco].get(pass_emoji, 0)
        failed_marco = findings_counts_marco.loc[marco].get(fail_emoji, 0)

        fig_name = go.Figure(
            [
                go.Bar(
                    name="Failed",
                    x=[failed_marco],
                    y=[""],
                    orientation="h",
                    marker=dict(color="#e77676"),
                    width=[0.8],
                ),
                go.Bar(
                    name="Success",
                    x=[success_marco],
                    y=[""],
                    orientation="h",
                    marker=dict(color="#45cc6e"),
                    width=[0.8],
                ),
            ]
        )
        fig_name.update_layout(
            barmode="stack",
            margin=dict(l=10, r=10, t=10, b=10),
            paper_bgcolor="rgba(0,0,0,0)",
            plot_bgcolor="rgba(0,0,0,0)",
            showlegend=False,
            width=350,
            height=30,
            xaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
            yaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
            annotations=[
                dict(
                    x=success_marco + failed_marco,
                    y=0,
                    xref="x",
                    yref="y",
                    text=str(success_marco),
                    showarrow=False,
                    font=dict(color="#45cc6e", size=14),
                    xanchor="left",
                    yanchor="middle",
                ),
                dict(
                    x=0,
                    y=0,
                    xref="x",
                    yref="y",
                    text=str(failed_marco),
                    showarrow=False,
                    font=dict(color="#e77676", size=14),
                    xanchor="right",
                    yanchor="middle",
                ),
            ],
        )
        fig_name.add_annotation(
            x=failed_marco,
            y=0.3,
            text="|",
            showarrow=False,
            font=dict(size=20),
            xanchor="center",
            yanchor="middle",
        )

        graph_div = html.Div(
            dcc.Graph(
                figure=fig_name, config={"staticPlot": True}, className="info-bar"
            ),
            className="graph-section",
        )
        direct_internal_items = []

        for categoria in data[data[section_1] == marco][section_2].unique():
            specific_data = data[
                (data[section_1] == marco) & (data[section_2] == categoria)
            ]
            findings_counts_categoria = (
                specific_data.groupby([section_2, "STATUS"])
                .size()
                .unstack(fill_value=0)
            )
            success_categoria = findings_counts_categoria.loc[categoria].get(
                pass_emoji, 0
            )
            failed_categoria = findings_counts_categoria.loc[categoria].get(
                fail_emoji, 0
            )

            fig_section = go.Figure(
                [
                    go.Bar(
                        name="Failed",
                        x=[failed_categoria],
                        y=[""],
                        orientation="h",
                        marker=dict(color="#e77676"),
                        width=[0.8],
                    ),
                    go.Bar(
                        name="Success",
                        x=[success_categoria],
                        y=[""],
                        orientation="h",
                        marker=dict(color="#45cc6e"),
                        width=[0.8],
                    ),
                ]
            )
            fig_section.update_layout(
                barmode="stack",
                margin=dict(l=10, r=10, t=10, b=10),
                paper_bgcolor="rgba(0,0,0,0)",
                plot_bgcolor="rgba(0,0,0,0)",
                showlegend=False,
                width=350,
                height=30,
                xaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
                yaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
                annotations=[
                    dict(
                        x=success_categoria + failed_categoria,
                        y=0,
                        xref="x",
                        yref="y",
                        text=str(success_categoria),
                        showarrow=False,
                        font=dict(color="#45cc6e", size=14),
                        xanchor="left",
                        yanchor="middle",
                    ),
                    dict(
                        x=0,
                        y=0,
                        xref="x",
                        yref="y",
                        text=str(failed_categoria),
                        showarrow=False,
                        font=dict(color="#e77676", size=14),
                        xanchor="right",
                        yanchor="middle",
                    ),
                ],
            )
            fig_section.add_annotation(
                x=failed_categoria,
                y=0.3,
                text="|",
                showarrow=False,
                font=dict(size=20),
                xanchor="center",
                yanchor="middle",
            )

            graph_div_section = html.Div(
                dcc.Graph(
                    figure=fig_section,
                    config={"staticPlot": True},
                    className="info-bar-child",
                ),
                className="graph-section-req",
            )
            direct_internal_items_idgrupocontrol = []

            for idgrupocontrol in specific_data[section_3].unique():
                specific_data2 = specific_data[
                    specific_data[section_3] == idgrupocontrol
                ]
                findings_counts_idgrupocontrol = (
                    specific_data2.groupby([section_3, "STATUS"])
                    .size()
                    .unstack(fill_value=0)
                )
                success_idgrupocontrol = findings_counts_idgrupocontrol.loc[
                    idgrupocontrol
                ].get(pass_emoji, 0)
                failed_idgrupocontrol = findings_counts_idgrupocontrol.loc[
                    idgrupocontrol
                ].get(fail_emoji, 0)

                fig_idgrupocontrol = go.Figure(
                    [
                        go.Bar(
                            name="Failed",
                            x=[failed_idgrupocontrol],
                            y=[""],
                            orientation="h",
                            marker=dict(color="#e77676"),
                            width=[0.8],
                        ),
                        go.Bar(
                            name="Success",
                            x=[success_idgrupocontrol],
                            y=[""],
                            orientation="h",
                            marker=dict(color="#45cc6e"),
                            width=[0.8],
                        ),
                    ]
                )
                fig_idgrupocontrol.update_layout(
                    barmode="stack",
                    margin=dict(l=10, r=10, t=10, b=10),
                    paper_bgcolor="rgba(0,0,0,0)",
                    plot_bgcolor="rgba(0,0,0,0)",
                    showlegend=False,
                    width=350,
                    height=30,
                    xaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
                    yaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
                    annotations=[
                        dict(
                            x=success_idgrupocontrol + failed_idgrupocontrol,
                            y=0,
                            xref="x",
                            yref="y",
                            text=str(success_idgrupocontrol),
                            showarrow=False,
                            font=dict(color="#45cc6e", size=14),
                            xanchor="left",
                            yanchor="middle",
                        ),
                        dict(
                            x=0,
                            y=0,
                            xref="x",
                            yref="y",
                            text=str(failed_idgrupocontrol),
                            showarrow=False,
                            font=dict(color="#e77676", size=14),
                            xanchor="right",
                            yanchor="middle",
                        ),
                    ],
                )
                fig_idgrupocontrol.add_annotation(
                    x=failed_idgrupocontrol,
                    y=0.3,
                    text="|",
                    showarrow=False,
                    font=dict(size=20),
                    xanchor="center",
                    yanchor="middle",
                )

                graph_div_idgrupocontrol = html.Div(
                    dcc.Graph(
                        figure=fig_idgrupocontrol,
                        config={"staticPlot": True},
                        className="info-bar-child",
                    ),
                    className="graph-section-req",
                )

                data_table = dash_table.DataTable(
                    data=specific_data2.to_dict("records"),
                    columns=[
                        {"name": i, "id": i}
                        for i in [
                            "CHECKID",
                            "STATUS",
                            "REGION",
                            "ACCOUNTID",
                            "RESOURCEID",
                        ]
                    ],
                    style_table={"overflowX": "auto"},
                    style_as_list_view=True,
                    style_cell={"textAlign": "left", "padding": "5px"},
                )

                internal_accordion_item_2 = dbc.AccordionItem(
                    title=idgrupocontrol,
                    children=[
                        graph_div_idgrupocontrol,
                        html.Div([data_table], className="inner-accordion-content"),
                    ],
                )
                direct_internal_items_idgrupocontrol.append(
                    html.Div(
                        [
                            graph_div_idgrupocontrol,
                            dbc.Accordion(
                                [internal_accordion_item_2],
                                start_collapsed=True,
                                flush=True,
                            ),
                        ],
                        className="accordion-inner--child",
                    )
                )

            internal_accordion_item = dbc.AccordionItem(
                title=categoria,
                children=direct_internal_items_idgrupocontrol,
            )
            internal_section_container = html.Div(
                [
                    graph_div_section,
                    dbc.Accordion(
                        [internal_accordion_item], start_collapsed=True, flush=True
                    ),
                ],
                className="accordion-inner--child",
            )
            direct_internal_items.append(internal_section_container)

        accordion_item = dbc.AccordionItem(title=marco, children=direct_internal_items)
        section_container = html.Div(
            [
                graph_div,
                dbc.Accordion([accordion_item], start_collapsed=True, flush=True),
            ],
            className="accordion-inner",
        )
        section_containers.append(section_container)

    return html.Div(section_containers, className="compliance-data-layout")

# This function extracts and compares up to three numeric values, ensuring correct sorting for version-like strings.
def extract_numeric_values(value):
    numbers = re.findall(r"\d+", str(value))
    if len(numbers) == 3:
        return int(numbers[0]), int(numbers[1]), int(numbers[2])
    elif len(numbers) == 2:
        return int(numbers[0]), int(numbers[1])
    elif len(numbers) == 1:
        return int(numbers[0]), 0
    return 0, 0

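For example, sorting version-like section labels with this key yields numeric rather than lexicographic order:

labels = ["10.2", "2.10", "2.9", "1"]
print(sorted(labels, key=extract_numeric_values))
# ['1', '2.9', '2.10', '10.2']
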
@@ -1,6 +1,6 @@
import warnings

from dashboard.common_methods import get_section_containers_3_levels
from dashboard.common_methods import get_section_containers_format2

warnings.filterwarnings("ignore")

@@ -10,7 +10,6 @@ def get_table(data):
        [
            "REQUIREMENTS_ATTRIBUTES_NAME",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "REQUIREMENTS_ATTRIBUTES_SUBSECTION",
            "CHECKID",
            "STATUS",
            "REGION",
@@ -18,10 +17,6 @@ def get_table(data):
            "RESOURCEID",
        ]
    ]

    return get_section_containers_3_levels(
        aux,
        "REQUIREMENTS_ATTRIBUTES_SECTION",
        "REQUIREMENTS_ATTRIBUTES_SUBSECTION",
        "REQUIREMENTS_ATTRIBUTES_NAME",
    return get_section_containers_format2(
        aux, "REQUIREMENTS_ATTRIBUTES_NAME", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )

@@ -1,6 +1,6 @@
import warnings

from dashboard.common_methods import get_section_containers_3_levels
from dashboard.common_methods import get_section_containers_format2

warnings.filterwarnings("ignore")

@@ -10,7 +10,6 @@ def get_table(data):
        [
            "REQUIREMENTS_ATTRIBUTES_NAME",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "REQUIREMENTS_ATTRIBUTES_SUBSECTION",
            "CHECKID",
            "STATUS",
            "REGION",
@@ -19,9 +18,6 @@ def get_table(data):
        ]
    ]

    return get_section_containers_3_levels(
        aux,
        "REQUIREMENTS_ATTRIBUTES_SECTION",
        "REQUIREMENTS_ATTRIBUTES_SUBSECTION",
        "REQUIREMENTS_ATTRIBUTES_NAME",
    return get_section_containers_format2(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ATTRIBUTES_NAME"
    )

@@ -1,24 +0,0 @@
import warnings

from dashboard.common_methods import get_section_containers_cis

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_DESCRIPTION",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_cis(
        aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )
@@ -1,24 +0,0 @@
import warnings

from dashboard.common_methods import get_section_containers_cis

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_DESCRIPTION",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_cis(
        aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )
@@ -1,6 +1,6 @@
import warnings

from dashboard.common_methods import get_section_containers_3_levels
from dashboard.common_methods import get_section_containers_kisa_ismsp

warnings.filterwarnings("ignore")

@@ -8,7 +8,7 @@ warnings.filterwarnings("ignore")
def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ATTRIBUTES_DOMAIN",
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            # "REQUIREMENTS_DESCRIPTION",
@@ -20,9 +20,6 @@ def get_table(data):
        ]
    ].copy()

    return get_section_containers_3_levels(
        aux,
        "REQUIREMENTS_ATTRIBUTES_DOMAIN",
        "REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
        "REQUIREMENTS_ATTRIBUTES_SECTION",
    return get_section_containers_kisa_ismsp(
        aux, "REQUIREMENTS_ATTRIBUTES_SUBDOMAIN", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )

@@ -1,6 +1,6 @@
import warnings

from dashboard.common_methods import get_section_containers_3_levels
from dashboard.common_methods import get_section_containers_kisa_ismsp

warnings.filterwarnings("ignore")

@@ -8,7 +8,7 @@ warnings.filterwarnings("ignore")
def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ATTRIBUTES_DOMAIN",
            "REQUIREMENTS_ID",
            "REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            # "REQUIREMENTS_DESCRIPTION",
@@ -20,9 +20,6 @@ def get_table(data):
        ]
    ].copy()

    return get_section_containers_3_levels(
        aux,
        "REQUIREMENTS_ATTRIBUTES_DOMAIN",
        "REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
        "REQUIREMENTS_ATTRIBUTES_SECTION",
    return get_section_containers_kisa_ismsp(
        aux, "REQUIREMENTS_ATTRIBUTES_SUBDOMAIN", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )

@@ -1,24 +0,0 @@
import warnings

from dashboard.common_methods import get_section_containers_cis

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_DESCRIPTION",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_cis(
        aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )
@@ -1,24 +0,0 @@
import warnings

from dashboard.common_methods import get_section_containers_cis

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_DESCRIPTION",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_cis(
        aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )
@@ -1,24 +0,0 @@
import warnings

from dashboard.common_methods import get_section_containers_cis

warnings.filterwarnings("ignore")


def get_table(data):
    aux = data[
        [
            "REQUIREMENTS_ID",
            "REQUIREMENTS_DESCRIPTION",
            "REQUIREMENTS_ATTRIBUTES_SECTION",
            "CHECKID",
            "STATUS",
            "REGION",
            "ACCOUNTID",
            "RESOURCEID",
        ]
    ].copy()

    return get_section_containers_cis(
        aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
    )
@@ -1,6 +1,6 @@
import warnings

from dashboard.common_methods import get_section_containers_cis
from dashboard.common_methods import get_section_containers_format3

warnings.filterwarnings("ignore")

@@ -19,6 +19,6 @@ def get_table(data):
        ]
    ].copy()

    return get_section_containers_cis(
        aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
    return get_section_containers_format3(
        aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
    )
@@ -57,9 +57,8 @@ def create_layout_overview(
            html.Div(className="flex", id="azure_card", n_clicks=0),
            html.Div(className="flex", id="gcp_card", n_clicks=0),
            html.Div(className="flex", id="k8s_card", n_clicks=0),
            html.Div(className="flex", id="m365_card", n_clicks=0),
        ],
        className="grid gap-x-4 mb-[30px] sm:grid-cols-2 lg:grid-cols-5",
        className="grid gap-x-4 mb-[30px] sm:grid-cols-2 lg:grid-cols-4",
    ),
    html.H4(
        "Count of Findings by severity",
@@ -132,7 +131,7 @@ def create_layout_compliance(
    html.A(
        [
            html.Img(src="assets/favicon.ico", className="w-5 mr-3"),
            html.Span("Subscribe to Prowler Cloud"),
            html.Span("Subscribe to prowler SaaS"),
        ],
        href="https://prowler.pro/",
        target="_blank",

@@ -76,8 +76,6 @@ def load_csv_files(csv_files):
            result = result.replace("_AZURE", " - AZURE")
        if "KUBERNETES" in result:
            result = result.replace("_KUBERNETES", " - KUBERNETES")
        if "M365" in result:
            result = result.replace("_M365", " - M365")
        results.append(result)

    unique_results = set(results)
@@ -269,15 +267,6 @@ def display_data(
        data["REQUIREMENTS_ATTRIBUTES_PROFILE"] = data[
            "REQUIREMENTS_ATTRIBUTES_PROFILE"
        ].apply(lambda x: x.split(" - ")[0])

        # Add the column ACCOUNTID to the data if the provider is m365
        if "m365" in analytics_input:
            data.rename(columns={"TENANTID": "ACCOUNTID"}, inplace=True)
            data.rename(columns={"LOCATION": "REGION"}, inplace=True)
            if "REQUIREMENTS_ATTRIBUTES_PROFILE" in data.columns:
                data["REQUIREMENTS_ATTRIBUTES_PROFILE"] = data[
                    "REQUIREMENTS_ATTRIBUTES_PROFILE"
                ].apply(lambda x: x.split(" - ")[0])
        # Filter the chosen level of the CIS
        if is_level_1:
            data = data[data["REQUIREMENTS_ATTRIBUTES_PROFILE"] == "Level 1"]
@@ -408,13 +397,7 @@ def display_data(
            compliance_module = importlib.import_module(
                f"dashboard.compliance.{current}"
            )
            data = data.drop_duplicates(
                subset=["CHECKID", "STATUS", "MUTED", "RESOURCEID", "STATUSEXTENDED"]
            )

            if "threatscore" in analytics_input:
                data = get_threatscore_mean_by_pillar(data)

            data.drop_duplicates(keep="first", inplace=True)
            table = compliance_module.get_table(data)
        except ModuleNotFoundError:
            table = html.Div(
@@ -433,9 +416,6 @@ def display_data(
        )

    df = data.copy()
    # Remove Muted rows
    if "MUTED" in df.columns:
        df = df[df["MUTED"] == "False"]
    df = df.groupby(["STATUS"]).size().reset_index(name="counts")
    df = df.sort_values(by=["counts"], ascending=False)

@@ -450,9 +430,6 @@ def display_data(
    if "pci" in analytics_input:
        pie_2 = get_bar_graph(df, "REQUIREMENTS_ID")
        current_filter = "req_id"
    elif "threatscore" in analytics_input:
        pie_2 = get_table_prowler_threatscore(df)
        current_filter = "threatscore"
    elif (
        "REQUIREMENTS_ATTRIBUTES_SECTION" in df.columns
        and not df["REQUIREMENTS_ATTRIBUTES_SECTION"].isnull().values.any()
@@ -511,13 +488,6 @@ def display_data(
            pie_2, f"Top 5 failed {current_filter} by requirements"
        )

    if "threatscore" in analytics_input:
        security_level_graph = get_graph(
            pie_2,
            "Pillar Score by requirements (1 = Lowest Risk, 5 = Highest Risk)",
            margin_top=0,
        )

    return (
        table_output,
        overall_status_result_graph,
@@ -531,7 +501,7 @@ def display_data(
    )


def get_graph(pie, title, margin_top=7):
def get_graph(pie, title):
    return [
        html.Span(
            title,
@@ -544,7 +514,7 @@ def get_graph(pie, title, margin_top=7):
                "display": "flex",
                "justify-content": "center",
                "align-items": "center",
                "margin-top": f"{margin_top}%",
                "margin-top": "7%",
            },
        ),
    ]
@@ -648,87 +618,3 @@ def get_table(current_compliance, table):
            className="relative flex flex-col bg-white shadow-provider rounded-xl px-4 py-3 flex-wrap w-full",
        ),
    ]


def get_threatscore_mean_by_pillar(df):
    modified_df = df[df["STATUS"] == "FAIL"]

    modified_df["REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"] = pd.to_numeric(
        modified_df["REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"], errors="coerce"
    )

    pillar_means = (
        modified_df.groupby("REQUIREMENTS_ATTRIBUTES_SECTION")[
            "REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"
        ]
        .mean()
        .round(2)
    )

    output = []
    for pillar, mean in pillar_means.items():
        output.append(f"{pillar} - [{mean}]")

    for value in output:
        if value.split(" - ")[0] in df["REQUIREMENTS_ATTRIBUTES_SECTION"].values:
            df.loc[
                df["REQUIREMENTS_ATTRIBUTES_SECTION"] == value.split(" - ")[0],
                "REQUIREMENTS_ATTRIBUTES_SECTION",
            ] = value
    return df


def get_table_prowler_threatscore(df):
    df = df[df["STATUS"] == "FAIL"]

    # Delete " - " from the column REQUIREMENTS_ATTRIBUTES_SECTION
    df["REQUIREMENTS_ATTRIBUTES_SECTION"] = (
        df["REQUIREMENTS_ATTRIBUTES_SECTION"].str.split(" - ").str[0]
    )

    df["REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"] = pd.to_numeric(
        df["REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"], errors="coerce"
    )

    score_df = (
        df.groupby("REQUIREMENTS_ATTRIBUTES_SECTION")[
            "REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"
        ]
        .mean()
        .reset_index()
        .rename(
            columns={
                "REQUIREMENTS_ATTRIBUTES_SECTION": "Pillar",
                "REQUIREMENTS_ATTRIBUTES_LEVELOFRISK": "Score",
            }
        )
    )

    fig = px.bar(
        score_df,
        x="Pillar",
        y="Score",
        color="Score",
        color_continuous_scale=[
            "#45cc6e",
            "#f4d44d",
            "#e77676",
        ],  # green → yellow → red
        hover_data={"Score": True, "Pillar": True},
        labels={"Score": "Average Risk Score", "Pillar": "Section"},
        height=400,
    )

    fig.update_layout(
        xaxis_title="Pillar",
        yaxis_title="Level of Risk",
        margin=dict(l=20, r=20, t=30, b=20),
        plot_bgcolor="rgba(0,0,0,0)",
        paper_bgcolor="rgba(0,0,0,0)",
        coloraxis_colorbar=dict(title="Risk"),
    )

    return dcc.Graph(
        figure=fig,
        style={"height": "25rem", "width": "40rem"},
    )

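The core of the threatscore helpers above is a groupby-mean over the coerced risk levels; a self-contained sketch of that aggregation on toy data:

import pandas as pd

df = pd.DataFrame(
    {
        "STATUS": ["FAIL", "FAIL", "PASS", "FAIL"],
        "REQUIREMENTS_ATTRIBUTES_SECTION": ["IAM", "IAM", "IAM", "Logging"],
        "REQUIREMENTS_ATTRIBUTES_LEVELOFRISK": ["4", "2", "1", "5"],
    }
)
fails = df[df["STATUS"] == "FAIL"].copy()
fails["REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"] = pd.to_numeric(
    fails["REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"], errors="coerce"
)
means = (
    fails.groupby("REQUIREMENTS_ATTRIBUTES_SECTION")[
        "REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"
    ]
    .mean()
    .round(2)
)
print(means.to_dict())  # {'IAM': 3.0, 'Logging': 5.0}
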
@@ -74,9 +74,6 @@ gcp_provider_logo = html.Img(
ks8_provider_logo = html.Img(
    src="assets/images/providers/k8s_provider.png", alt="k8s provider"
)
m365_provider_logo = html.Img(
    src="assets/images/providers/m365_provider.png", alt="m365 provider"
)


def load_csv_files(csv_files):
@@ -226,8 +223,6 @@ else:
            accounts.append(account + " - AZURE")
        if "gcp" in list(data[data["ACCOUNT_NAME"] == account]["PROVIDER"]):
            accounts.append(account + " - GCP")
        if "m365" in list(data[data["ACCOUNT_NAME"] == account]["PROVIDER"]):
            accounts.append(account + " - M365")

    if "ACCOUNT_UID" in data.columns:
        for account in data["ACCOUNT_UID"].unique():
@@ -278,8 +273,6 @@ else:
            services.append(service + " - AZURE")
        if "gcp" in list(data[data["SERVICE_NAME"] == service]["PROVIDER"]):
            services.append(service + " - GCP")
        if "m365" in list(data[data["SERVICE_NAME"] == service]["PROVIDER"]):
            services.append(service + " - M365")

    services = ["All"] + services
    services = [
@@ -492,7 +485,6 @@ else:
        Output("azure_card", "children"),
        Output("gcp_card", "children"),
        Output("k8s_card", "children"),
        Output("m365_card", "children"),
        Output("subscribe_card", "children"),
        Output("info-file-over", "title"),
        Output("severity-filter", "value"),
@@ -507,7 +499,6 @@ else:
        Output("azure_card", "n_clicks"),
        Output("gcp_card", "n_clicks"),
        Output("k8s_card", "n_clicks"),
        Output("m365_card", "n_clicks"),
    ],
    Input("cloud-account-filter", "value"),
    Input("region-filter", "value"),
@@ -522,7 +513,6 @@ else:
    Input("azure_card", "n_clicks"),
    Input("gcp_card", "n_clicks"),
    Input("k8s_card", "n_clicks"),
    Input("m365_card", "n_clicks"),
    Input("sort_button_check_name", "n_clicks"),
    Input("sort_button_severity", "n_clicks"),
    Input("sort_button_status", "n_clicks"),
@@ -544,7 +534,6 @@ def filter_data(
|
||||
azure_clicks,
|
||||
gcp_clicks,
|
||||
k8s_clicks,
|
||||
m365_clicks,
|
||||
sort_button_check_name,
|
||||
sort_button_severity,
|
||||
sort_button_status,
|
||||
@@ -565,7 +554,6 @@ def filter_data(
|
||||
azure_clicks = 0
|
||||
gcp_clicks = 0
|
||||
k8s_clicks = 0
|
||||
m365_clicks = 0
|
||||
if azure_clicks > 0:
|
||||
filtered_data = data.copy()
|
||||
if azure_clicks % 2 != 0 and "azure" in list(data["PROVIDER"]):
|
||||
@@ -573,7 +561,6 @@ def filter_data(
|
||||
aws_clicks = 0
|
||||
gcp_clicks = 0
|
||||
k8s_clicks = 0
|
||||
m365_clicks = 0
|
||||
if gcp_clicks > 0:
|
||||
filtered_data = data.copy()
|
||||
if gcp_clicks % 2 != 0 and "gcp" in list(data["PROVIDER"]):
|
||||
@@ -581,7 +568,6 @@ def filter_data(
|
||||
aws_clicks = 0
|
||||
azure_clicks = 0
|
||||
k8s_clicks = 0
|
||||
m365_clicks = 0
|
||||
if k8s_clicks > 0:
|
||||
filtered_data = data.copy()
|
||||
if k8s_clicks % 2 != 0 and "kubernetes" in list(data["PROVIDER"]):
|
||||
@@ -589,15 +575,6 @@ def filter_data(
|
||||
aws_clicks = 0
|
||||
azure_clicks = 0
|
||||
gcp_clicks = 0
|
||||
m365_clicks = 0
|
||||
if m365_clicks > 0:
|
||||
filtered_data = data.copy()
|
||||
if m365_clicks % 2 != 0 and "m365" in list(data["PROVIDER"]):
|
||||
filtered_data = filtered_data[filtered_data["PROVIDER"] == "m365"]
|
||||
aws_clicks = 0
|
||||
azure_clicks = 0
|
||||
gcp_clicks = 0
|
||||
k8s_clicks = 0
|
||||
|
||||
# For all the data, we will add to the status column the value 'MUTED (FAIL)' and 'MUTED (PASS)' depending on the value of the column 'STATUS' and 'MUTED'
|
||||
if "MUTED" in filtered_data.columns:
|
||||
@@ -698,8 +675,6 @@ def filter_data(
|
||||
all_account_names.append(account)
|
||||
if "gcp" in list(data[data["ACCOUNT_NAME"] == account]["PROVIDER"]):
|
||||
all_account_names.append(account)
|
||||
if "m365" in list(data[data["ACCOUNT_NAME"] == account]["PROVIDER"]):
|
||||
all_account_names.append(account)
|
||||
|
||||
all_items = all_account_ids + all_account_names + ["All"]
|
||||
|
||||
@@ -717,8 +692,6 @@ def filter_data(
|
||||
cloud_accounts_options.append(item + " - AZURE")
|
||||
if "gcp" in list(data[data["ACCOUNT_NAME"] == item]["PROVIDER"]):
|
||||
cloud_accounts_options.append(item + " - GCP")
|
||||
if "m365" in list(data[data["ACCOUNT_NAME"] == item]["PROVIDER"]):
|
||||
cloud_accounts_options.append(item + " - M365")
|
||||
|
||||
# Filter ACCOUNT
|
||||
if cloud_account_values == ["All"]:
|
||||
@@ -817,7 +790,6 @@ def filter_data(
|
||||
service_filter_options = ["All"]
|
||||
|
||||
all_items = filtered_data["SERVICE_NAME"].unique()
|
||||
|
||||
for item in all_items:
|
||||
if item not in service_filter_options and item.__class__.__name__ == "str":
|
||||
if "aws" in list(
|
||||
@@ -836,10 +808,6 @@ def filter_data(
|
||||
filtered_data[filtered_data["SERVICE_NAME"] == item]["PROVIDER"]
|
||||
):
|
||||
service_filter_options.append(item + " - GCP")
|
||||
if "m365" in list(
|
||||
filtered_data[filtered_data["SERVICE_NAME"] == item]["PROVIDER"]
|
||||
):
|
||||
service_filter_options.append(item + " - M365")
|
||||
|
||||
# Filter Service
|
||||
if service_values == ["All"]:
|
||||
@@ -1267,10 +1235,6 @@ def filter_data(
|
||||
filtered_data.loc[
|
||||
filtered_data["ACCOUNT_UID"] == account, "ACCOUNT_UID"
|
||||
] = (account + " - GCP")
|
||||
if "m365" in list(data[data["ACCOUNT_UID"] == account]["PROVIDER"]):
|
||||
filtered_data.loc[
|
||||
filtered_data["ACCOUNT_UID"] == account, "ACCOUNT_UID"
|
||||
] = (account + " - M365")
|
||||
|
||||
table_collapsible = []
|
||||
for item in filtered_data.to_dict("records"):
|
||||
@@ -1338,17 +1302,14 @@ def filter_data(
|
||||
k8s_card = create_provider_card(
|
||||
"kubernetes", ks8_provider_logo, "Clusters", full_filtered_data
|
||||
)
|
||||
m365_card = create_provider_card(
|
||||
"m365", m365_provider_logo, "Accounts", full_filtered_data
|
||||
)
|
||||
|
||||
# Subscribe to Prowler Cloud card
|
||||
# Subscribe to prowler SaaS card
|
||||
subscribe_card = [
|
||||
html.Div(
|
||||
html.A(
|
||||
[
|
||||
html.Img(src="assets/favicon.ico", className="w-5 mr-3"),
|
||||
html.Span("Subscribe to Prowler Cloud"),
|
||||
html.Span("Subscribe to prowler SaaS"),
|
||||
],
|
||||
href="https://prowler.pro/",
|
||||
target="_blank",
|
||||
@@ -1385,7 +1346,6 @@ def filter_data(
|
||||
azure_card,
|
||||
gcp_card,
|
||||
k8s_card,
|
||||
m365_card,
|
||||
subscribe_card,
|
||||
list_files,
|
||||
severity_values,
|
||||
@@ -1400,7 +1360,6 @@ def filter_data(
|
||||
azure_clicks,
|
||||
gcp_clicks,
|
||||
k8s_clicks,
|
||||
m365_clicks,
|
||||
)
|
||||
else:
|
||||
return (
|
||||
@@ -1418,7 +1377,6 @@ def filter_data(
|
||||
azure_card,
|
||||
gcp_card,
|
||||
k8s_card,
|
||||
m365_card,
|
||||
subscribe_card,
|
||||
list_files,
|
||||
severity_values,
|
||||
@@ -1433,7 +1391,6 @@ def filter_data(
|
||||
azure_clicks,
|
||||
gcp_clicks,
|
||||
k8s_clicks,
|
||||
m365_clicks,
|
||||
)
|
||||
|
||||
|
||||
|
||||
@@ -130,12 +130,10 @@ Prowler for M365 currently supports the following authentication types:


???+ warning
    For Prowler App only the Service Principal with User Credentials authentication method is supported.
    For Prowler App only the Service Principal with an application authentication method is supported.

### Service Principal authentication

Authentication flag: `--sp-env-auth`

To allow Prowler to assume the service principal identity and start the scan, the following environment variables must be configured:

```console
@@ -145,14 +143,9 @@ export AZURE_TENANT_ID="XXXXXXXXX"
```

If you try to execute Prowler with the `--sp-env-auth` flag and those variables are empty or not exported, the execution will fail.
Follow the instructions in the [Create Prowler Service Principal](../tutorials/microsoft365/getting-started-m365.md#create-the-service-principal-app) section to create a service principal.

With these credentials you will only be able to run the checks that work through MS Graph, which means you won't be running the whole provider. If you want to scan all the M365 checks, you will need to use the recommended authentication method.
Follow the instructions in the [Create Prowler Service Principal](../tutorials/azure/create-prowler-service-principal.md) section to create a service principal.

### Service Principal and User Credentials authentication (recommended)

Authentication flag: `--env-auth`

This authentication method follows the same approach as the service principal method but introduces two additional environment variables for user credentials: `M365_USER` and `M365_ENCRYPTED_PASSWORD`.

```console
@@ -163,322 +156,24 @@ export M365_USER="your_email@example.com"
export M365_ENCRYPTED_PASSWORD="6500780061006d0070006c006500700061007300730077006f0072006400" # replace this with yours
```

These two new environment variables are **required** to execute the PowerShell modules needed to retrieve information from M365 services. Prowler uses Service Principal authentication to access Microsoft Graph and user credentials to authenticate to the Microsoft PowerShell modules.
These two new environment variables are required to execute the PowerShell modules needed to retrieve information from M365 services. Prowler will use service principal authentication to log into MS Graph and user credentials to authenticate to the Microsoft PowerShell modules.

- `M365_USER` should be your Microsoft account email using the default domain. This means it must look like `example@YourCompany.onmicrosoft.com`.
The `M365_USER` should be your Microsoft account email, and `M365_ENCRYPTED_PASSWORD` must be an encrypted SecureString.
To convert your password into a valid encrypted string, run the following commands in PowerShell:

    To ensure that you are using the default domain, you can see how to verify it [here](../tutorials/microsoft365/getting-started-m365.md#step-1-obtain-your-domain).

```console
$securePassword = ConvertTo-SecureString "examplepassword" -AsPlainText -Force
$encryptedPassword = $securePassword | ConvertFrom-SecureString
```

    If you don't have a user created with that domain, Prowler will not work, as it cannot ensure that both the app and the user belong to the same tenant. To proceed, you can either create a new user with that domain or modify the domain of an existing user.

    ![]()

- `M365_ENCRYPTED_PASSWORD` must be an encrypted SecureString. To convert your password into a valid encrypted string, you need to use PowerShell.

???+ warning
    Passwords encrypted using `ConvertTo-SecureString` can only be decrypted in the same OS/user context. If you generate an encrypted password on macOS or Linux (both UNIX), it will fail on Windows, and vice versa. Since Prowler Cloud runs on UNIX, a password generated on Windows won't work there, so you'll need to generate a new one using any UNIX distro (see the example below).

If you are working on Windows and will use your encrypted password on a different system (for example, executing Prowler on macOS or adding your password to Prowler Cloud), you will need to generate a "UNIX compatible" version of your encrypted password. This can be done using WSL, which is easy to install on Windows.

=== "UNIX"

    Open a PowerShell session with a [supported version](requirements.md#supported-powershell-versions) and then run the following commands:

    ```console
    $securePassword = ConvertTo-SecureString "examplepassword" -AsPlainText -Force
    $encryptedPassword = $securePassword | ConvertFrom-SecureString
    Write-Output $encryptedPassword
    6500780061006d0070006c006500700061007300730077006f0072006400
    ```

    If everything is done correctly, you will see the encrypted string that you need to set as the `M365_ENCRYPTED_PASSWORD` environment variable.

=== "Windows"
|
||||
|
||||
|
||||
How to install WSL and PowerShell on it to generate that password (you can use a different distro but this one will work for sure):
|
||||
|
||||
```console
|
||||
wsl --install -d Ubuntu-22.04
|
||||
```
|
||||
|
||||
Then, open the Ubuntu terminal and run the following commands:
|
||||
|
||||
```console
|
||||
sudo apt update && sudo apt install -y wget apt-transport-https software-properties-common
|
||||
wget -q "https://packages.microsoft.com/config/ubuntu/$(lsb_release -rs)/packages-microsoft-prod.deb"
|
||||
sudo dpkg -i packages-microsoft-prod.deb
|
||||
sudo apt update
|
||||
sudo apt install -y powershell
|
||||
pwsh
|
||||
```
|
||||
|
||||
With this done you will see now that a prompt running PowerShell with the latest version is open so here you will be able to generate your encrypted password:
|
||||
|
||||
```console
|
||||
$securePassword = ConvertTo-SecureString "examplepassword" -AsPlainText -Force
|
||||
$encryptedPassword = $securePassword | ConvertFrom-SecureString
|
||||
Write-Output $encryptedPassword
|
||||
6500780061006d0070006c006500700061007300730077006f0072006400
|
||||
```
|
||||
|
||||
If everything is done correctly, you will see the encrypted string that you need to set as the `M365_ENCRYPTED_PASSWORD` environment variable.
|
||||
If everything is done correctly, you will see the encrypted string that you need to set as the `M365_ENCRYPTED_PASSWORD` environment variable.
|
||||
```console
|
||||
Write-Output $encryptedPassword
|
||||
6500780061006d0070006c006500700061007300730077006f0072006400
|
||||
```
|
||||
|
||||
|
||||
|
||||
### Interactive Browser authentication

Authentication flag: `--browser-auth`

This authentication method requires the user to authenticate against Azure using the default browser to start the scan; the `--tenant-id` flag is also required.

With these credentials you will only be able to run the checks that work through MS Graph, which means you won't be running the whole provider. If you want to scan all the M365 checks, you will need to use the recommended authentication method.

Since this is a delegated permission authentication method, the necessary permissions should be given to the user, not the app.


### Needed permissions

Prowler for M365 requires two types of permission scopes to be set (if you want to run the full provider, including the PowerShell checks). Both must be configured using Microsoft Entra ID:

- **Service Principal Application Permissions**: These are set at the **application** level and are used to retrieve data from the identity being assessed:
    - `Directory.Read.All`: Required for all services.
    - `Policy.Read.All`: Required for all services.
    - `User.Read` (IMPORTANT: this must be set as **delegated**): Required for the sign-in.
    - `Sites.Read.All`: Required for the SharePoint service.
    - `SharePointTenantSettings.Read.All`: Required for the SharePoint service.

- **PowerShell Modules Permissions**: These are set at the `M365_USER` level, so the user used to run Prowler must have one of the following roles:
    - `Global Reader` (recommended): this role lets you read everything Prowler needs.
    - `Exchange Administrator` and `Teams Administrator`: the user needs both roles, but with these [roles](https://learn.microsoft.com/en-us/exchange/permissions-exo/permissions-exo#microsoft-365-permissions-in-exchange-online) you can access the same information as a Global Reader (since only read access is needed, Global Reader is recommended).

To learn how to assign those permissions and roles, follow the instructions in the Microsoft Entra ID [permissions](../tutorials/microsoft365/getting-started-m365.md#grant-required-api-permissions) and [roles](../tutorials/microsoft365/getting-started-m365.md#assign-required-roles-to-your-user) sections.


### Supported PowerShell versions

You must have PowerShell installed to run certain M365 checks.
Currently, we support **PowerShell version 7.4 or higher** (7.5 is recommended).

This requirement exists because **PowerShell 5.1** (the version that comes by default on some Windows systems) does not support several cmdlets needed to run the checks properly.
Additionally, earlier [PowerShell Cross-Platform versions](https://learn.microsoft.com/en-us/powershell/scripting/install/powershell-support-lifecycle?view=powershell-7.5) are no longer under technical support, which may cause unexpected errors.
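You can check which version your installation runs with a quick one-liner (`7.5.0` below is just an example output):

```console
pwsh -Command '$PSVersionTable.PSVersion.ToString()'
7.5.0
```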
???+ note
    Installing PowerShell is only needed if you install Prowler from pip or other sources; the SDK and API containers ship with PowerShell installed by default.

Installing PowerShell is different depending on your OS.

- [Windows](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.5#install-powershell-using-winget-recommended): you will need to update PowerShell to 7.4+ to be able to run Prowler; otherwise some checks will not show findings and the provider may not work as expected. This version of PowerShell is [supported](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.4#supported-versions-of-windows) on Windows 10, Windows 11, Windows Server 2016, and higher versions.

    ```console
    winget install --id Microsoft.PowerShell --source winget
    ```

- [MacOS](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-macos?view=powershell-7.5#install-the-latest-stable-release-of-powershell): installing PowerShell on macOS requires [brew](https://brew.sh/); once you have it, just run the command below. Pwsh is only supported on macOS 15 (Sequoia) x64 and Arm64, macOS 14 (Sonoma) x64 and Arm64, and macOS 13 (Ventura) x64 and Arm64.

    ```console
    brew install powershell/tap/powershell
    ```

    Once it's installed, run `pwsh` in your terminal to verify it's working.

- Linux: installing PowerShell on Linux depends on the distro you are using:

    - [Ubuntu](https://learn.microsoft.com/es-es/powershell/scripting/install/install-ubuntu?view=powershell-7.5#installation-via-package-repository-the-package-repository): The supported versions for installing PowerShell 7.4+ on Ubuntu are Ubuntu 22.04 and Ubuntu 24.04. The recommended way to install it is to download the package available on PMC. Just follow these steps:

        ```console
        ###################################
        # Prerequisites

        # Update the list of packages
        sudo apt-get update

        # Install pre-requisite packages.
        sudo apt-get install -y wget apt-transport-https software-properties-common

        # Get the version of Ubuntu
        source /etc/os-release

        # Download the Microsoft repository keys
        wget -q https://packages.microsoft.com/config/ubuntu/$VERSION_ID/packages-microsoft-prod.deb

        # Register the Microsoft repository keys
        sudo dpkg -i packages-microsoft-prod.deb

        # Delete the Microsoft repository keys file
        rm packages-microsoft-prod.deb

        # Update the list of packages after we added packages.microsoft.com
        sudo apt-get update

        ###################################
        # Install PowerShell
        sudo apt-get install -y powershell

        # Start PowerShell
        pwsh
        ```

    - [Alpine](https://learn.microsoft.com/es-es/powershell/scripting/install/install-alpine?view=powershell-7.5#installation-steps): The only supported version for installing PowerShell 7.4+ on Alpine is Alpine 3.20. The only way to install it is to download the tar.gz package available on [PowerShell GitHub](https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-musl-x64.tar.gz). Just follow these steps:

        ```console
        # Install the requirements
        sudo apk add --no-cache \
            ca-certificates \
            less \
            ncurses-terminfo-base \
            krb5-libs \
            libgcc \
            libintl \
            libssl3 \
            libstdc++ \
            tzdata \
            userspace-rcu \
            zlib \
            icu-libs \
            curl

        apk -X https://dl-cdn.alpinelinux.org/alpine/edge/main add --no-cache \
            lttng-ust \
            openssh-client

        # Download the powershell '.tar.gz' archive
        curl -L https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-musl-x64.tar.gz -o /tmp/powershell.tar.gz

        # Create the target folder where powershell will be placed
        sudo mkdir -p /opt/microsoft/powershell/7

        # Expand powershell to the target folder
        sudo tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7

        # Set execute permissions
        sudo chmod +x /opt/microsoft/powershell/7/pwsh

        # Create the symbolic link that points to pwsh
        sudo ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh

        # Start PowerShell
        pwsh
        ```

    - [Debian](https://learn.microsoft.com/es-es/powershell/scripting/install/install-debian?view=powershell-7.5#installation-on-debian-11-or-12-via-the-package-repository): The supported versions for installing PowerShell 7.4+ on Debian are Debian 11 and Debian 12. The recommended way to install it is to download the package available on PMC. Just follow these steps:

        ```console
        ###################################
        # Prerequisites

        # Update the list of packages
        sudo apt-get update

        # Install pre-requisite packages.
        sudo apt-get install -y wget

        # Get the version of Debian
        source /etc/os-release

        # Download the Microsoft repository GPG keys
        wget -q https://packages.microsoft.com/config/debian/$VERSION_ID/packages-microsoft-prod.deb

        # Register the Microsoft repository GPG keys
        sudo dpkg -i packages-microsoft-prod.deb

        # Delete the Microsoft repository GPG keys file
        rm packages-microsoft-prod.deb

        # Update the list of packages after we added packages.microsoft.com
        sudo apt-get update

        ###################################
        # Install PowerShell
        sudo apt-get install -y powershell

        # Start PowerShell
        pwsh
        ```

    - [RHEL](https://learn.microsoft.com/es-es/powershell/scripting/install/install-rhel?view=powershell-7.5#installation-via-the-package-repository): The supported versions for installing PowerShell 7.4+ on Red Hat are RHEL 8 and RHEL 9. The recommended way to install it is to download the package available on PMC. Just follow these steps:

        ```console
        ###################################
        # Prerequisites

        # Get the version of RHEL
        source /etc/os-release
        if [ ${VERSION_ID%.*} -lt 8 ]
        then majorver=7
        elif [ ${VERSION_ID%.*} -lt 9 ]
        then majorver=8
        else majorver=9
        fi

        # Download the Microsoft RedHat repository package
        curl -sSL -O https://packages.microsoft.com/config/rhel/$majorver/packages-microsoft-prod.rpm

        # Register the Microsoft RedHat repository
        sudo rpm -i packages-microsoft-prod.rpm

        # Delete the downloaded package after installing
        rm packages-microsoft-prod.rpm

        # Update package index files
        sudo dnf update

        # Install PowerShell
        sudo dnf install powershell -y
        ```

- [Docker](https://learn.microsoft.com/es-es/powershell/scripting/install/powershell-in-docker?view=powershell-7.5#use-powershell-in-a-container): The following command downloads the latest stable version of PowerShell:

    ```console
    docker pull mcr.microsoft.com/dotnet/sdk:9.0
    ```

    To start an interactive Pwsh shell, just run:

    ```console
    docker run -it mcr.microsoft.com/dotnet/sdk:9.0 pwsh
    ```

### Needed PowerShell modules

To obtain the required data for this provider, we use several PowerShell cmdlets.
These cmdlets come from different modules that must be installed.

The installation of these modules is performed automatically if you run Prowler with the flag `--init-modules`. This is an example of running Prowler and installing the modules at the same time:

```console
python3 prowler-cli.py m365 --verbose --log-level ERROR --env-auth --init-modules
```

If you already have the modules installed, using the flag is not a problem, since Prowler automatically checks whether the needed modules are already present.

???+ note
    Prowler installs the modules using `-Scope CurrentUser`.
    If you encounter any issues with services not working after the automatic installation, try installing the modules manually using `-Scope AllUsers` (administrator permissions are required for this).
    The command needed to install a module manually is:
    ```powershell
    Install-Module -Name "ModuleName" -Scope AllUsers -Force
    ```

The required modules are:

- [ExchangeOnlineManagement](https://www.powershellgallery.com/packages/ExchangeOnlineManagement/3.6.0): Minimum version 3.6.0. Required for several checks across Exchange, Defender, and Purview.
- [MicrosoftTeams](https://www.powershellgallery.com/packages/MicrosoftTeams/6.6.0): Minimum version 6.6.0. Required for all Teams checks.
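If you prefer to preinstall them yourself, a minimal sketch (module names and minimum versions taken from the list above) could look like this:

```powershell
# Install the modules Prowler needs for the PowerShell-based M365 checks
Install-Module -Name ExchangeOnlineManagement -MinimumVersion 3.6.0 -Scope CurrentUser -Force
Install-Module -Name MicrosoftTeams -MinimumVersion 6.6.0 -Scope CurrentUser -Force
```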
## GitHub
### Authentication

Prowler supports multiple methods to [authenticate with GitHub](https://docs.github.com/en/rest/authentication/authenticating-to-the-rest-api). These include:

- **Personal Access Token (PAT)**
- **OAuth App Token**
- **GitHub App Credentials**

This flexibility allows you to scan and analyze your GitHub account, including repositories, organizations, and applications, using the method that best suits your use case.

The provided credentials must have the appropriate permissions to perform all the required checks.

???+ note
    GitHub App Credentials support fewer checks than the other authentication methods.
To use `--browser-auth` the user needs to authenticate against Azure using the default browser to start the scan; the `--tenant-id` flag is also required.

@@ -53,7 +53,7 @@ Prowler App can be installed in different ways, depending on your environment:

You can change the environment variables in the `.env` file. Note that it is not recommended to use the default values in production environments.

???+ note
    There is a development mode available: you can use the file https://github.com/prowler-cloud/prowler/blob/master/docker-compose-dev.yml to run the app in development mode.
    There is a development mode available: you can use the file https://github.com/prowler-cloud/prowler/blob/master/docker-compose.dev.yml to run the app in development mode.

???+ warning
    Google and GitHub authentication is only available in [Prowler Cloud](https://prowler.com).
@@ -565,16 +565,11 @@ kubectl logs prowler-XXXXX --namespace prowler-ns

???+ note
    By default, `prowler` will scan all namespaces in your active Kubernetes context. Use the flag `--context` to specify the context to be scanned and `--namespaces` to specify the namespaces to be scanned.
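For instance, combining both flags (the context and namespace names here are illustrative):

```console
prowler kubernetes --context my-cluster-context --namespaces kube-system default
```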
#### Microsoft 365

With M365 you need to specify which authentication method is going to be used:

```console

# To use both service principal (for MS Graph) and user credentials (for PowerShell modules)
prowler m365 --env-auth

# To use service principal authentication
prowler m365 --sp-env-auth

@@ -586,31 +581,7 @@ prowler m365 --browser-auth --tenant-id "XXXXXXXX"

```

See more details about M365 Authentication in [Requirements](getting-started/requirements.md#microsoft-365)

#### GitHub

Prowler allows you to scan your GitHub account, including your repositories, organizations or applications.

There are several supported login methods:

```console
# Personal Access Token (PAT):
prowler github --personal-access-token pat

# OAuth App Token:
prowler github --oauth-app-token oauth_token

# GitHub App Credentials:
prowler github --github-app-id app_id --github-app-key app_key
```

???+ note
    If no login method is explicitly provided, Prowler will automatically attempt to authenticate using environment variables in the following order of precedence (see the example after this list):

    1. `GITHUB_PERSONAL_ACCESS_TOKEN`
    2. `OAUTH_APP_TOKEN`
    3. `GITHUB_APP_ID` and `GITHUB_APP_KEY`
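For example, with a PAT exported in the environment (the token value below is a placeholder), no login flag is needed:

```console
export GITHUB_PERSONAL_ACCESS_TOKEN="ghp_xxxxxxxxxxxxxxxx"
prowler github
```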
See more details about M365 Authentication in [Requirements](getting-started/requirements/#microsoft-365)

## Prowler v2 Documentation
For **Prowler v2 Documentation**, please check it out [here](https://github.com/prowler-cloud/prowler/blob/8818f47333a0c1c1a457453c87af0ea5b89a385f/README.md).

@@ -1,14 +1,14 @@
# Getting Started with AWS on Prowler Cloud/App
# Getting Started with AWS on Prowler Cloud

<iframe width="560" height="380" src="https://www.youtube-nocookie.com/embed/RPgIWOCERzY" title="Prowler Cloud Onboarding AWS" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>

Set up your AWS account to enable security scanning using Prowler Cloud/App.
Set up your AWS account to enable security scanning using Prowler Cloud.

## Requirements

To configure your AWS account, you’ll need:

1. Access to Prowler Cloud/App
1. Access to Prowler Cloud
2. Properly configured AWS credentials (either static or via an assumed IAM role)

---
@@ -22,9 +22,9 @@ To configure your AWS account, you’ll need:

---

## Step 2: Access Prowler Cloud/App
## Step 2: Access Prowler Cloud

1. Navigate to [Prowler Cloud](https://cloud.prowler.com/) or launch [Prowler App](../prowler-app.md)
1. Navigate to [Prowler Cloud](https://cloud.prowler.com/)
2. Go to `Configuration` > `Cloud Providers`

![]()
@@ -117,7 +117,7 @@ This method grants permanent access and is the recommended setup for production
    terraform apply
    ```

2. During `plan` and `apply`, you will be prompted for the **External ID**, which is available in the Prowler Cloud/App UI:
2. During `plan` and `apply`, you will be prompted for the **External ID**, which is available in the Prowler Cloud UI:

![]()

@@ -135,7 +135,7 @@ This method grants permanent access and is the recommended setup for production

![]()

10. Paste the ARN into the corresponding field in Prowler Cloud/App
10. Paste the ARN into the corresponding field in Prowler Cloud

![]()

@@ -171,7 +171,7 @@ You can also configure your AWS account using static credentials (not recommende

![]()

> ⚠️ Save these credentials securely and paste them into the Prowler Cloud/App setup screen.
> ⚠️ Save these credentials securely and paste them into the Prowler Cloud setup screen.

=== "Short term credentials (Recommended)"

@@ -203,9 +203,9 @@ You can also configure your AWS account using static credentials (not recommende
    }
    ```

> ⚠️ Save these credentials securely and paste them into the Prowler Cloud/App setup screen.
> ⚠️ Save these credentials securely and paste them into the Prowler Cloud setup screen.

Complete the form in Prowler Cloud/App and click `Next`
Complete the form in Prowler Cloud and click `Next`

![]()

@@ -1,15 +1,15 @@
# Getting Started with Azure on Prowler Cloud/App
# Getting Started with Azure on Prowler Cloud

<iframe width="560" height="380" src="https://www.youtube-nocookie.com/embed/v1as8vTFlMg" title="Prowler Cloud Onboarding Azure" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>

Set up your Azure subscription to enable security scanning using Prowler Cloud/App.
Set up your Azure subscription to enable security scanning using Prowler Cloud.

## Requirements

To configure your Azure subscription, you’ll need:

1. Get the `Subscription ID`
2. Access to Prowler Cloud/App
2. Access to Prowler Cloud
3. Configure authentication in Azure:

    3.1 Create a Service Principal
@@ -18,7 +18,7 @@ To configure your Azure subscription, you’ll need:

    3.3 Assign permissions at the subscription level

4. Add the credentials to Prowler Cloud/App
4. Add the credentials to Prowler Cloud

---

@@ -32,9 +32,9 @@ To configure your Azure subscription, you’ll need:

---

## Step 2: Access Prowler Cloud/App
## Step 2: Access Prowler Cloud

1. Go to [Prowler Cloud](https://cloud.prowler.com/) or launch [Prowler App](../prowler-app.md)
1. Go to [Prowler Cloud](https://cloud.prowler.com/)
2. Navigate to `Configuration` > `Cloud Providers`

![]()
@@ -148,13 +148,13 @@ Assign the following Microsoft Graph permissions:

---

## Step 4: Add Credentials to Prowler Cloud/App
## Step 4: Add Credentials to Prowler Cloud

1. Go to your App Registration overview and copy the `Client ID` and `Tenant ID`

![]()

2. Go to Prowler Cloud/App and paste:
2. Go to Prowler Cloud and paste:

    - `Client ID`
    - `Tenant ID`

@@ -23,7 +23,55 @@ In order to see which compliance frameworks are cover by Prowler, you can use op
prowler <provider> --list-compliance
```

The full and updated list of supported compliance frameworks for each provider is available at [Prowler Hub](https://hub.prowler.com/compliance).
### AWS

- `aws_account_security_onboarding_aws`
- `aws_audit_manager_control_tower_guardrails_aws`
- `aws_foundational_security_best_practices_aws`
- `aws_foundational_technical_review_aws`
- `aws_well_architected_framework_reliability_pillar_aws`
- `aws_well_architected_framework_security_pillar_aws`
- `cis_1.4_aws`
- `cis_1.5_aws`
- `cis_2.0_aws`
- `cis_3.0_aws`
- `cisa_aws`
- `ens_rd2022_aws`
- `fedramp_low_revision_4_aws`
- `fedramp_moderate_revision_4_aws`
- `ffiec_aws`
- `gdpr_aws`
- `gxp_21_cfr_part_11_aws`
- `gxp_eu_annex_11_aws`
- `hipaa_aws`
- `iso27001_2013_aws`
- `kisa_isms_p_2023_aws`
- `kisa_isms_p_2023_korean_aws`
- `mitre_attack_aws`
- `nist_800_171_revision_2_aws`
- `nist_800_53_revision_4_aws`
- `nist_800_53_revision_5_aws`
- `nist_csf_1.1_aws`
- `pci_3.2.1_aws`
- `rbi_cyber_security_framework_aws`
- `soc2_aws`

### Azure

- `cis_2.0_azure`
- `cis_2.1_azure`
- `ens_rd2022_azure`
- `mitre_attack_azure`

### GCP

- `cis_2.0_gcp`
- `ens_rd2022_gcp`
- `mitre_attack_gcp`

### Kubernetes

- `cis_1.8_kubernetes`

## List Requirements of Compliance Frameworks
For each compliance framework, you can use the option `--list-compliance-requirements` to list its requirements (see the example below):
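For instance, a sketch using one of the AWS frameworks listed above:

```console
prowler aws --list-compliance-requirements cis_3.0_aws
```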
@@ -106,7 +106,6 @@ The following list includes all the Microsoft 365 checks with configurable varia
|---------------------------------------------------------------|--------------------------------------------------|-----------------|
| `entra_admin_users_sign_in_frequency_enabled`                 | `sign_in_frequency`                              | Integer         |
| `teams_external_file_sharing_restricted`                      | `allowed_cloud_storage_services`                 | List of Strings |
| `exchange_organization_mailtips_enabled`                      | `recommended_mailtips_large_audience_threshold`  | Integer         |


## Config YAML File Structure
@@ -521,9 +520,5 @@ m365:
      #"allow_google_drive",
      #"allow_share_file",
    ]
  # Exchange Organization Settings
  # m365.exchange_organization_mailtips_enabled
  recommended_mailtips_large_audience_threshold: 25 # maximum number of recipients


```

@@ -1,20 +1,20 @@
# Getting Started with GCP on Prowler Cloud/App
# Getting Started with GCP on Prowler Cloud

<iframe width="560" height="380" src="https://www.youtube-nocookie.com/embed/v1as8vTFlMg" title="Prowler Cloud Onboarding GCP" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>

Set up your GCP project to enable security scanning using Prowler Cloud/App.
Set up your GCP project to enable security scanning using Prowler Cloud.

## Requirements

To configure your GCP project, you’ll need:

1. Get the `Project ID`
2. Access to Prowler Cloud/App
2. Access to Prowler Cloud
3. Configure authentication in GCP:

    3.1 Retrieve credentials from Google Cloud

4. Add the credentials to Prowler Cloud/App
4. Add the credentials to Prowler Cloud

---

@@ -27,9 +27,9 @@ To configure your GCP project, you’ll need:

---

## Step 2: Access Prowler Cloud/App
## Step 2: Access Prowler Cloud

1. Go to [Prowler Cloud](https://cloud.prowler.com/) or launch [Prowler App](../prowler-app.md)
1. Go to [Prowler Cloud](https://cloud.prowler.com/)
2. Navigate to `Configuration` > `Cloud Providers`

![]()
@@ -86,7 +86,7 @@ To configure your GCP project, you’ll need:

![]()

8. Extract the following values for Prowler Cloud/App:
8. Extract the following values for Prowler Cloud:

    - `client_id`
    - `client_secret`
@@ -96,9 +96,9 @@ To configure your GCP project, you’ll need:

---

## Step 4: Add Credentials to Prowler Cloud/App
## Step 4: Add Credentials to Prowler Cloud

1. Go back to Prowler Cloud/App and enter the required credentials, then click `Next`
1. Go back to Prowler Cloud and enter the required credentials, then click `Next`

![]()

@@ -1,44 +0,0 @@
# GitHub Authentication

Prowler supports multiple methods to [authenticate with GitHub](https://docs.github.com/en/rest/authentication/authenticating-to-the-rest-api). These include:

- **Personal Access Token (PAT)**
- **OAuth App Token**
- **GitHub App Credentials**

This flexibility allows you to scan and analyze your GitHub account, including repositories, organizations, and applications, using the method that best suits your use case.

## Supported Login Methods

Here are the available login methods and their respective flags:

### Personal Access Token (PAT)
Use this method by providing your personal access token directly.

```console
prowler github --personal-access-token pat
```

### OAuth App Token
Authenticate using an OAuth app token.

```console
prowler github --oauth-app-token oauth_token
```

### GitHub App Credentials
Use GitHub App credentials by specifying the App ID and the private key.

```console
prowler github --github-app-id app_id --github-app-key app_key
```

### Automatic Login Method Detection
If no login method is explicitly provided, Prowler will automatically attempt to authenticate using environment variables in the following order of precedence:

1. `GITHUB_PERSONAL_ACCESS_TOKEN`
2. `OAUTH_APP_TOKEN`
3. `GITHUB_APP_ID` and `GITHUB_APP_KEY`

???+ note
    If you don't plan to specify the login method, ensure the corresponding environment variables are set before running Prowler so the method can be detected automatically.
@@ -20,19 +20,3 @@ kubectl logs prowler-XXXXX --namespace prowler-ns

???+ note
    By default, `prowler` will scan all namespaces in your active Kubernetes context. Use the [`--namespace`](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/kubernetes/namespace/) flag to specify the namespace(s) to be scanned.

???+ tip "Identifying the cluster in reports"
    When running in in-cluster mode, the Kubernetes API does not expose the actual cluster name by default.

    To uniquely identify the cluster in logs and reports, you can:

    - Use the `--cluster-name` flag to manually set the cluster name:
      ```bash
      prowler -p kubernetes --cluster-name production-cluster
      ```
    - Or set the `CLUSTER_NAME` environment variable:
      ```yaml
      env:
        - name: CLUSTER_NAME
          value: production-cluster
      ```

@@ -1,205 +0,0 @@
# Getting Started with M365 on Prowler Cloud/App

Set up your M365 account to enable security scanning using Prowler Cloud/App.

## Requirements

To configure your M365 account, you’ll need:

1. Obtain your `Default Domain` from the Entra ID portal.

2. Access Prowler Cloud/App and add a new cloud provider `Microsoft 365`.

3. Configure your M365 account:

    3.1 Create the Service Principal app.

    3.2 Grant the required API permissions.

    3.3 Assign the required roles to your user.

    3.4 Retrieve your encrypted password.

4. Add the credentials to Prowler Cloud/App.

## Step 1: Obtain your Domain

Go to the Entra ID portal, then either search for `Domain` or go to Identity > Settings > Domain Names.

![]()

<br>

![]()

Once you are there, just look for the `Default Domain`; it should be something similar to `YourCompany.onmicrosoft.com`. To ensure that you are picking the correct domain, click on it and verify that the type is `Initial` and that it can't be deleted.

![]()

---

## Step 2: Access Prowler Cloud/App

1. Go to [Prowler Cloud](https://cloud.prowler.com/) or launch [Prowler App](../prowler-app.md)
2. Navigate to `Configuration` > `Cloud Providers`

![]()

3. Click on `Add Cloud Provider`

![]()

4. Select `Microsoft 365`

![]()

5. Add the Domain ID and an optional alias, then click `Next`

![]()

---

## Step 3: Configure your M365 account


### Create the Service Principal app

A Service Principal is required to grant Prowler the necessary privileges.

1. Access **Microsoft Entra ID**

![]()

2. Navigate to `Applications` > `App registrations`

![]()

3. Click `+ New registration`, complete the form, and click `Register`

![]()

4. Go to `Certificates & secrets` > `+ New client secret`

![]()

5. Fill in the required fields and click `Add`, then copy the generated value (that value will be `AZURE_CLIENT_SECRET`)

![]()

With this done you will have all the needed keys, summarized in the following table:

| Value | Description |
|-------|-------------|
| Client ID | Application (client) ID |
| Client Secret | AZURE_CLIENT_SECRET |
| Tenant ID | Directory (tenant) ID |

---

### Grant required API permissions

Assign the following Microsoft Graph permissions:

- `Directory.Read.All`: Required for all services.
- `Policy.Read.All`: Required for all services.
- `User.Read` (IMPORTANT: this is set as **delegated**): Required for the sign-in.
- `Sites.Read.All`: Required for the SharePoint service.
- `SharePointTenantSettings.Read.All`: Required for the SharePoint service.

Follow these steps to assign the permissions:

1. Go to your App Registration > select the Prowler App created before > click on `API permissions`

![]()

2. Click `+ Add a permission` > `Microsoft Graph` > `Application permissions`

![]()

3. Search for and select every permission below; once all are selected, click on `Add permissions`:

    - `Directory.Read.All`
    - `Policy.Read.All`
    - `Sites.Read.All`
    - `SharePointTenantSettings.Read.All`

![]()

4. Click `Add permissions`, then grant admin consent

![]()

5. Click `+ Add a permission` > `Microsoft Graph` > `Delegated permissions`

![]()

6. Search for and select:

    - `User.Read`

![]()

7. Click `Add permissions`, then grant admin consent

![]()

---

### Assign required roles to your user

Assign one of the following roles to your user:

- `Global Reader` (recommended): this role lets you read everything Prowler needs.
- `Exchange Administrator` and `Teams Administrator`: the user needs both roles, but with these [roles](https://learn.microsoft.com/en-us/exchange/permissions-exo/permissions-exo#microsoft-365-permissions-in-exchange-online) you can access the same information as a Global Reader (only read access is needed here, which is why we recommend that role).

Follow these steps to assign the role:

1. Go to Users > All Users > click on the email of the user you will use

![]()

2. Click `Assigned Roles`

![]()

3. Click on `Add assignments`, then search and select:

    - `Global Reader` (this is the recommended role; if you want to use the others, just search for them)

![]()

4. Click on next, then assign the role as `Active`, and click on `Assign` to grant admin consent

![]()

---

### Get your encrypted password

For this step you will need to use PowerShell to create your encrypted password, based on the password of the user you are going to use. For more information about how to generate this password, go [here](../../getting-started/requirements.md#service-principal-and-user-credentials-authentication-recommended) and follow the steps needed to obtain `M365_ENCRYPTED_PASSWORD`.
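As a quick reminder, the core of that procedure is the two PowerShell commands shown in the requirements page ("examplepassword" is a placeholder for the user's real password):

```console
$securePassword = ConvertTo-SecureString "examplepassword" -AsPlainText -Force
$encryptedPassword = $securePassword | ConvertFrom-SecureString
```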
---

## Step 4: Add credentials to Prowler Cloud/App

1. Go to your App Registration overview and copy the `Client ID` and `Tenant ID`

![]()

2. Go to Prowler Cloud/App and paste:

    - `Client ID`
    - `Tenant ID`
    - `AZURE_CLIENT_SECRET` from earlier
    - `M365_USER`: your user using the default domain, more info [here](../../getting-started/requirements.md#service-principal-and-user-credentials-authentication-recommended)
    - `M365_ENCRYPTED_PASSWORD` generated before

![]()

3. Click `Next`

![]()

4. Click `Launch Scan`

![]()