Compare commits


195 Commits
3.2.0 ... 3.3.2
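
This listing can be reproduced locally with plain git; a minimal sketch, assuming a local clone of prowler-cloud/prowler with both release tags fetched:

git log --oneline 3.2.0..3.3.2     # the 195 commits listed below
git diff --shortstat 3.2.0 3.3.2   # the changed-files / additions / deletions summary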

Author SHA1 Message Date
github-actions
0fa725d2d0 chore(release): 3.3.2 2023-03-30 08:18:37 +00:00
Pepe Fagoaga
2580d820ec fix(pypi): Build from release branch (#2151) 2023-03-30 10:16:03 +02:00
Pepe Fagoaga
3f3888f499 chore(poetry): lock 2023-03-30 08:22:25 +02:00
Sergio Garcia
b20a532ab5 chore(regions_update): Changes in regions for AWS services. (#2149)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-30 08:20:49 +02:00
Pepe Fagoaga
b11c9991a3 fix(ssm): Handle ValidationException when retrieving documents (#2146) 2023-03-30 08:20:49 +02:00
Nacho Rivera
fd9cbe6ba3 fix(audit_info): azure subscriptions parsing error (#2147) 2023-03-30 08:20:49 +02:00
Nacho Rivera
227b1db493 fix(delete check): delete check ec2_securitygroup_in_use_without_ingress_filtering (#2148) 2023-03-30 08:20:49 +02:00
Sergio Garcia
10d8c12913 chore(regions_update): Changes in regions for AWS services. (#2145)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-30 08:20:49 +02:00
dependabot[bot]
d9a1cc333e build(deps): bump botocore from 1.29.90 to 1.29.100 (#2142)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-30 08:20:49 +02:00
dependabot[bot]
d1670c8347 build(deps): bump mkdocs-material from 9.1.3 to 9.1.4 (#2140)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-30 08:20:49 +02:00
Nacho Rivera
b03bc8fc0b fix(azure output): change default values of audit identity metadata (#2144) 2023-03-30 08:20:49 +02:00
dependabot[bot]
79c975ab9c build(deps): bump pydantic from 1.10.6 to 1.10.7 (#2139)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-30 08:20:49 +02:00
dependabot[bot]
68cf285fb9 build(deps): bump alive-progress from 3.0.1 to 3.1.0 (#2138)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-30 08:20:47 +02:00
Sergio Garcia
36ba0ffb1f chore(docs): Developer Guide - how to create a new check (#2137) 2023-03-30 08:20:22 +02:00
Pepe Fagoaga
d2b1818700 fix(resource_not_found): Handle error (#2136) 2023-03-30 08:20:22 +02:00
Nacho Rivera
2df02acbd8 feat(defender service): retrieving key dicts with get (#2129) 2023-03-30 08:20:20 +02:00
Pepe Fagoaga
7fd11b2f0e fix(s3): handle if ignore_public_acls is None (#2128) 2023-03-30 08:19:42 +02:00
Sergio Garcia
47bab87603 fix(brew): move brew formula action to the bottom (#2135) 2023-03-30 08:18:52 +02:00
Pepe Fagoaga
1835e1df98 fix(aws_provider): Fix assessment session name (#2132) 2023-03-30 08:18:52 +02:00
Sergio Garcia
dc88b033a0 chore(regions_update): Changes in regions for AWS services. (#2126)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-30 08:18:52 +02:00
Sergio Garcia
0f784e34db chore(regions_update): Changes in regions for AWS services. (#2123)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-30 08:18:52 +02:00
Ben Nugent
698f299edf fix(quickinventory): AttributError when creating inventory table (#2122) 2023-03-30 08:18:52 +02:00
Sergio Garcia
172bb36dfb fix(output bucket): solve IsADirectoryError using compliance flag (#2121) 2023-03-30 08:18:52 +02:00
Sergio Garcia
4b1e6c7493 chore(regions_update): Changes in regions for AWS services. (#2120)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-30 08:18:52 +02:00
Sergio Garcia
6d5c277c73 docs: improve reporting documentation (#2119) 2023-03-30 08:18:52 +02:00
Sergio Garcia
2d4b7ce499 docs: improve quick inventory section (#2117) 2023-03-30 08:18:52 +02:00
Toni de la Fuente
3ee3a3b2f1 docs(developer-guide): added phase 1 of the developer guide (#1904)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2023-03-30 08:18:52 +02:00
Pepe Fagoaga
e9e6ff0e24 docs: Remove list severities (#2116) 2023-03-30 08:18:52 +02:00
Sergio Garcia
e0c6de76d4 chore(version): check latest version (#2106) 2023-03-30 08:18:52 +02:00
dependabot[bot]
5c2a411982 build(deps): bump boto3 from 1.26.86 to 1.26.90 (#2114)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-21 11:04:26 +01:00
Sergio Garcia
08d65cbc41 chore(regions_update): Changes in regions for AWS services. (#2115)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-21 11:03:54 +01:00
dependabot[bot]
9d2bf429c1 build(deps): bump mkdocs-material from 9.1.2 to 9.1.3 (#2113)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-21 10:18:36 +01:00
dependabot[bot]
d34f863bd4 build(deps-dev): bump moto from 4.1.4 to 4.1.5 (#2111)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2023-03-21 09:27:44 +01:00
Sergio Garcia
b4abf1c2c7 chore(regions_update): Changes in regions for AWS services. (#2104)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-21 08:32:26 +01:00
dependabot[bot]
68baaf589e build(deps-dev): bump coverage from 7.2.1 to 7.2.2 (#2112)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-21 08:18:47 +01:00
dependabot[bot]
be74e41d84 build(deps-dev): bump openapi-spec-validator from 0.5.5 to 0.5.6 (#2110)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-21 07:52:50 +01:00
Sergio Garcia
848122b0ec chore(release): update Prowler Version to 3.3.0 (#2102)
Co-authored-by: github-actions <noreply@github.com>
2023-03-16 22:30:02 +01:00
Nacho Rivera
0edcb7c0d9 fix(ulimit check): try except when checking ulimit (#2096)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2023-03-16 17:39:46 +01:00
Pepe Fagoaga
cc58e06b5e fix(providers): Move provider's logic outside main (#2043)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2023-03-16 17:32:53 +01:00
Sergio Garcia
0d6ca606ea fix(ec2_securitygroup_allow_wide_open_public_ipv4): correct check title (#2101) 2023-03-16 17:25:32 +01:00
Sergio Garcia
75ee93789f chore(regions_update): Changes in regions for AWS services. (#2095)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-16 17:14:40 +01:00
Sergio Garcia
05daddafbf feat(SecurityHub): add compliance details to Security Hub findings (#2100) 2023-03-16 17:11:55 +01:00
Nacho Rivera
7bbce6725d fix(ulimit check): test only when platform is not windows (#2094) 2023-03-16 08:38:37 +01:00
Nacho Rivera
789b211586 feat(lambda_cloudtrail check): improved logic and status extended (#2092) 2023-03-15 12:32:58 +01:00
Sergio Garcia
826a043748 chore(regions_update): Changes in regions for AWS services. (#2091)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-15 12:28:03 +01:00
Sergio Garcia
6761048298 fix(cloudwatch): solve inexistent filterPattern error (#2087) 2023-03-14 14:46:34 +01:00
Sergio Garcia
738fc9acad feat(compliance): add compliance field to HTML, CSV and JSON outputs including frameworks and reqs (#2060)
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2023-03-14 14:20:46 +01:00
Sergio Garcia
43c0540de7 chore(regions_update): Changes in regions for AWS services. (#2085)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-14 13:11:02 +01:00
Sergio Garcia
2d1c3d8121 fix(emr): solve emr_cluster_publicly_accesible error (#2086) 2023-03-14 13:10:21 +01:00
dependabot[bot]
f48a5c650d build(deps-dev): bump pytest-xdist from 3.2.0 to 3.2.1 (#2084)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-14 10:21:17 +01:00
dependabot[bot]
66c18eddb8 build(deps): bump botocore from 1.29.86 to 1.29.90 (#2083)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-14 10:01:23 +01:00
dependabot[bot]
fdd2ee6365 build(deps-dev): bump bandit from 1.7.4 to 1.7.5 (#2082)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-14 09:03:46 +01:00
dependabot[bot]
c207f60ad8 build(deps): bump pydantic from 1.10.5 to 1.10.6 (#2081)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-14 08:02:28 +01:00
dependabot[bot]
0eaa95c8c0 build(deps): bump mkdocs-material from 9.1.1 to 9.1.2 (#2080)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-14 07:48:02 +01:00
Pepe Fagoaga
df2fca5935 fix(bug_report): typo in bug reporting template (#2078)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2023-03-13 18:42:34 +01:00
Toni de la Fuente
dcaf5d9c7d update(docs): update readme with new ECR alias (#2079) 2023-03-13 18:07:51 +01:00
Sergio Garcia
0112969a97 fix(compliance): add check to 2.1.5 CIS (#2077) 2023-03-13 09:25:51 +01:00
Sergio Garcia
3ec0f3d69c chore(regions_update): Changes in regions for AWS services. (#2075)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-13 07:51:13 +01:00
Pepe Fagoaga
5555d300a1 fix(bug_report): Update wording (#2074) 2023-03-10 12:21:51 +01:00
Nacho Rivera
8155ef4b60 feat(templates): New versions of issues and fr templates (#2072) 2023-03-10 10:32:17 +01:00
Sergio Garcia
a12402f6c8 chore(regions_update): Changes in regions for AWS services. (#2073)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-10 10:27:29 +01:00
Sergio Garcia
cf28b814cb fix(ec2): avoid terminated instances (#2063) 2023-03-10 08:11:35 +01:00
Pepe Fagoaga
b05f67db19 chore(actions): Missing cache in the PR (#2067) 2023-03-09 11:50:49 +01:00
Pepe Fagoaga
260f4659d5 chore(actions): Use GHA cache (#2066) 2023-03-09 10:29:16 +01:00
dependabot[bot]
9e700f298c build(deps-dev): bump pylint from 2.16.4 to 2.17.0 (#2062)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-08 15:41:22 +01:00
dependabot[bot]
56510734c4 build(deps): bump boto3 from 1.26.85 to 1.26.86 (#2061)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-08 15:14:18 +01:00
Pepe Fagoaga
3938a4d14e chore(dependabot): Change to weekly (#2057) 2023-03-08 14:41:34 +01:00
Sergio Garcia
fa3b9eeeaf chore(regions_update): Changes in regions for AWS services. (#2058)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-08 14:38:56 +01:00
dependabot[bot]
eb9d6fa25c build(deps): bump botocore from 1.29.85 to 1.29.86 (#2054)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-08 09:57:44 +01:00
Alex Nelson
b53307c1c2 docs: Corrected spelling mistake in multiacount (#2056) 2023-03-08 09:57:08 +01:00
dependabot[bot]
c3fc708a66 build(deps): bump boto3 from 1.26.82 to 1.26.85 (#2053)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-08 09:03:00 +01:00
Sergio Garcia
b34ffbe6d0 feat(inventory): add tags to quick inventory (#2051) 2023-03-07 14:20:50 +01:00
Sergio Garcia
f364315e48 chore(iam): update Prowler permissions (#2050) 2023-03-07 14:14:31 +01:00
Sergio Garcia
3ddb5a13a5 fix(ulimit): handle low ulimit OSError (#2042)
Co-authored-by: Toni de la Fuente <toni@blyx.com>
2023-03-07 13:19:24 +01:00
dependabot[bot]
a24cc399a4 build(deps-dev): bump moto from 4.1.3 to 4.1.4 (#2045)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2023-03-07 12:45:50 +01:00
Sergio Garcia
305f4b2688 chore(regions_update): Changes in regions for AWS services. (#2049)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-07 11:27:28 +01:00
dependabot[bot]
9823171d65 build(deps-dev): bump pylint from 2.16.3 to 2.16.4 (#2048)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-07 10:11:19 +01:00
dependabot[bot]
4761bd8fda build(deps): bump mkdocs-material from 9.1.0 to 9.1.1 (#2047)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-07 09:33:19 +01:00
dependabot[bot]
9c22698723 build(deps-dev): bump pytest from 7.2.1 to 7.2.2 (#2046)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-07 08:32:19 +01:00
dependabot[bot]
e3892bbcc6 build(deps): bump botocore from 1.29.84 to 1.29.85 (#2044)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-07 08:18:53 +01:00
Sergio Garcia
629b156f52 fix(quick inventory): add non-tagged s3 buckets to inventory (#2041) 2023-03-06 16:55:03 +01:00
Gary Mclean
c45dd47d34 fix(windows-path): --list-services bad split (#2028)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2023-03-06 14:00:07 +01:00
Sergio Garcia
ef8831f784 feat(quick_inventory): add regions to inventory table (#2026) 2023-03-06 13:41:30 +01:00
Sergio Garcia
c5a42cf5de feat(rds_instance_transport_encrypted): add new check (#1963)
Co-authored-by: Toni de la Fuente <toni@blyx.com>
2023-03-06 13:18:41 +01:00
dependabot[bot]
90ebbfc20f build(deps-dev): bump pylint from 2.16.2 to 2.16.3 (#2038)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-06 13:18:26 +01:00
Fennerr
17cd0dc91d feat(new_check): cloudwatch_log_group_no_secrets_in_logs (#1980)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
Co-authored-by: Jeffrey Souza <JeffreySouza@users.noreply.github.com>
2023-03-06 12:16:46 +01:00
dependabot[bot]
fa1f42af59 build(deps): bump botocore from 1.29.82 to 1.29.84 (#2037)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-06 12:14:48 +01:00
Sergio Garcia
f45ea1ab53 fix(check): change cloudformation_outputs_find_secrets name (#2027) 2023-03-06 12:11:58 +01:00
Sergio Garcia
0dde3fe483 chore(poetry): add poetry checks to pre-commit (#2040) 2023-03-06 11:44:04 +01:00
dependabot[bot]
277dc7dd09 build(deps-dev): bump freezegun from 1.2.1 to 1.2.2 (#2033)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-06 11:06:23 +01:00
dependabot[bot]
3215d0b856 build(deps-dev): bump coverage from 7.1.0 to 7.2.1 (#2032)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-06 09:55:19 +01:00
dependabot[bot]
0167d5efcd build(deps): bump mkdocs-material from 9.0.15 to 9.1.0 (#2031)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-06 09:15:44 +01:00
Sergio Garcia
b48ac808a6 chore(regions_update): Changes in regions for AWS services. (#2035)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-03 10:14:20 +01:00
dependabot[bot]
616524775c build(deps-dev): bump docker from 6.0.0 to 6.0.1 (#2030)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-03 10:02:11 +01:00
dependabot[bot]
5832849b11 build(deps): bump boto3 from 1.26.81 to 1.26.82 (#2029)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-03 09:43:43 +01:00
Sergio Garcia
467c5d01e9 fix(cloudtrail): list tags only in owned trails (#2025) 2023-03-02 16:16:19 +01:00
Sergio Garcia
24711a2f39 feat(tags): add resource tags to S-W services (#2020) 2023-03-02 14:21:05 +01:00
Nacho Rivera
24e8286f35 feat(): 7 chars in dispatch commit message (#2024) 2023-03-02 14:20:31 +01:00
Sergio Garcia
e8a1378ad0 feat(tags): add resource tags to G-R services (#2009) 2023-03-02 13:56:22 +01:00
Sergio Garcia
76bb418ea9 feat(tags): add resource tags to E services (#2007)
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2023-03-02 13:55:26 +01:00
Nacho Rivera
cd8770a3e3 fix(actions): fixed dispatch commit message (#2023) 2023-03-02 13:55:03 +01:00
Sergio Garcia
da834c0935 feat(tags): add resource tags to C-D services (#2003)
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2023-03-02 13:14:53 +01:00
Nacho Rivera
024ffb1117 fix(head): Pass head commit to dispatch action (#2022) 2023-03-02 12:06:41 +01:00
Nacho Rivera
eed7ab9793 fix(iam): refactor IAM service (#2010) 2023-03-02 11:16:05 +01:00
Sergio Garcia
032feb343f feat(tags): add resource tags in A services (#1997) 2023-03-02 10:59:49 +01:00
Pepe Fagoaga
eabccba3fa fix(actions): push should be true (#2019) 2023-03-02 10:37:29 +01:00
Nacho Rivera
d86d656316 feat(dispatch): add tag info to dispatch (#2002) 2023-03-02 10:31:30 +01:00
Sergio Garcia
fa73c91b0b chore(regions_update): Changes in regions for AWS services. (#2018)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-02 10:23:59 +01:00
Pepe Fagoaga
2eee50832d fix(actions): Stop using github storage (#2016) 2023-03-02 10:23:04 +01:00
Toni de la Fuente
b40736918b docs(install): Add brew and github installation to quick start (#1991) 2023-03-02 10:21:57 +01:00
Sergio Garcia
ffb1a2e30f chore(regions_update): Changes in regions for AWS services. (#1995)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-02 10:21:41 +01:00
Sergio Garcia
d6c3c0c6c1 feat(s3_bucket_level_public_access_block): new check (#1953) 2023-03-02 10:18:27 +01:00
dependabot[bot]
ee251721ac build(deps): bump botocore from 1.29.81 to 1.29.82 (#2015)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-02 09:53:24 +01:00
dependabot[bot]
fdbb9195d5 build(deps-dev): bump moto from 4.1.2 to 4.1.3 (#2014)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-02 09:23:48 +01:00
dependabot[bot]
c68b08d9af build(deps-dev): bump black from 22.10.0 to 22.12.0 (#2013)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-02 08:59:18 +01:00
dependabot[bot]
3653bbfca0 build(deps-dev): bump flake8 from 5.0.4 to 6.0.0 (#2012)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-02 08:32:41 +01:00
dependabot[bot]
05c7cc7277 build(deps): bump boto3 from 1.26.80 to 1.26.81 (#2011)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-02 07:54:33 +01:00
Sergio Garcia
5670bf099b chore(regions_update): Changes in regions for AWS services. (#2006)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-03-01 10:16:58 +01:00
Nacho Rivera
0c324b0f09 fix(awslambdacloudtrail): include advanced event and all lambdas in check (#1994) 2023-03-01 10:04:06 +01:00
dependabot[bot]
968557e38e build(deps): bump botocore from 1.29.80 to 1.29.81 (#2005)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-01 08:59:54 +01:00
dependabot[bot]
882cdebacb build(deps): bump boto3 from 1.26.79 to 1.26.80 (#2004) 2023-03-01 08:40:41 +01:00
Sergio Garcia
07753e1774 feat(encryption): add new encryption category (#1999) 2023-02-28 13:42:11 +01:00
Pepe Fagoaga
5b984507fc fix(emr): KeyError EmrManagedSlaveSecurityGroup (#2000) 2023-02-28 13:41:58 +01:00
Sergio Garcia
27df481967 chore(metadata): remove tags from metadata (#1998) 2023-02-28 12:27:59 +01:00
dependabot[bot]
0943031f23 build(deps): bump mkdocs-material from 9.0.14 to 9.0.15 (#1993)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-28 11:02:59 +01:00
dependabot[bot]
2d95168de0 build(deps): bump botocore from 1.29.79 to 1.29.80 (#1992)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-28 10:46:25 +01:00
Sergio Garcia
97cae8f92c chore(brew): bump new version to brew (#1990) 2023-02-27 18:07:05 +01:00
github-actions
eb213bac92 chore(release): 3.2.4 2023-02-27 14:25:52 +01:00
Sergio Garcia
8187788b2c fix(pypi-release.yml): create PR before replicating (#1986) 2023-02-27 14:16:53 +01:00
Sergio Garcia
c80e08abce fix(compliance): solve AWS compliance dir path (#1987) 2023-02-27 14:16:17 +01:00
github-actions[bot]
42fd851e5c chore(release): update Prowler Version to 3.2.3 (#1985)
Co-authored-by: github-actions <noreply@github.com>
Co-authored-by: jfagoagas <jfagoagas@users.noreply.github.com>
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2023-02-27 13:59:28 +01:00
Pepe Fagoaga
70e4ebccab chore(codeowners): Update team to OSS (#1984) 2023-02-27 13:31:16 +01:00
Sergio Garcia
140f87c741 chore(readme): add brew stats (#1982) 2023-02-27 13:17:48 +01:00
Pepe Fagoaga
b0d756123e fix(action): Use PathContext to get version changes (#1983) 2023-02-27 13:17:09 +01:00
Pedro Martín González
6188c92916 chore(compliance): implements dynamic handling of available compliance frameworks (#1977)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2023-02-27 10:47:47 +01:00
dependabot[bot]
34c6f96728 build(deps): bump boto3 from 1.26.74 to 1.26.79 (#1981)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-27 09:45:45 +01:00
dependabot[bot]
50fd047c0b build(deps): bump botocore from 1.29.78 to 1.29.79 (#1978)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-27 09:14:29 +01:00
Sergio Garcia
5bcc05b536 chore(regions_update): Changes in regions for AWS services. (#1972)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-02-24 12:10:27 +01:00
Sergio Garcia
ce7d6c8dd5 fix(service errors): solve EMR, VPC and ELBv2 service errors (#1974) 2023-02-24 10:49:54 +01:00
dependabot[bot]
d87a1e28b4 build(deps): bump alive-progress from 2.4.1 to 3.0.1 (#1965)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-24 10:12:52 +01:00
Pepe Fagoaga
227306c572 fix(acm): Fix issues with list-certificates (#1970) 2023-02-24 10:12:38 +01:00
dependabot[bot]
45c2691f89 build(deps): bump mkdocs-material from 8.2.1 to 9.0.14 (#1964)
Signed-off-by: dependabot[bot] <support@github.com>
2023-02-24 10:03:52 +01:00
Pepe Fagoaga
d0c81245b8 fix(directoryservice): tzinfo without _ (#1971) 2023-02-24 10:03:34 +01:00
dependabot[bot]
e494afb1aa build(deps): bump botocore from 1.29.74 to 1.29.78 (#1968)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-24 09:43:14 +01:00
dependabot[bot]
ecc3c1cf3b build(deps): bump azure-storage-blob from 12.14.1 to 12.15.0 (#1966)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-24 08:42:44 +01:00
dependabot[bot]
228b16416a build(deps): bump colorama from 0.4.5 to 0.4.6 (#1967)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-24 07:56:47 +01:00
Nacho Rivera
17eb74842a fix(cloudfront): handle empty objects in checks (#1962) 2023-02-23 16:57:44 +01:00
Nacho Rivera
c01ff74c73 fix(kms): handle if describe_keys returns no value 2023-02-23 15:54:23 +01:00
Sergio Garcia
f88613b26d fix(toml): add toml dependency to pypi release action (#1960) 2023-02-23 15:24:46 +01:00
Sergio Garcia
3464f4241f chore(release): 3.2.2 (#1959)
Co-authored-by: github-actions <noreply@github.com>
2023-02-23 15:10:03 +01:00
Sergio Garcia
849b703828 chore(resource-based scan): execute only applicable checks (#1934) 2023-02-23 13:30:21 +01:00
Sergio Garcia
4b935a40b6 fix(metadata): remove us-east-1 in remediation (#1958) 2023-02-23 13:19:10 +01:00
Sergio Garcia
5873a23ccb fix(key errors): solver EMR and IAM errrors (#1957) 2023-02-23 13:15:00 +01:00
Nacho Rivera
eae2786825 fix(cloudtrail): Handle when the CloudTrail bucket is in another account (#1956) 2023-02-23 13:04:32 +01:00
github-actions[bot]
6407386de5 chore(regions_update): Changes in regions for AWS services. (#1952)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-02-23 12:24:36 +01:00
Sergio Garcia
3fe950723f fix(actions): add README to docker action and filter steps for releases (#1955) 2023-02-23 12:22:41 +01:00
Sergio Garcia
52bf6acd46 chore(regions): add secret token to avoid stuck checks (#1954) 2023-02-23 12:11:54 +01:00
Sergio Garcia
9590e7d7e0 chore(poetry): make python-poetry as packaging and dependency manager (#1935)
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2023-02-23 11:50:29 +01:00
github-actions[bot]
7a08140a2d chore(regions_update): Changes in regions for AWS services. (#1950)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-02-23 08:42:36 +01:00
dependabot[bot]
d1491cfbd1 build(deps): bump boto3 from 1.26.74 to 1.26.76 (#1948)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-22 08:01:13 +01:00
dependabot[bot]
695b80549d build(deps): bump botocore from 1.29.75 to 1.29.76 (#1946)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-22 07:50:39 +01:00
Sergio Garcia
11c60a637f release: 3.2.1 (#1945) 2023-02-21 17:22:02 +01:00
Sergio Garcia
844ad70bb9 fix(cloudwatch): allow " in regex patterns (#1943) 2023-02-21 16:46:23 +01:00
Sergio Garcia
5ac7cde577 chore(iam_disable_N_days_credentials): improve checks logic (#1923) 2023-02-21 15:20:33 +01:00
Sergio Garcia
ce3ef0550f chore(Security Hub): add status extended to Security Hub (#1921) 2023-02-21 15:11:43 +01:00
Sergio Garcia
813f3e7d42 fix(errors): handle errors when S3 buckets or EC2 instances are deleted (#1942) 2023-02-21 12:31:23 +01:00
Sergio Garcia
d03f97af6b fix(regions): add unique branch name (#1941) 2023-02-21 11:53:36 +01:00
github-actions[bot]
019ab0286d chore(regions_update): Changes in regions for AWS services. (#1940)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-02-21 11:47:03 +01:00
Fennerr
c6647b4706 chore(secrets): Improve the status_extended with more information (#1937)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2023-02-21 11:37:20 +01:00
Sergio Garcia
f913536d88 fix(services): solve errors in EMR, RDS, S3 and VPC services (#1913) 2023-02-21 11:11:39 +01:00
dependabot[bot]
640d1bd176 build(deps-dev): bump moto from 4.1.2 to 4.1.3 (#1939)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-21 07:48:08 +01:00
dependabot[bot]
66baccf528 build(deps): bump botocore from 1.29.74 to 1.29.75 (#1938)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-21 07:32:44 +01:00
Sergio Garcia
6e6dacbace chore(security hub): add --skip-sh-update (#1911) 2023-02-20 09:58:00 +01:00
dependabot[bot]
cdbb10fb26 build(deps): bump boto3 from 1.26.72 to 1.26.74 (#1933)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-20 07:56:40 +01:00
dependabot[bot]
c34ba3918c build(deps): bump botocore from 1.29.73 to 1.29.74 (#1932)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-20 07:34:20 +01:00
Fennerr
fa228c876c fix(iam_rotate_access_key_90_days): check only active access keys (#1929)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2023-02-17 12:53:28 +01:00
dependabot[bot]
2f4d0af7d7 build(deps): bump botocore from 1.29.72 to 1.29.73 (#1926)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-17 12:14:23 +01:00
github-actions[bot]
2d3e5235a9 chore(regions_update): Changes in regions for AWS services. (#1927)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-02-17 11:13:13 +01:00
dependabot[bot]
8e91ccaa54 build(deps): bump boto3 from 1.26.71 to 1.26.72 (#1925)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-17 10:56:19 +01:00
Fennerr
6955658b36 fix(quick_inventory): handle ApiGateway resources (#1924)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2023-02-16 18:29:23 +01:00
Fennerr
dbb44401fd fix(ecs_task_definitions_no_environment_secrets): dump_env_vars is reintialised (#1922) 2023-02-16 15:59:53 +01:00
dependabot[bot]
b42ed70c84 build(deps): bump botocore from 1.29.71 to 1.29.72 (#1919)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-16 14:21:46 +01:00
dependabot[bot]
a28276d823 build(deps): bump pydantic from 1.10.4 to 1.10.5 (#1918)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-16 13:51:37 +01:00
Pepe Fagoaga
fa4b27dd0e fix(compliance): Set Version as optional and fix list (#1899)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2023-02-16 12:47:39 +01:00
dependabot[bot]
0be44d5c49 build(deps): bump boto3 from 1.26.70 to 1.26.71 (#1920)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-16 12:38:10 +01:00
github-actions[bot]
2514596276 chore(regions_update): Changes in regions for AWS services. (#1910)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-02-16 11:56:10 +01:00
dependabot[bot]
7008d2a953 build(deps): bump botocore from 1.29.70 to 1.29.71 (#1909)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-15 07:39:16 +01:00
dependabot[bot]
2539fedfc4 build(deps): bump boto3 from 1.26.69 to 1.26.70 (#1908)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-15 07:12:18 +01:00
Ignacio Dominguez
b453df7591 fix(iam-credentials-expiration): IAM password policy expires passwords fix (#1903)
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2023-02-14 13:54:58 +01:00
Pepe Fagoaga
9e5d5edcba fix(codebuild): Handle endTime in builds (#1900) 2023-02-14 11:27:53 +01:00
Nacho Rivera
2d5de6ff99 fix(cross account): cloudtrail s3 bucket logging (#1902) 2023-02-14 11:23:31 +01:00
github-actions[bot]
259e9f1c17 chore(regions_update): Changes in regions for AWS services. (#1901)
Co-authored-by: sergargar <sergargar@users.noreply.github.com>
2023-02-14 10:28:04 +01:00
dependabot[bot]
daeb53009e build(deps): bump botocore from 1.29.69 to 1.29.70 (#1898)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-14 08:27:14 +01:00
dependabot[bot]
f12d271ca5 build(deps): bump boto3 from 1.26.51 to 1.26.69 (#1897)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-14 07:55:26 +01:00
dependabot[bot]
965185ca3b build(deps-dev): bump pylint from 2.16.1 to 2.16.2 (#1896) 2023-02-14 07:35:29 +01:00
695 changed files with 11196 additions and 5471 deletions

2
.github/CODEOWNERS vendored

@@ -1 +1 @@
* @prowler-cloud/prowler-team
* @prowler-cloud/prowler-oss

.github/ISSUE_TEMPLATE/bug_report.md vendored

@@ -1,52 +0,0 @@
---
name: Bug report
about: Create a report to help us improve
title: "[Bug]: "
labels: bug, status/needs-triage
assignees: ''
---
<!--
Please use this template to create your bug report. By providing as much info as possible you help us understand the issue, reproduce it and resolve it for you quicker. Therefore, take a couple of extra minutes to make sure you have provided all info needed.
PROTIP: record your screen and attach it as a gif to showcase the issue.
- How to record and attach gif: https://bit.ly/2Mi8T6K
-->
**What happened?**
A clear and concise description of what the bug is or what is not working as expected
**How to reproduce it**
Steps to reproduce the behavior:
1. What command are you running?
2. Cloud provider you are launching
3. Environment you have like single account, multi-account, organizations, multi or single subsctiption, etc.
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots or Logs**
If applicable, add screenshots to help explain your problem.
Also, you can add logs (anonymize them first!). Here a command that may help to share a log
`prowler <your arguments> --log-level DEBUG --log-file $(date +%F)_debug.log` then attach here the log file.
**From where are you running Prowler?**
Please, complete the following information:
- Resource: (e.g. EC2 instance, Fargate task, Docker container manually, EKS, Cloud9, CodeBuild, workstation, etc.)
- OS: [e.g. Amazon Linux 2, Mac, Alpine, Windows, etc. ]
- Prowler Version [`prowler --version`]:
- Python version [`python --version`]:
- Pip version [`pip --version`]:
- Installation method (Are you running it from pip package or cloning the github repo?):
- Others:
**Additional context**
Add any other context about the problem here.

97
.github/ISSUE_TEMPLATE/bug_report.yml vendored Normal file

@@ -0,0 +1,97 @@
name: 🐞 Bug Report
description: Create a report to help us improve
title: "[Bug]: "
labels: ["bug", "status/needs-triage"]
body:
- type: textarea
id: reproduce
attributes:
label: Steps to Reproduce
description: Steps to reproduce the behavior
placeholder: |-
1. What command are you running?
2. Cloud provider you are launching
3. Environment you have, like single account, multi-account, organizations, multi or single subscription, etc.
4. See error
validations:
required: true
- type: textarea
id: expected
attributes:
label: Expected behavior
description: A clear and concise description of what you expected to happen.
validations:
required: true
- type: textarea
id: actual
attributes:
label: Actual Result with Screenshots or Logs
description: If applicable, add screenshots to help explain your problem. Also, you can add logs (anonymize them first!). Here a command that may help to share a log `prowler <your arguments> --log-level DEBUG --log-file $(date +%F)_debug.log` then attach here the log file.
validations:
required: true
- type: dropdown
id: type
attributes:
label: How did you install Prowler?
options:
- Cloning the repository from github.com (git clone)
- From pip package (pip install prowler)
- From brew (brew install prowler)
- Docker (docker pull toniblyx/prowler)
validations:
required: true
- type: textarea
id: environment
attributes:
label: Environment Resource
description: From where are you running Prowler?
placeholder: |-
1. EC2 instance
2. Fargate task
3. Docker container locally
4. EKS
5. Cloud9
6. CodeBuild
7. Workstation
8. Other(please specify)
validations:
required: true
- type: textarea
id: os
attributes:
label: OS used
description: Which OS are you using?
placeholder: |-
1. Amazon Linux 2
2. MacOS
3. Alpine Linux
4. Windows
5. Other(please specify)
validations:
required: true
- type: input
id: prowler-version
attributes:
label: Prowler version
description: Which Prowler version are you using?
placeholder: |-
prowler --version
validations:
required: true
- type: input
id: pip-version
attributes:
label: Pip version
description: Which pip version are you using?
placeholder: |-
pip --version
validations:
required: true
- type: textarea
id: additional
attributes:
description: Additional context
label: Context
validations:
required: false

.github/ISSUE_TEMPLATE/feature_request.yml vendored Normal file

@@ -0,0 +1,36 @@
name: 💡 Feature Request
description: Suggest an idea for this project
labels: ["enhancement", "status/needs-triage"]
body:
- type: textarea
id: Problem
attributes:
label: New feature motivation
description: Is your feature request related to a problem? Please describe
placeholder: |-
1. A clear and concise description of what the problem is. Ex. I'm always frustrated when
validations:
required: true
- type: textarea
id: Solution
attributes:
label: Solution Proposed
description: A clear and concise description of what you want to happen.
validations:
required: true
- type: textarea
id: Alternatives
attributes:
label: Describe alternatives you've considered
description: A clear and concise description of any alternative solutions or features you've considered.
validations:
required: true
- type: textarea
id: Context
attributes:
label: Additional context
description: Add any other context or screenshots about the feature request here.
validations:
required: false

.github/ISSUE_TEMPLATE/feature_request.md vendored

@@ -1,20 +0,0 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: enhancement, status/needs-triage
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

.github/dependabot.yml vendored

@@ -8,7 +8,7 @@ updates:
- package-ecosystem: "pip" # See documentation for possible values
directory: "/" # Location of package manifests
schedule:
interval: "daily"
interval: "weekly"
target-branch: master
labels:
- "dependencies"

[GitHub Actions workflow: container build, push and dispatch]

@@ -15,108 +15,48 @@ on:
env:
AWS_REGION_STG: eu-west-1
AWS_REGION_PLATFORM: eu-west-1
AWS_REGION_PRO: us-east-1
AWS_REGION: us-east-1
IMAGE_NAME: prowler
LATEST_TAG: latest
STABLE_TAG: stable
TEMPORARY_TAG: temporary
DOCKERFILE_PATH: ./Dockerfile
PYTHON_VERSION: 3.9
jobs:
# Lint Dockerfile using Hadolint
# dockerfile-linter:
# runs-on: ubuntu-latest
# steps:
# -
# name: Checkout
# uses: actions/checkout@v3
# -
# name: Install Hadolint
# run: |
# VERSION=$(curl --silent "https://api.github.com/repos/hadolint/hadolint/releases/latest" | \
# grep '"tag_name":' | \
# sed -E 's/.*"v([^"]+)".*/\1/' \
# ) && curl -L -o /tmp/hadolint https://github.com/hadolint/hadolint/releases/download/v${VERSION}/hadolint-Linux-x86_64 \
# && chmod +x /tmp/hadolint
# -
# name: Run Hadolint
# run: |
# /tmp/hadolint util/Dockerfile
# Build Prowler OSS container
container-build:
container-build-push:
# needs: dockerfile-linter
runs-on: ubuntu-latest
env:
POETRY_VIRTUALENVS_CREATE: "false"
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Build
uses: docker/build-push-action@v2
with:
# Without pushing to registries
push: false
tags: ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }}
file: ${{ env.DOCKERFILE_PATH }}
outputs: type=docker,dest=/tmp/${{ env.IMAGE_NAME }}.tar
- name: Share image between jobs
uses: actions/upload-artifact@v2
with:
name: ${{ env.IMAGE_NAME }}.tar
path: /tmp/${{ env.IMAGE_NAME }}.tar
# Lint Prowler OSS container using Dockle
# container-linter:
# needs: container-build
# runs-on: ubuntu-latest
# steps:
# -
# name: Get container image from shared
# uses: actions/download-artifact@v2
# with:
# name: ${{ env.IMAGE_NAME }}.tar
# path: /tmp
# -
# name: Load Docker image
# run: |
# docker load --input /tmp/${{ env.IMAGE_NAME }}.tar
# docker image ls -a
# -
# name: Install Dockle
# run: |
# VERSION=$(curl --silent "https://api.github.com/repos/goodwithtech/dockle/releases/latest" | \
# grep '"tag_name":' | \
# sed -E 's/.*"v([^"]+)".*/\1/' \
# ) && curl -L -o dockle.deb https://github.com/goodwithtech/dockle/releases/download/v${VERSION}/dockle_${VERSION}_Linux-64bit.deb \
# && sudo dpkg -i dockle.deb && rm dockle.deb
# -
# name: Run Dockle
# run: dockle ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }}
# Push Prowler OSS container to registries
container-push:
# needs: container-linter
needs: container-build
runs-on: ubuntu-latest
permissions:
id-token: write
contents: read # This is required for actions/checkout
steps:
- name: Get container image from shared
uses: actions/download-artifact@v2
- name: Setup python (release)
if: github.event_name == 'release'
uses: actions/setup-python@v2
with:
name: ${{ env.IMAGE_NAME }}.tar
path: /tmp
- name: Load Docker image
python-version: ${{ env.PYTHON_VERSION }}
- name: Install dependencies (release)
if: github.event_name == 'release'
run: |
docker load --input /tmp/${{ env.IMAGE_NAME }}.tar
docker image ls -a
pipx install poetry
pipx inject poetry poetry-bumpversion
- name: Update Prowler version (release)
if: github.event_name == 'release'
run: |
poetry version ${{ github.event.release.tag_name }}
- name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to Public ECR
uses: docker/login-action@v2
with:
@@ -124,55 +64,53 @@ jobs:
username: ${{ secrets.PUBLIC_ECR_AWS_ACCESS_KEY_ID }}
password: ${{ secrets.PUBLIC_ECR_AWS_SECRET_ACCESS_KEY }}
env:
AWS_REGION: ${{ env.AWS_REGION_PRO }}
AWS_REGION: ${{ env.AWS_REGION }}
- name: Tag (latest)
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Build container image (latest)
if: github.event_name == 'push'
run: |
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
- # Push to master branch - push "latest" tag
name: Push (latest)
if: github.event_name == 'push'
run: |
docker push ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
docker push ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
- # Tag the new release (stable and release tag)
name: Tag (release)
if: github.event_name == 'release'
run: |
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
- # Push the new release (stable and release tag)
name: Push (release)
if: github.event_name == 'release'
run: |
docker push ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
docker push ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
docker push ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
docker push ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
- name: Delete artifacts
if: always()
uses: geekyeggo/delete-artifact@v1
uses: docker/build-push-action@v2
with:
name: ${{ env.IMAGE_NAME }}.tar
push: true
tags: |
${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
file: ${{ env.DOCKERFILE_PATH }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Build container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@v2
with:
# Use local context to get changes
# https://github.com/docker/build-push-action#path-context
context: .
push: true
tags: |
${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
file: ${{ env.DOCKERFILE_PATH }}
cache-from: type=gha
cache-to: type=gha,mode=max
dispatch-action:
needs: container-push
needs: container-build-push
runs-on: ubuntu-latest
steps:
- name: Get latest commit info
if: github.event_name == 'push'
run: |
LATEST_COMMIT_HASH=$(echo ${{ github.event.after }} | cut -b -7)
echo "LATEST_COMMIT_HASH=${LATEST_COMMIT_HASH}" >> $GITHUB_ENV
- name: Dispatch event for latest
if: github.event_name == 'push'
run: |
curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches -H "Accept: application/vnd.github+json" -H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" -H "X-GitHub-Api-Version: 2022-11-28" --data '{"event_type":"dispatch","client_payload":{"version":"latest"}'
curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches -H "Accept: application/vnd.github+json" -H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" -H "X-GitHub-Api-Version: 2022-11-28" --data '{"event_type":"dispatch","client_payload":{"version":"latest", "tag": "${{ env.LATEST_COMMIT_HASH }}"}}'
- name: Dispatch event for release
if: github.event_name == 'release'
run: |

[GitHub Actions workflow: lint and test]

@@ -17,42 +17,48 @@ jobs:
steps:
- uses: actions/checkout@v3
- name: Install poetry
run: |
python -m pip install --upgrade pip
pipx install poetry
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
cache: 'poetry'
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install pipenv
pipenv install --dev
pipenv run pip list
poetry install
poetry run pip list
VERSION=$(curl --silent "https://api.github.com/repos/hadolint/hadolint/releases/latest" | \
grep '"tag_name":' | \
sed -E 's/.*"v([^"]+)".*/\1/' \
) && curl -L -o /tmp/hadolint "https://github.com/hadolint/hadolint/releases/download/v${VERSION}/hadolint-Linux-x86_64" \
&& chmod +x /tmp/hadolint
- name: Poetry check
run: |
poetry lock --check
- name: Lint with flake8
run: |
pipenv run flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude contrib
poetry run flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude contrib
- name: Checking format with black
run: |
pipenv run black --check .
poetry run black --check .
- name: Lint with pylint
run: |
pipenv run pylint --disable=W,C,R,E -j 0 -rn -sn prowler/
poetry run pylint --disable=W,C,R,E -j 0 -rn -sn prowler/
- name: Bandit
run: |
pipenv run bandit -q -lll -x '*_test.py,./contrib/' -r .
poetry run bandit -q -lll -x '*_test.py,./contrib/' -r .
- name: Safety
run: |
pipenv run safety check
poetry run safety check
- name: Vulture
run: |
pipenv run vulture --exclude "contrib" --min-confidence 100 .
poetry run vulture --exclude "contrib" --min-confidence 100 .
- name: Hadolint
run: |
/tmp/hadolint Dockerfile --ignore=DL3013
- name: Test with pytest
run: |
pipenv run pytest tests -n auto
poetry run pytest tests -n auto

.github/workflows/pypi-release.yml vendored

@@ -5,37 +5,73 @@ on:
types: [published]
env:
GITHUB_BRANCH: ${{ github.event.release.tag_name }}
RELEASE_TAG: ${{ github.event.release.tag_name }}
jobs:
release-prowler-job:
runs-on: ubuntu-latest
env:
POETRY_VIRTUALENVS_CREATE: "false"
name: Release Prowler to PyPI
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v3
with:
ref: ${{ env.GITHUB_BRANCH }}
- name: setup python
uses: actions/setup-python@v2
with:
python-version: 3.9 #install the python needed
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install build toml --upgrade
- name: Build package
run: python -m build
- name: Publish prowler-cloud package to PyPI
uses: pypa/gh-action-pypi-publish@release/v1
pipx install poetry
pipx inject poetry poetry-bumpversion
- name: setup python
uses: actions/setup-python@v4
with:
password: ${{ secrets.PYPI_API_TOKEN }}
python-version: 3.9
cache: 'poetry'
- name: Change version and Build package
run: |
poetry version ${{ env.RELEASE_TAG }}
git config user.name "github-actions"
git config user.email "<noreply@github.com>"
git add prowler/config/config.py pyproject.toml
git commit -m "chore(release): ${{ env.RELEASE_TAG }}" --no-verify
git tag -fa ${{ env.RELEASE_TAG }} -m "chore(release): ${{ env.RELEASE_TAG }}"
git push -f origin ${{ env.RELEASE_TAG }}
poetry build
- name: Publish prowler package to PyPI
run: |
poetry config pypi-token.pypi ${{ secrets.PYPI_API_TOKEN }}
poetry publish
# Create pull request with new version
- name: Create Pull Request
uses: peter-evans/create-pull-request@v4
with:
token: ${{ secrets.PROWLER_ACCESS_TOKEN }}
commit-message: "chore(release): update Prowler Version to ${{ env.RELEASE_TAG }}."
branch: release-${{ env.RELEASE_TAG }}
labels: "status/waiting-for-revision, severity/low"
title: "chore(release): update Prowler Version to ${{ env.RELEASE_TAG }}"
body: |
### Description
This PR updates Prowler Version to ${{ env.RELEASE_TAG }}.
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
- name: Replicate PyPi Package
run: |
rm -rf ./dist && rm -rf ./build && rm -rf prowler_cloud.egg-info
rm -rf ./dist && rm -rf ./build && rm -rf prowler.egg-info
pip install toml
python util/replicate_pypi_package.py
python -m build
- name: Publish prowler package to PyPI
uses: pypa/gh-action-pypi-publish@release/v1
poetry build
- name: Publish prowler-cloud package to PyPI
run: |
poetry config pypi-token.pypi ${{ secrets.PYPI_API_TOKEN }}
poetry publish
# Create pull request to github.com/Homebrew/homebrew-core to update prowler formula
- name: Bump Homebrew formula
uses: mislav/bump-homebrew-formula-action@v2
with:
password: ${{ secrets.PYPI_API_TOKEN }}
formula-name: prowler
base-branch: release-${{ env.RELEASE_TAG }}
env:
COMMITTER_TOKEN: ${{ secrets.PROWLER_ACCESS_TOKEN }}
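
The rewritten release job drives versioning, build and publish through poetry instead of setuptools. The bump-and-build steps can be dry-run locally; a minimal sketch, assuming poetry is available and 3.3.2 stands in for the release tag:

pipx install poetry
pipx inject poetry poetry-bumpversion  # plugin so `poetry version` also rewrites files configured in pyproject.toml
poetry version 3.3.2                   # updates pyproject.toml (and prowler/config/config.py via the plugin)
poetry build                           # writes the sdist and wheel into dist/
poetry publish --dry-run               # exercise packaging without uploading to PyPI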

[GitHub Actions workflow: AWS services regions update]

@@ -52,9 +52,9 @@ jobs:
- name: Create Pull Request
uses: peter-evans/create-pull-request@v4
with:
token: ${{ secrets.GITHUB_TOKEN }}
token: ${{ secrets.PROWLER_ACCESS_TOKEN }}
commit-message: "feat(regions_update): Update regions for AWS services."
branch: "aws-services-regions-updated"
branch: "aws-services-regions-updated-${{ github.sha }}"
labels: "status/waiting-for-revision, severity/low"
title: "chore(regions_update): Changes in regions for AWS services."
body: |

.pre-commit-config.yaml

@@ -13,6 +13,14 @@ repos:
- id: pretty-format-json
args: ["--autofix", --no-sort-keys, --no-ensure-ascii]
## TOML
- repo: https://github.com/macisamuele/language-formatters-pre-commit-hooks
rev: v2.7.0
hooks:
- id: pretty-format-toml
args: [--autofix]
files: pyproject.toml
## BASH
- repo: https://github.com/koalaman/shellcheck-precommit
rev: v0.9.0
@@ -48,10 +56,11 @@ repos:
exclude: contrib
args: ["--ignore=E266,W503,E203,E501,W605"]
- repo: https://github.com/haizaar/check-pipfile-lock
rev: v0.0.5
- repo: https://github.com/python-poetry/poetry
rev: 1.4.0 # add version here
hooks:
- id: check-pipfile-lock
- id: poetry-check
- id: poetry-lock
- repo: https://github.com/hadolint/hadolint
rev: v2.12.1-beta
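
The new poetry hooks can be exercised locally before pushing; a minimal sketch, assuming the commands run from the repository root:

pip install pre-commit
pre-commit run --all-files   # runs every configured hook, including poetry-check and poetry-lock
pre-commit run poetry-check  # or run a single hook by its id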

23
.readthedocs.yaml Normal file

@@ -0,0 +1,23 @@
# .readthedocs.yaml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
build:
os: "ubuntu-22.04"
tools:
python: "3.9"
jobs:
post_create_environment:
# Install poetry
# https://python-poetry.org/docs/#installing-manually
- pip install poetry
# Tell poetry to not use a virtual environment
- poetry config virtualenvs.create false
post_install:
- poetry install -E docs
mkdocs:
configuration: mkdocs.yml
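
The Read the Docs build can be mirrored locally to preview the site; a minimal sketch, assuming Python 3.9 and a checkout containing mkdocs.yml:

pip install poetry
poetry config virtualenvs.create false  # match the RTD build, which installs into the ambient environment
poetry install -E docs                  # installs mkdocs and the docs extras
mkdocs serve                            # serves the docs at http://127.0.0.1:8000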

Dockerfile

@@ -16,6 +16,7 @@ USER prowler
WORKDIR /home/prowler
COPY prowler/ /home/prowler/prowler/
COPY pyproject.toml /home/prowler
COPY README.md /home/prowler
# Install dependencies
ENV HOME='/home/prowler'
@@ -26,7 +27,7 @@ RUN pip install --no-cache-dir --upgrade pip && \
# Remove Prowler directory and build files
USER 0
RUN rm -rf /home/prowler/prowler /home/prowler/pyproject.toml /home/prowler/build /home/prowler/prowler_cloud.egg-info
RUN rm -rf /home/prowler/prowler /home/prowler/pyproject.toml /home/prowler/README.md /home/prowler/build /home/prowler/prowler.egg-info
USER prowler
ENTRYPOINT ["prowler"]

Makefile

@@ -24,11 +24,11 @@ lint: ## Lint Code
##@ PyPI
pypi-clean: ## Delete the distribution files
rm -rf ./dist && rm -rf ./build && rm -rf prowler_cloud.egg-info
rm -rf ./dist && rm -rf ./build && rm -rf prowler.egg-info
pypi-build: ## Build package
$(MAKE) pypi-clean && \
python3 -m build
poetry build
pypi-upload: ## Upload package
python3 -m twine upload --repository pypi dist/*

42
Pipfile

@@ -1,42 +0,0 @@
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
colorama = "0.4.4"
boto3 = "1.26.3"
arnparse = "0.0.2"
botocore = "1.29.69"
pydantic = "1.9.1"
schema = "0.7.5"
shodan = "1.28.0"
detect-secrets = "1.4.0"
alive-progress = "2.4.1"
tabulate = "0.9.0"
azure-identity = "1.12.0"
azure-storage-blob = "12.14.1"
msgraph-core = "0.2.2"
azure-mgmt-subscription = "3.1.1"
azure-mgmt-authorization = "3.0.0"
azure-mgmt-security = "3.0.0"
azure-mgmt-storage = "21.0.0"
[dev-packages]
black = "22.10.0"
pylint = "2.16.1"
flake8 = "5.0.4"
bandit = "1.7.4"
safety = "2.3.1"
vulture = "2.7"
moto = "4.1.2"
docker = "6.0.0"
openapi-spec-validator = "0.5.5"
pytest = "7.2.1"
pytest-xdist = "3.2.0"
coverage = "7.1.0"
sure = "2.0.1"
freezegun = "1.2.1"
[requires]
python_version = "3.9"

1703
Pipfile.lock generated

File diff suppressed because it is too large.

README.md

@@ -13,12 +13,13 @@
<a href="https://join.slack.com/t/prowler-workspace/shared_invite/zt-1hix76xsl-2uq222JIXrC7Q8It~9ZNog"><img alt="Slack Shield" src="https://img.shields.io/badge/slack-prowler-brightgreen.svg?logo=slack"></a>
<a href="https://pypi.org/project/prowler-cloud/"><img alt="Python Version" src="https://img.shields.io/pypi/v/prowler.svg"></a>
<a href="https://pypi.python.org/pypi/prowler-cloud/"><img alt="Python Version" src="https://img.shields.io/pypi/pyversions/prowler.svg"></a>
<a href="https://pypistats.org/packages/prowler"><img alt="PyPI Prowler Downloads" src="https://img.shields.io/pypi/dw/prowler.svg"></a>
<a href="https://pypistats.org/packages/prowler-cloud"><img alt="PyPI Prowler-Cloud Downloads" src="https://img.shields.io/pypi/dw/prowler-cloud.svg"></a>
<a href="https://pypistats.org/packages/prowler"><img alt="PyPI Prowler Downloads" src="https://img.shields.io/pypi/dw/prowler.svg?label=prowler%20downloads"></a>
<a href="https://pypistats.org/packages/prowler-cloud"><img alt="PyPI Prowler-Cloud Downloads" src="https://img.shields.io/pypi/dw/prowler-cloud.svg?label=prowler-cloud%20downloads"></a>
<a href="https://formulae.brew.sh/formula/prowler#default"><img alt="Brew Prowler Downloads" src="https://img.shields.io/homebrew/installs/dm/prowler?label=brew%20downloads"></a>
<a href="https://hub.docker.com/r/toniblyx/prowler"><img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/toniblyx/prowler"></a>
<a href="https://hub.docker.com/r/toniblyx/prowler"><img alt="Docker" src="https://img.shields.io/docker/cloud/build/toniblyx/prowler"></a>
<a href="https://hub.docker.com/r/toniblyx/prowler"><img alt="Docker" src="https://img.shields.io/docker/image-size/toniblyx/prowler"></a>
<a href="https://gallery.ecr.aws/o4g1s5r6/prowler"><img width="120" height=19" alt="AWS ECR Gallery" src="https://user-images.githubusercontent.com/3985464/151531396-b6535a68-c907-44eb-95a1-a09508178616.png"></a>
<a href="https://gallery.ecr.aws/prowler-cloud/prowler"><img width="120" height=19" alt="AWS ECR Gallery" src="https://user-images.githubusercontent.com/3985464/151531396-b6535a68-c907-44eb-95a1-a09508178616.png"></a>
</p>
<p align="center">
<a href="https://github.com/prowler-cloud/prowler"><img alt="Repo size" src="https://img.shields.io/github/repo-size/prowler-cloud/prowler"></a>
@@ -37,8 +38,13 @@
It contains hundreds of controls covering CIS, PCI-DSS, ISO27001, GDPR, HIPAA, FFIEC, SOC2, AWS FTR, ENS and custom security frameworks.
# 📖 Documentation
The full documentation can now be found at [https://docs.prowler.cloud](https://docs.prowler.cloud)
## Looking for Prowler v2 documentation?
For Prowler v2 Documentation, please go to https://github.com/prowler-cloud/prowler/tree/2.12.1.
# ⚙️ Install
## Pip package
@@ -48,6 +54,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo
pip install prowler
prowler -v
```
More details at https://docs.prowler.cloud
## Containers
@@ -56,29 +63,24 @@ The available versions of Prowler are the following:
- `latest`: in sync with master branch (bear in mind that it is not a stable version)
- `<x.y.z>` (release): you can find the releases [here](https://github.com/prowler-cloud/prowler/releases); those are stable releases.
- `stable`: this tag always points to the latest release.
The container images are available here:
- [DockerHub](https://hub.docker.com/r/toniblyx/prowler/tags)
- [AWS Public ECR](https://gallery.ecr.aws/o4g1s5r6/prowler)
- [AWS Public ECR](https://gallery.ecr.aws/prowler-cloud/prowler)
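For example, to run the latest stable release from Docker Hub (a sketch; supply your cloud credentials via environment variables or a mounted configuration, depending on your provider):
```
docker run -ti --rm toniblyx/prowler:stable aws
```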
## From Github
Python >= 3.9 is required with pip and pipenv:
Python >= 3.9 is required with pip and poetry:
```
git clone https://github.com/prowler-cloud/prowler
cd prowler
pipenv shell
pipenv install
poetry shell
poetry install
python prowler.py -v
```
# 📖 Documentation
The full documentation can now be found at [https://docs.prowler.cloud](https://docs.prowler.cloud)
# 📐✏️ High level architecture
You can run Prowler from your workstation, an EC2 instance, Fargate or any other container, Codebuild, CloudShell and Cloud9.

BIN
docs/img/output-html.png Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 631 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 320 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 220 KiB

View File

@@ -5,7 +5,7 @@
# Prowler Documentation
**Welcome to [Prowler Open Source v3](https://github.com/prowler-cloud/prowler/) Documentation!** 📄
**Welcome to [Prowler Open Source v3](https://github.com/prowler-cloud/prowler/) Documentation!** 📄
For **Prowler v2 Documentation**, please go [here](https://github.com/prowler-cloud/prowler/tree/2.12.0) to the 2.12.0 branch and its README.md.
@@ -87,6 +87,23 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo
prowler -v
```
=== "GitHub"
_Requirements for Developers_:
* AWS and/or Azure credentials
* `git`, `Python >= 3.9`, `pip` and `poetry` installed (`pip install poetry`)
_Commands_:
```
git clone https://github.com/prowler-cloud/prowler
cd prowler
poetry shell
poetry install
python prowler.py -v
```
=== "Amazon Linux 2"
_Requirements_:
@@ -103,6 +120,20 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo
prowler -v
```
=== "Brew"
_Requirements_:
* `Brew` installed in your Mac or Linux
* AWS and/or Azure credentials
_Commands_:
``` bash
brew install prowler
prowler -v
```
=== "AWS CloudShell"
Prowler can be easily executed in AWS CloudShell, but it has some prerequisites to be able to do so. AWS CloudShell is a container running `Amazon Linux release 2 (Karoo)` that comes with Python 3.7; since Prowler requires Python >= 3.9, we first need to install a newer version of Python. Follow the steps below to successfully execute Prowler v3 in AWS CloudShell:
@@ -118,7 +149,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo
./configure --enable-optimizations
sudo make altinstall
python3.9 --version
cd
cd
```
_Commands_:
@@ -154,7 +185,7 @@ The available versions of Prowler are the following:
The container images are available here:
- [DockerHub](https://hub.docker.com/r/toniblyx/prowler/tags)
- [AWS Public ECR](https://gallery.ecr.aws/o4g1s5r6/prowler)
- [AWS Public ECR](https://gallery.ecr.aws/prowler-cloud/prowler)
## High level architecture

View File

@@ -1,6 +1,6 @@
# Scan Multiple AWS Accounts
Prowler can scan multiple accounts when it is ejecuted from one account that can assume a role in those given accounts to scan using [Assume Role feature](role-assumption.md) and [AWS Organizations integration feature](organizations.md).
Prowler can scan multiple accounts when it is executed from one account that can assume a role in those given accounts to scan using [Assume Role feature](role-assumption.md) and [AWS Organizations integration feature](organizations.md).
## Scan multiple specific accounts sequentially

View File

@@ -18,13 +18,13 @@ Before sending findings to Prowler, you will need to perform next steps:
Once it is enabled, it is as simple as running the command below (for all regions):
```sh
./prowler aws -S
prowler aws -S
```
or for only one filtered region like eu-west-1:
```sh
./prowler -S -f eu-west-1
prowler -S -f eu-west-1
```
> **Note 1**: It is recommended to send only fails to Security Hub; that is possible by adding `-q` to the command.
@@ -36,3 +36,12 @@ or for only one filtered region like eu-west-1:
Once you send findings for the first time, you will be able to see the Prowler findings in the Findings section:
![Screenshot 2020-10-29 at 10 29 05 PM](https://user-images.githubusercontent.com/3985464/97634676-66c9f600-1a36-11eb-9341-70feb06f6331.png)
## Skip sending updates of findings to Security Hub
By default, Prowler archives all its findings in Security Hub that have not appeared in the last scan.
You can skip this logic by using the option `--skip-sh-update` so Prowler will not archive older findings:
```sh
prowler -S --skip-sh-update
```

View File

@@ -113,7 +113,6 @@ checks_v3_to_v2_mapping = {
"ec2_securitygroup_allow_wide_open_public_ipv4": "extra778",
"ec2_securitygroup_default_restrict_traffic": "check43",
"ec2_securitygroup_from_launch_wizard": "extra7173",
"ec2_securitygroup_in_use_without_ingress_filtering": "extra74",
"ec2_securitygroup_not_used": "extra75",
"ec2_securitygroup_with_many_ingress_egress_rules": "extra777",
"ecr_repositories_lifecycle_policy_enabled": "extra7194",

View File

@@ -81,36 +81,4 @@ Standard results will be shown and additionally the framework information as the
## Create and contribute adding other Security Frameworks
If you want to create or contribute with your own security frameworks or add public ones to Prowler you need to make sure the checks are available if not you have to create your own. Then create a compliance file per provider like in `prowler/compliance/aws/` and name it as `<framework>_<version>_<provider>.json` then follow the following format to create yours.
Each file version of a framework will have the following structure at high level with the case that each framework needs to be generally identified), one requirement can be also called one control but one requirement can be linked to multiple prowler checks.:
- `Framework`: string. Indistiguish name of the framework, like CIS
- `Provider`: string. Provider where the framework applies, such as AWS, Azure, OCI,...
- `Version`: string. Version of the framework itself, like 1.4 for CIS.
- `Requirements`: array of objects. Include all requirements or controls with the mapping to Prowler.
- `Requirements_Id`: string. Unique identifier per each requirement in the specific framework
- `Requirements_Description`: string. Description as in the framework.
- `Requirements_Attributes`: array of objects. Includes all needed attributes per each requirement, like levels, sections, etc. Whatever helps to create a dedicated report with the result of the findings. Attributes would be taken as closely as possible from the framework's own terminology directly.
- `Requirements_Checks`: array. Prowler checks that are needed to prove this requirement. It can be one or multiple checks. In case of no automation possible this can be empty.
```
{
"Framework": "<framework>-<provider>",
"Version": "<version>",
"Requirements": [
{
"Id": "<unique-id>",
"Description": "Requiemente full description",
"Checks": [
"Here is the prowler check or checks that is going to be executed"
],
"Attributes": [
{
<Add here your custom attributes.>
}
]
}
```
Finally, to have a proper output file for your reports, your framework data model has to be created in `prowler/lib/outputs/models.py` and also the CLI table output in `prowler/lib/outputs/compliance.py`.
This information is part of the Developer Guide and can be found here: https://docs.prowler.cloud/en/latest/tutorials/developer-guide/.

View File

@@ -0,0 +1,281 @@
# Developer Guide
You can extend Prowler in many different ways. In most cases you will want to create your own checks and compliance security frameworks; this is where you can learn how to get started. We also include how to create custom outputs, integrations and more.
## Get the code and install all dependencies
First of all, you need Python 3.9 or higher and pip installed to be able to install all the required dependencies. Once that is satisfied, go ahead and clone the repo:
```
git clone https://github.com/prowler-cloud/prowler
cd prowler
```
For isolation and to avoid conflicts with other environments, we recommend using `poetry`:
```
pip install poetry
```
Then install all dependencies including the ones for developers:
```
poetry install
poetry shell
```
## Contributing with your code or fixes to Prowler
This repo has git pre-commit hooks managed via the pre-commit tool. Install it however you like, then in the root of this repo run:
```
pre-commit install
```
You should get an output like the following:
```
pre-commit installed at .git/hooks/pre-commit
```
Before we merge any of your pull requests, we run checks against the code. We use the following tools and automation to make sure the code is secure and dependencies are up to date (these should already be installed if you ran `poetry install`); sample invocations follow the list:
- `bandit` for code security review.
- `safety` and `dependabot` for dependencies.
- `hadolint` and `dockle` for our containers security.
- `snyk` in Docker Hub.
- `clair` in Amazon ECR.
- `vulture`, `flake8`, `black` and `pylint` for formatting and best practices.
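Some of these tools can be run locally before you push (illustrative invocations; the exact flags used in CI may differ):
```
bandit -r prowler/
flake8 prowler/
black --check prowler/
```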
You can see all dependencies in the `pyproject.toml` file.
## Create a new check for a Provider
### If the check you want to create belongs to an existing service
To create a new check, you will need to create a folder inside the specific service, i.e. `prowler/providers/<provider>/services/<service>/<check_name>/`, with the name of the check following the pattern: `service_subservice_action`.
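For example, for the `ec2_ebs_volume_encryption` check used below, the layout would be:
```
prowler/providers/aws/services/ec2/ec2_ebs_volume_encryption/
├── __init__.py
├── ec2_ebs_volume_encryption.py
└── ec2_ebs_volume_encryption.metadata.json
```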
Inside that folder, create the following files:
- An empty `__init__.py`: to make Python treat this check folder as a package.
- A `check_name.py` containing the check's logic, for example:
```
# Import the Check_Report of the specific provider
from prowler.lib.check.models import Check, Check_Report_AWS
# Import the client of the specific service
from prowler.providers.aws.services.ec2.ec2_client import ec2_client
# Create the class for the check
class ec2_ebs_volume_encryption(Check):
def execute(self):
findings = []
# Iterate over the service's assets that you want to analyze
for volume in ec2_client.volumes:
# Initialize a Check Report for each item and assign the region, resource_id, resource_arn and resource_tags
report = Check_Report_AWS(self.metadata())
report.region = volume.region
report.resource_id = volume.id
report.resource_arn = volume.arn
report.resource_tags = volume.tags
# Implement the check's logic and create a PASS or a FAIL with a status and a status_extended
if volume.encrypted:
report.status = "PASS"
report.status_extended = f"EBS volume {volume.id} is encrypted."
else:
report.status = "FAIL"
report.status_extended = f"EBS volume {volume.id} is unencrypted."
findings.append(report) # Append a report for each item
return findings
```
- A `check_name.metadata.json` containing the check's metadata, for example:
```
{
"Provider": "aws",
"CheckID": "ec2_ebs_volume_encryption",
"CheckTitle": "Ensure there are no EBS Volumes unencrypted.",
"CheckType": [
"Data Protection"
],
"ServiceName": "ec2",
"SubServiceName": "volume",
"ResourceIdTemplate": "arn:partition:service:region:account-id:resource-id",
"Severity": "medium",
"ResourceType": "AwsEc2Volume",
"Description": "Ensure there are no EBS Volumes unencrypted.",
"Risk": "Data encryption at rest prevents data visibility in the event of its unauthorized access or theft.",
"RelatedUrl": "",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "Encrypt all EBS volumes and Enable Encryption by default You can configure your AWS account to enforce the encryption of the new EBS volumes and snapshot copies that you create. For example; Amazon EBS encrypts the EBS volumes created when you launch an instance and the snapshots that you copy from an unencrypted snapshot.",
"Url": "https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EBSEncryption.html"
}
},
"Categories": [
"encryption"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""
}
```
### If the check you want to create belongs to a service not already supported by Prowler, you will need to create the new service first
To create a new service, you will need to create a folder inside the specific provider, i.e. `prowler/providers/<provider>/services/<service>/`.
Inside that folder, create the following files:
- An empty `__init__.py`: to make Python treat this service folder as a package.
- A `<service>_service.py`, containing all the service's logic and API Calls:
```
# You must import the following libraries
import threading
from typing import Optional
from pydantic import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
from prowler.providers.aws.aws_provider import generate_regional_clients
# Create a class for the Service
################## <Service>
class <Service>:
def __init__(self, audit_info):
self.service = "<service>" # The name of the service boto3 client
self.session = audit_info.audit_session
self.audited_account = audit_info.audited_account
self.audit_resources = audit_info.audit_resources
self.regional_clients = generate_regional_clients(self.service, audit_info)
self.<items> = [] # Create an empty list of the items to be gathered, e.g., instances
self.__threading_call__(self.__describe_<items>__)
self.__describe_<item>__() # Optionally you can create another function to retrieve more data about each item
def __get_session__(self):
return self.session
def __threading_call__(self, call):
threads = []
for regional_client in self.regional_clients.values():
threads.append(threading.Thread(target=call, args=(regional_client,)))
for t in threads:
t.start()
for t in threads:
t.join()
def __describe_<items>__(self, regional_client):
"""Get ALL <Service> <Items>"""
logger.info("<Service> - Describing <Items>...")
try:
describe_<items>_paginator = regional_client.get_paginator("describe_<items>") # Paginator to get every item
for page in describe_<items>_paginator.paginate():
for <item> in page["<Items>"]:
if not self.audit_resources or (
is_resource_filtered(<item>["<item_arn>"], self.audit_resources)
):
self.<items>.append(
<Item>(
arn=<item>["<item_arn>"],
name=<item>["<item_name>"],
tags=<item>.get("Tags", []),
region=regional_client.region,
)
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __describe_<item>__(self):
"""Get Details for a <Service> <Item>"""
logger.info("<Service> - Describing <Item> to get specific details...")
try:
for <item> in self.<items>:
<item>_details = self.regional_clients[<item>.region].describe_<item>(
<Attribute>=<item>.name
)
# For example, check if item is Public
<item>.public = <item>_details.get("Public", False)
except Exception as error:
logger.error(
f"{<item>.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
class <Item>(BaseModel):
"""<Item> holds a <Service> <Item>"""
arn: str
"""<Items>[].Arn"""
name: str
"""<Items>[].Name"""
public: bool
"""<Items>[].Public"""
tags: Optional[list] = []
region: str
```
- A `<service>_client.py`, containing the initialization of the service's class we have just created so the service's checks can use it:
```
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.<service>.<service>_service import <Service>
<service>_client = <Service>(current_audit_info)
```
## Create a new security compliance framework
If you want to create or contribute your own security frameworks, or add public ones to Prowler, you need to make sure the checks are available; if not, you have to create your own. Then create a compliance file per provider, like those in `prowler/compliance/aws/`, name it `<framework>_<version>_<provider>.json` and follow the format below to create yours.
Each file version of a framework will have the following high-level structure, so that each framework can be generally identified; one requirement can also be called one control, but one requirement can be linked to multiple Prowler checks:
- `Framework`: string. Distinctive name of the framework, like CIS
- `Provider`: string. Provider where the framework applies, such as AWS, Azure, OCI,...
- `Version`: string. Version of the framework itself, like 1.4 for CIS.
- `Requirements`: array of objects. Include all requirements or controls with the mapping to Prowler.
- `Requirements_Id`: string. Unique identifier per each requirement in the specific framework
- `Requirements_Description`: string. Description as in the framework.
- `Requirements_Attributes`: array of objects. Includes all needed attributes per each requirement, like levels, sections, etc. Whatever helps to create a dedicated report with the result of the findings. Attributes would be taken as closely as possible from the framework's own terminology directly.
- `Requirements_Checks`: array. Prowler checks that are needed to prove this requirement. It can be one or multiple checks. If no automation is possible, this can be empty.
```
{
"Framework": "<framework>-<provider>",
"Version": "<version>",
"Requirements": [
{
"Id": "<unique-id>",
"Description": "Requiemente full description",
"Checks": [
"Here is the prowler check or checks that is going to be executed"
],
"Attributes": [
{
<Add here your custom attributes.>
}
]
},
...
]
}
```
Finally, to have a proper output file for your reports, your framework data model has to be created in `prowler/lib/outputs/models.py` and also the CLI table output in `prowler/lib/outputs/compliance.py`.
## Create a custom output format
## Create a new integration
## Contribute with documentation
We use `mkdocs` to build this Prowler documentation site so you can easily contribute back with new docs or improvements to them.
1. Install `mkdocs` with your favorite package manager (see the example after this list).
2. Inside the `prowler` repository folder, run `mkdocs serve` and point your browser to `http://localhost:8000`; you will see live changes to your local copy of this documentation site.
3. Make all needed changes to the docs or add new documents. To do so, just edit the existing md files inside `prowler/docs`, and if you are adding a new section or file, please make sure you add it to the `mkdocs.yaml` file in the root folder of the Prowler repo.
4. Once you are done with your changes, please send us a pull request for review and merge. Thank you in advance!
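A minimal local preview loop could look like this (assuming `pip`; check `mkdocs.yaml` and the dev dependencies for any required theme or plugins):
```
pip install mkdocs
mkdocs serve
```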
## Want some swag as appreciation for your contribution?
If you are like us and you love swag, we are happy to thank you for your contribution with some laptop stickers or whatever other swag we may have at that time. Please, tell us more details and your pull request link in our [Slack workspace here](https://join.slack.com/t/prowler-workspace/shared_invite/zt-1hix76xsl-2uq222JIXrC7Q8It~9ZNog). You can also reach out to Toni de la Fuente on Twitter [here](https://twitter.com/ToniBlyx), his DMs are open.

View File

@@ -52,14 +52,15 @@ prowler <provider> -e/--excluded-checks ec2 rds
prowler <provider> -C/--checks-file <checks_list>.json
```
## Severities
Each check of Prowler has a severity, there are options related with it:
## Severities
Each of Prowler's checks has a severity, which can be:
- informational
- low
- medium
- high
- critical
- List the available severities for the provider:
```console
prowler <provider> --list-severities
```
- Execute specific severity(s):
To execute specific severity(s):
```console
prowler <provider> --severity critical high
```

View File

@@ -33,9 +33,8 @@ Several checks analyse resources that are exposed to the Internet, these are:
- ec2_instance_internet_facing_with_instance_profile
- ec2_instance_public_ip
- ec2_networkacl_allow_ingress_any_port
- ec2_securitygroup_allow_ingress_from_internet_to_any_port
- ec2_securitygroup_allow_wide_open_public_ipv4
- ec2_securitygroup_in_use_without_ingress_filtering
- ec2_securitygroup_allow_ingress_from_internet_to_any_port
- ecr_repositories_not_publicly_accessible
- eks_control_plane_endpoint_access_restricted
- eks_endpoints_not_publicly_accessible

View File

@@ -14,4 +14,6 @@ prowler <provider> -i
- Also, by default it creates a CSV and a JSON file with detailed information about the extracted resources.
![Quick Inventory Example](../img/quick-inventory.png)
![Quick Inventory Example](../img/quick-inventory.jpg)
> The inventorying process is done with `resourcegroupstaggingapi` calls (except for the IAM resources, which are retrieved with Boto3 API calls).
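For reference, the same API can be exercised directly with the AWS CLI (illustrative):
```
aws resourcegroupstaggingapi get-resources --region eu-west-1 --query 'ResourceTagMappingList[].ResourceARN'
```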

View File

@@ -46,9 +46,11 @@ Prowler supports natively the following output formats:
Hereunder is the structure of each report format supported by Prowler:
### HTML
![HTML Output](../img/output-html.png)
### CSV
| ASSESSMENT_START_TIME | FINDING_UNIQUE_ID | PROVIDER | PROFILE | ACCOUNT_ID | ACCOUNT_NAME | ACCOUNT_EMAIL | ACCOUNT_ARN | ACCOUNT_ORG | ACCOUNT_TAGS | REGION | CHECK_ID | CHECK_TITLE | CHECK_TYPE | STATUS | STATUS_EXTENDED | SERVICE_NAME | SUBSERVICE_NAME | SEVERITY | RESOURCE_ID | RESOURCE_ARN | RESOURCE_TYPE | RESOURCE_DETAILS | RESOURCE_TAGS | DESCRIPTION | RISK | RELATED_URL | REMEDIATION_RECOMMENDATION_TEXT | REMEDIATION_RECOMMENDATION_URL | REMEDIATION_RECOMMENDATION_CODE_NATIVEIAC | REMEDIATION_RECOMMENDATION_CODE_TERRAFORM | REMEDIATION_RECOMMENDATION_CODE_CLI | REMEDIATION_RECOMMENDATION_CODE_OTHER | CATEGORIES | DEPENDS_ON | RELATED_TO | NOTES |
| ------- | ----------- | ------ | -------- | ------------ | ----------- | ---------- | ---------- | --------------------- | -------------------------- | -------------- | ----------------- | ------------------------ | --------------- | ---------- | ----------------- | --------- | -------------- | ----------------- | ------------------ | --------------------- | -------------------- | ------------------- | ------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- |
| ASSESSMENT_START_TIME | FINDING_UNIQUE_ID | PROVIDER | PROFILE | ACCOUNT_ID | ACCOUNT_NAME | ACCOUNT_EMAIL | ACCOUNT_ARN | ACCOUNT_ORG | ACCOUNT_TAGS | REGION | CHECK_ID | CHECK_TITLE | CHECK_TYPE | STATUS | STATUS_EXTENDED | SERVICE_NAME | SUBSERVICE_NAME | SEVERITY | RESOURCE_ID | RESOURCE_ARN | RESOURCE_TYPE | RESOURCE_DETAILS | RESOURCE_TAGS | DESCRIPTION | COMPLIANCE | RISK | RELATED_URL | REMEDIATION_RECOMMENDATION_TEXT | REMEDIATION_RECOMMENDATION_URL | REMEDIATION_RECOMMENDATION_CODE_NATIVEIAC | REMEDIATION_RECOMMENDATION_CODE_TERRAFORM | REMEDIATION_RECOMMENDATION_CODE_CLI | REMEDIATION_RECOMMENDATION_CODE_OTHER | CATEGORIES | DEPENDS_ON | RELATED_TO | NOTES |
| ------- | ----------- | ------ | -------- | ------------ | ----------- | ---------- | ---------- | --------------------- | -------------------------- | -------------- | ----------------- | ------------------------ | --------------- | ---------- | ----------------- | --------- | -------------- | ----------------- | ------------------ | --------------------- | -------------------- | ------------------- | ------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- |
### JSON
@@ -71,6 +73,10 @@ Hereunder is the structure for each of the supported report formats by Prowler:
"Severity": "low",
"ResourceId": "rds-instance-id",
"ResourceArn": "",
"ResourceTags": {
"test": "test",
"enironment": "dev"
},
"ResourceType": "AwsRdsDbInstance",
"ResourceDetails": "",
"Description": "Ensure RDS instances have minor version upgrade enabled.",
@@ -89,7 +95,15 @@ Hereunder is the structure for each of the supported report formats by Prowler:
}
},
"Categories": [],
"Notes": ""
"Notes": "",
"Compliance": {
"CIS-1.4": [
"1.20"
],
"CIS-1.5": [
"1.20"
]
}
},{
"AssessmentStartTime": "2022-12-01T14:16:57.354413",
"FindingUniqueId": "",
@@ -109,7 +123,7 @@ Hereunder is the structure for each of the supported report formats by Prowler:
"ResourceId": "rds-instance-id",
"ResourceArn": "",
"ResourceType": "AwsRdsDbInstance",
"ResourceDetails": "",
"ResourceTags": {},
"Description": "Ensure RDS instances have minor version upgrade enabled.",
"Risk": "Auto Minor Version Upgrade is a feature that you can enable to have your database automatically upgraded when a new minor database engine version is available. Minor version upgrades often patch security vulnerabilities and fix bugs and therefore should be applied.",
"RelatedUrl": "https://aws.amazon.com/blogs/database/best-practices-for-upgrading-amazon-rds-to-major-and-minor-versions-of-postgresql/",
@@ -126,7 +140,8 @@ Hereunder is the structure for each of the supported report formats by Prowler:
}
},
"Categories": [],
"Notes": ""
"Notes": "",
"Compliance: {}
}]
```
@@ -166,7 +181,30 @@ Hereunder is the structure for each of the supported report formats by Prowler:
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": []
"RelatedRequirements": [
"CISA your-systems-2 booting-up-thing-to-do-first-3",
"CIS-1.5 2.3.2",
"AWS-Foundational-Security-Best-Practices rds",
"RBI-Cyber-Security-Framework annex_i_6",
"FFIEC d3-cc-pm-b-1 d3-cc-pm-b-3"
],
"AssociatedStandards": [
{
"StandardsId": "CISA"
},
{
"StandardsId": "CIS-1.5"
},
{
"StandardsId": "AWS-Foundational-Security-Best-Practices"
},
{
"StandardsId": "RBI-Cyber-Security-Framework"
},
{
"StandardsId": "FFIEC"
}
]
},
"Remediation": {
"Recommendation": {
@@ -205,7 +243,30 @@ Hereunder is the structure for each of the supported report formats by Prowler:
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": []
"RelatedRequirements": [
"CISA your-systems-2 booting-up-thing-to-do-first-3",
"CIS-1.5 2.3.2",
"AWS-Foundational-Security-Best-Practices rds",
"RBI-Cyber-Security-Framework annex_i_6",
"FFIEC d3-cc-pm-b-1 d3-cc-pm-b-3"
],
"AssociatedStandards": [
{
"StandardsId": "CISA"
},
{
"StandardsId": "CIS-1.5"
},
{
"StandardsId": "AWS-Foundational-Security-Best-Practices"
},
{
"StandardsId": "RBI-Cyber-Security-Framework"
},
{
"StandardsId": "FFIEC"
}
]
},
"Remediation": {
"Recommendation": {

View File

@@ -37,6 +37,7 @@ nav:
- Logging: tutorials/logging.md
- Allowlist: tutorials/allowlist.md
- Pentesting: tutorials/pentesting.md
- Developer Guide: tutorials/developer-guide.md
- AWS:
- Assume Role: tutorials/aws/role-assumption.md
- AWS Security Hub: tutorials/aws/securityhub.md
@@ -50,6 +51,7 @@ nav:
- Azure:
- Authentication: tutorials/azure/authentication.md
- Subscriptions: tutorials/azure/subscriptions.md
- Developer Guide: tutorials/developer-guide.md
- Security: security.md
- Contact Us: contact.md
- Troubleshooting: troubleshooting.md

View File

@@ -4,7 +4,7 @@ AWSTemplateFormatVersion: '2010-09-09'
# aws cloudformation create-stack \
# --capabilities CAPABILITY_IAM --capabilities CAPABILITY_NAMED_IAM \
# --template-body "file://create_role_to_assume_cfn.yaml" \
# --stack-name "ProwlerExecRole" \
# --stack-name "ProwlerScanRole" \
# --parameters "ParameterKey=AuthorisedARN,ParameterValue=arn:aws:iam::123456789012:root"
#
Description: |
@@ -13,7 +13,7 @@ Description: |
account to assume that role. The role name and the ARN of the trusted user can all be passed
to the CloudFormation stack as parameters. Then you can run Prowler to perform a security
assessment with a command like:
./prowler -A <THIS_ACCOUNT_ID> -R ProwlerExecRole
prowler --role ProwlerScanRole.ARN
Parameters:
AuthorisedARN:
Description: |
@@ -22,12 +22,12 @@ Parameters:
Type: String
ProwlerRoleName:
Description: |
Name of the IAM role that will have these policies attached. Default: ProwlerExecRole
Name of the IAM role that will have these policies attached. Default: ProwlerScanRole
Type: String
Default: 'ProwlerExecRole'
Default: 'ProwlerScanRole'
Resources:
ProwlerExecRole:
ProwlerScanRole:
Type: AWS::IAM::Role
Properties:
AssumeRolePolicyDocument:
@@ -42,31 +42,49 @@ Resources:
# Bool:
# 'aws:MultiFactorAuthPresent': true
# This is 12h that is maximum allowed, Minimum is 3600 = 1h
# to take advantage of this use -T like in './prowler -A <ACCOUNT_ID_TO_ASSUME> -R ProwlerExecRole -T 43200 -M text,html'
# to take advantage of this use -T like in './prowler --role ProwlerScanRole.ARN -T 43200'
MaxSessionDuration: 43200
ManagedPolicyArns:
- 'arn:aws:iam::aws:policy/SecurityAudit'
- 'arn:aws:iam::aws:policy/job-function/ViewOnlyAccess'
RoleName: !Sub ${ProwlerRoleName}
Policies:
- PolicyName: ProwlerExecRoleAdditionalViewPrivileges
- PolicyName: ProwlerScanRoleAdditionalViewPrivileges
PolicyDocument:
Version : '2012-10-17'
Statement:
- Effect: Allow
Action:
- 'ds:ListAuthorizedApplications'
- 'account:Get*'
- 'appstream:Describe*'
- 'appstream:List*'
- 'codeartifact:List*'
- 'codebuild:BatchGet*'
- 'ds:Get*'
- 'ds:Describe*'
- 'ds:List*'
- 'ec2:GetEbsEncryptionByDefault'
- 'ecr:Describe*'
- 'elasticfilesystem:DescribeBackupPolicy'
- 'glue:GetConnections'
- 'glue:GetSecurityConfiguration'
- 'glue:GetSecurityConfiguration*'
- 'glue:SearchTables'
- 'lambda:GetFunction'
- 'lambda:GetFunction*'
- 'macie2:GetMacieSession'
- 's3:GetAccountPublicAccessBlock'
- 'shield:DescribeProtection'
- 'shield:GetSubscriptionState'
- 'securityhub:BatchImportFindings'
- 'securityhub:GetFindings'
- 'ssm:GetDocument'
- 'support:Describe*'
- 'tag:GetTagKeys'
Resource: '*'
- PolicyName: ProwlerScanRoleAdditionalViewPrivilegesApiGateway
PolicyDocument:
Version : '2012-10-17'
Statement:
- Effect: Allow
Action:
- 'apigateway:GET'
Resource: 'arn:aws:apigateway:*::/restapis/*'

View File

@@ -3,7 +3,9 @@
"Statement": [
{
"Action": [
"account:Get*",
"appstream:Describe*",
"appstream:List*",
"codeartifact:List*",
"codebuild:BatchGet*",
"ds:Describe*",

2641
poetry.lock generated Normal file

File diff suppressed because it is too large Load Diff

View File

@@ -10,7 +10,6 @@ from prowler.lib.check.check import (
exclude_checks_to_run,
exclude_services_to_run,
execute_checks,
get_checks_from_input_arn,
list_categories,
list_services,
print_categories,
@@ -28,13 +27,16 @@ from prowler.lib.outputs.html import add_html_footer, fill_html_overview_statist
from prowler.lib.outputs.json import close_json
from prowler.lib.outputs.outputs import extract_findings_statistics, send_to_s3_bucket
from prowler.lib.outputs.summary_table import display_summary_table
from prowler.providers.aws.lib.allowlist.allowlist import parse_allowlist_file
from prowler.providers.aws.lib.quick_inventory.quick_inventory import quick_inventory
from prowler.providers.aws.lib.security_hub.security_hub import (
resolve_security_hub_previous_findings,
)
from prowler.providers.common.audit_info import set_provider_audit_info
from prowler.providers.common.allowlist import set_provider_allowlist
from prowler.providers.common.audit_info import (
set_provider_audit_info,
set_provider_execution_parameters,
)
from prowler.providers.common.outputs import set_provider_output_options
from prowler.providers.common.quick_inventory import run_provider_quick_inventory
def prowler():
@@ -79,26 +81,19 @@ def prowler():
# Load compliance frameworks
logger.debug("Loading compliance frameworks from .json files")
# Load the compliance framework if specified with --compliance
# If some compliance argument is specified we have to load it
if (
args.list_compliance
or args.list_compliance_requirements
or compliance_framework
):
bulk_compliance_frameworks = bulk_load_compliance_frameworks(provider)
# Complete checks metadata with the compliance framework specification
update_checks_metadata_with_compliance(
bulk_compliance_frameworks, bulk_checks_metadata
bulk_compliance_frameworks = bulk_load_compliance_frameworks(provider)
# Complete checks metadata with the compliance framework specification
update_checks_metadata_with_compliance(
bulk_compliance_frameworks, bulk_checks_metadata
)
if args.list_compliance:
print_compliance_frameworks(bulk_compliance_frameworks)
sys.exit()
if args.list_compliance_requirements:
print_compliance_requirements(
bulk_compliance_frameworks, args.list_compliance_requirements
)
if args.list_compliance:
print_compliance_frameworks(bulk_compliance_frameworks)
sys.exit()
if args.list_compliance_requirements:
print_compliance_requirements(
bulk_compliance_frameworks, args.list_compliance_requirements
)
sys.exit()
sys.exit()
# Load checks to execute
checks_to_execute = load_checks_to_execute(
@@ -134,26 +129,22 @@ def prowler():
# Set the audit info based on the selected provider
audit_info = set_provider_audit_info(provider, args.__dict__)
# Once the audit_info is set and we have the eventual checks from arn, it is time to exclude the others
# Once the audit_info is set and we have the eventual checks based on the resource identifier,
# it is time to check what Prowler's checks are going to be executed
if audit_info.audit_resources:
checks_to_execute = get_checks_from_input_arn(
audit_info.audit_resources, provider
)
checks_to_execute = set_provider_execution_parameters(provider, audit_info)
# Parse content from Allowlist file and get it, if necessary, from S3
if provider == "aws" and args.allowlist_file:
allowlist_file = parse_allowlist_file(audit_info, args.allowlist_file)
else:
allowlist_file = None
# Parse Allowlist
allowlist_file = set_provider_allowlist(provider, audit_info, args)
# Setting output options based on the selected provider
# Set output options based on the selected provider
audit_output_options = set_provider_output_options(
provider, args, audit_info, allowlist_file, bulk_checks_metadata
)
# Quick Inventory for AWS
if provider == "aws" and args.quick_inventory:
quick_inventory(audit_info, args.output_directory)
# Run the quick inventory for the provider if available
if hasattr(args, "quick_inventory") and args.quick_inventory:
run_provider_quick_inventory(provider, audit_info, args.output_directory)
sys.exit()
# Execute checks
@@ -203,7 +194,7 @@ def prowler():
)
# Resolve previous fails of Security Hub
if provider == "aws" and args.security_hub:
if provider == "aws" and args.security_hub and not args.skip_sh_update:
resolve_security_hub_previous_findings(args.output_directory, audit_info)
# Display summary table
@@ -216,14 +207,15 @@ def prowler():
)
if compliance_framework and findings:
# Display compliance table
display_compliance_table(
findings,
bulk_checks_metadata,
compliance_framework,
audit_output_options.output_filename,
audit_output_options.output_directory,
)
for compliance in compliance_framework:
# Display compliance table
display_compliance_table(
findings,
bulk_checks_metadata,
compliance,
audit_output_options.output_filename,
audit_output_options.output_directory,
)
# If there are failed findings exit code 3, except if -z is input
if not args.ignore_exit_code_3 and stats["total_fail"] > 0:

View File

@@ -533,6 +533,7 @@
"Id": "2.1.5",
"Description": "Ensure that S3 Buckets are configured with 'Block public access (bucket settings)'",
"Checks": [
"s3_bucket_level_public_access_block",
"s3_account_level_public_access_blocks"
],
"Attributes": [

View File

@@ -533,6 +533,7 @@
"Id": "2.1.5",
"Description": "Ensure that S3 Buckets are configured with 'Block public access (bucket settings)'",
"Checks": [
"s3_bucket_level_public_access_block",
"s3_account_level_public_access_blocks"
],
"Attributes": [

View File

@@ -1626,7 +1626,7 @@
}
],
"Checks": [
"ec2_securitygroup_in_use_without_ingress_filtering"
"ec2_securitygroup_allow_ingress_from_internet_to_any_port"
]
},
{

View File

@@ -1,15 +1,17 @@
import json
import os
import pathlib
from datetime import datetime, timezone
from os import getcwd
import requests
import yaml
from prowler.lib.logger import logger
timestamp = datetime.today()
timestamp_utc = datetime.now(timezone.utc).replace(tzinfo=timezone.utc)
prowler_version = "3.2.0"
prowler_version = "3.3.2"
html_logo_url = "https://github.com/prowler-cloud/prowler/"
html_logo_img = "https://user-images.githubusercontent.com/3985464/113734260-7ba06900-96fb-11eb-82bc-d4f68a1e2710.png"
@@ -17,29 +19,18 @@ orange_color = "\033[38;5;208m"
banner_color = "\033[1;92m"
# Compliance
compliance_specification_dir = "./compliance"
available_compliance_frameworks = [
"ens_rd2022_aws",
"cis_1.4_aws",
"cis_1.5_aws",
"aws_audit_manager_control_tower_guardrails_aws",
"aws_foundational_security_best_practices_aws",
"cisa_aws",
"fedramp_low_revision_4_aws",
"fedramp_moderate_revision_4_aws",
"ffiec_aws",
"gdpr_aws",
"gxp_eu_annex_11_aws",
"gxp_21_cfr_part_11_aws",
"hipaa_aws",
"nist_800_53_revision_4_aws",
"nist_800_53_revision_5_aws",
"nist_800_171_revision_2_aws",
"nist_csf_1.1_aws",
"pci_3.2.1_aws",
"rbi_cyber_security_framework_aws",
"soc2_aws",
]
actual_directory = pathlib.Path(os.path.dirname(os.path.realpath(__file__)))
compliance_aws_dir = f"{actual_directory}/../compliance/aws"
available_compliance_frameworks = []
with os.scandir(compliance_aws_dir) as entries:
for entry in entries:
# Collect every .json compliance file name, without the extension
if entry.is_file() and entry.name.endswith(".json"):
available_compliance_frameworks.append(entry.name.removesuffix(".json"))
# AWS services-regions matrix json
aws_services_json_file = "aws_regions_by_service.json"
@@ -54,6 +45,20 @@ html_file_suffix = ".html"
config_yaml = f"{pathlib.Path(os.path.dirname(os.path.realpath(__file__)))}/config.yaml"
def check_current_version(prowler_version):
try:
release_response = requests.get(
"https://api.github.com/repos/prowler-cloud/prowler/tags"
)
latest_version = json.loads(release_response.text)[0]["name"]
if latest_version != prowler_version:
return f"(latest is {latest_version}, upgrade for the latest features)"
else:
return "(it is the latest version, yay!)"
except Exception:
return ""
def change_config_var(variable, value):
try:
with open(config_yaml) as f:

View File

@@ -108,8 +108,8 @@ def exclude_services_to_run(
# Load checks from checklist.json
def parse_checks_from_file(input_file: str, provider: str) -> set:
checks_to_execute = set()
f = open_file(input_file)
json_file = parse_json_file(f)
with open_file(input_file) as f:
json_file = parse_json_file(f)
for check_name in json_file[provider]:
checks_to_execute.add(check_name)
@@ -122,7 +122,10 @@ def list_services(provider: str) -> set():
checks_tuple = recover_checks_from_provider(provider)
for _, check_path in checks_tuple:
# Format: /absolute_path/prowler/providers/{provider}/services/{service_name}/{check_name}
service_name = check_path.split("/")[-2]
if os.name == "nt":
service_name = check_path.split("\\")[-2]
else:
service_name = check_path.split("/")[-2]
available_services.add(service_name)
return sorted(available_services)
@@ -179,17 +182,18 @@ def print_compliance_requirements(
bulk_compliance_frameworks: dict, compliance_frameworks: list
):
for compliance_framework in compliance_frameworks:
for compliance in bulk_compliance_frameworks.values():
# Workaround until we have more Compliance Frameworks
split_compliance = compliance_framework.split("_")
framework = split_compliance[0].upper()
version = split_compliance[1].upper()
provider = split_compliance[2].upper()
if framework in compliance.Framework and compliance.Version == version:
for key in bulk_compliance_frameworks.keys():
framework = bulk_compliance_frameworks[key].Framework
provider = bulk_compliance_frameworks[key].Provider
version = bulk_compliance_frameworks[key].Version
requirements = bulk_compliance_frameworks[key].Requirements
# We can list the compliance requirements for a given framework using the
# bulk_compliance_frameworks keys since they are the compliance specification file name
if compliance_framework == key:
print(
f"Listing {framework} {version} {provider} Compliance Requirements:\n"
)
for requirement in compliance.Requirements:
for requirement in requirements:
checks = ""
for check in requirement.Checks:
checks += f" {Fore.YELLOW}\t\t{check}\n{Style.RESET_ALL}"
@@ -352,6 +356,22 @@ def execute_checks(
audit_progress=0,
)
if os.name != "nt":
try:
from resource import RLIMIT_NOFILE, getrlimit
# Check ulimit for the maximum system open files
soft, _ = getrlimit(RLIMIT_NOFILE)
if soft < 4096:
logger.warning(
f"Your session file descriptors limit ({soft} open files) is below 4096. We recommend to increase it to avoid errors. Solve it running this command `ulimit -n 4096`. For more info visit https://docs.prowler.cloud/en/latest/troubleshooting/"
)
except Exception as error:
logger.error("Unable to retrieve ulimit default settings")
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
# Execution with the --only-logs flag
if audit_output_options.only_logs:
for check_name in checks_to_execute:
@@ -503,26 +523,3 @@ def recover_checks_from_service(service_list: list, provider: str) -> list:
# if service_name in group_list: checks_from_arn.add(check_name)
checks.add(check_name)
return checks
def get_checks_from_input_arn(audit_resources: list, provider: str) -> set:
"""get_checks_from_input_arn gets the list of checks from the input arns"""
checks_from_arn = set()
# Handle if there are audit resources so only their services are executed
if audit_resources:
service_list = []
for resource in audit_resources:
service = resource.split(":")[2]
# Parse services when they are different in the ARNs
if service == "lambda":
service = "awslambda"
if service == "elasticloadbalancing":
service = "elb"
elif service == "logs":
service = "cloudwatch"
service_list.append(service)
checks_from_arn = recover_checks_from_service(service_list, provider)
# Return final checks list
return checks_from_arn

View File

@@ -1,10 +1,12 @@
import sys
from pydantic import parse_obj_as
from prowler.lib.check.compliance_models import (
Compliance_Base_Model,
Compliance_Requirement,
)
from prowler.lib.check.models import Check_Report_AWS
from prowler.lib.check.models import Check_Metadata_Model
from prowler.lib.logger import logger
@@ -62,44 +64,33 @@ def update_checks_metadata_with_compliance(
# Include the compliance framework for the check
check_compliance.append(compliance)
# Create metadata for Manual Control
manual_check_metadata = """{
"Provider" : "aws",
"CheckID" : "manual_check",
"CheckTitle" : "Manual Check",
"CheckType" : [],
"ServiceName" : "",
"SubServiceName" : "",
"ResourceIdTemplate" : "",
"Severity" : "",
"ResourceType" : "",
"Description" : "",
"Risk" : "",
"RelatedUrl" : "",
manual_check_metadata = {
"Provider": "aws",
"CheckID": "manual_check",
"CheckTitle": "Manual Check",
"CheckType": [],
"ServiceName": "",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "",
"ResourceType": "",
"Description": "",
"Risk": "",
"RelatedUrl": "",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "",
"Url": ""
}
"Code": {"CLI": "", "NativeIaC": "", "Other": "", "Terraform": ""},
"Recommendation": {"Text": "", "Url": ""},
},
"Categories" : [],
"Tags" : {},
"DependsOn" : [],
"RelatedTo" : [],
"Notes" : ""
}"""
manual_check = Check_Report_AWS(manual_check_metadata)
manual_check.status = "INFO"
manual_check.status_extended = "Manual check"
manual_check.resource_id = "manual_check"
manual_check.Compliance = check_compliance
"Categories": [],
"Tags": {},
"DependsOn": [],
"RelatedTo": [],
"Notes": "",
}
manual_check = parse_obj_as(Check_Metadata_Model, manual_check_metadata)
# Save it into the check's metadata
bulk_checks_metadata["manual_check"] = manual_check
bulk_checks_metadata["manual_check"].Compliance = check_compliance
return bulk_checks_metadata
except Exception as e:

View File

@@ -2,7 +2,7 @@ import sys
from enum import Enum
from typing import Optional, Union
from pydantic import BaseModel, ValidationError
from pydantic import BaseModel, ValidationError, root_validator
from prowler.lib.logger import logger
@@ -107,10 +107,21 @@ class Compliance_Base_Model(BaseModel):
Framework: str
Provider: str
Version: str
Version: Optional[str]
Description: str
Requirements: list[Compliance_Requirement]
@root_validator(pre=True)
# noqa: F841 - since vulture raises unused variable 'cls'
def framework_and_provider_must_not_be_empty(cls, values): # noqa: F841
framework, provider = (
values.get("Framework"),
values.get("Provider"),
)
if framework == "" or provider == "":
raise ValueError("Framework or Provider must not be empty")
return values
# Testing Pending
def load_compliance_framework(

View File

@@ -48,7 +48,6 @@ class Check_Metadata_Model(BaseModel):
RelatedUrl: str
Remediation: Remediation
Categories: list[str]
Tags: dict
DependsOn: list[str]
RelatedTo: list[str]
Notes: str

View File

@@ -4,6 +4,7 @@ from argparse import RawTextHelpFormatter
from prowler.config.config import (
available_compliance_frameworks,
check_current_version,
default_output_directory,
prowler_version,
)
@@ -36,7 +37,7 @@ Detailed documentation at https://docs.prowler.cloud
"-v",
"--version",
action="version",
version=f"Prowler {prowler_version}",
version=f"Prowler {prowler_version} {check_current_version(prowler_version)}",
help="show Prowler version",
)
# Common arguments parser
@@ -316,6 +317,11 @@ Detailed documentation at https://docs.prowler.cloud
action="store_true",
help="Send check output to AWS Security Hub",
)
aws_security_hub_subparser.add_argument(
"--skip-sh-update",
action="store_true",
help="Skip updating previous findings of Prowler in Security Hub",
)
# AWS Quick Inventory
aws_quick_inventory_subparser = aws_parser.add_argument_group("Quick Inventory")
aws_quick_inventory_subparser.add_argument(

View File

@@ -4,7 +4,8 @@ from csv import DictWriter
from colorama import Fore, Style
from tabulate import tabulate
from prowler.config.config import timestamp, orange_color
from prowler.config.config import orange_color, timestamp
from prowler.lib.check.models import Check_Report
from prowler.lib.logger import logger
from prowler.lib.outputs.models import (
Check_Output_CSV_CIS,
@@ -18,7 +19,13 @@ def add_manual_controls(output_options, audit_info, file_descriptors):
try:
# Check if MANUAL control was already added to output
if "manual_check" in output_options.bulk_checks_metadata:
manual_finding = output_options.bulk_checks_metadata["manual_check"]
manual_finding = Check_Report(
output_options.bulk_checks_metadata["manual_check"].json()
)
manual_finding.status = "INFO"
manual_finding.status_extended = "Manual check"
manual_finding.resource_id = "manual_check"
manual_finding.region = ""
fill_compliance(
output_options, manual_finding, audit_info, file_descriptors
)
@@ -167,7 +174,7 @@ def display_compliance_table(
output_directory: str,
):
try:
if "ens_rd2022_aws" in compliance_framework:
if "ens_rd2022_aws" == compliance_framework:
marcos = {}
ens_compliance_table = {
"Proveedor": [],
@@ -266,9 +273,9 @@ def display_compliance_table(
)
print(f"\nResultados detallados de {compliance_fm} en:")
print(
f" - CSV: {output_directory}/{output_filename}_{compliance_framework[0]}.csv\n"
f" - CSV: {output_directory}/{output_filename}_{compliance_framework}.csv\n"
)
elif "cis_1." in str(compliance_framework):
elif "cis_1." in compliance_framework:
sections = {}
cis_compliance_table = {
"Provider": [],
@@ -281,8 +288,9 @@ def display_compliance_table(
check = bulk_checks_metadata[finding.check_metadata.CheckID]
check_compliances = check.Compliance
for compliance in check_compliances:
if compliance.Framework == "CIS" and compliance.Version in str(
compliance_framework
if (
compliance.Framework == "CIS"
and compliance.Version in compliance_framework
):
compliance_version = compliance.Version
compliance_fm = compliance.Framework
@@ -360,12 +368,12 @@ def display_compliance_table(
)
print(f"\nDetailed results of {compliance_fm} are in:")
print(
f" - CSV: {output_directory}/{output_filename}_{compliance_framework[0]}.csv\n"
f" - CSV: {output_directory}/{output_filename}_{compliance_framework}.csv\n"
)
else:
print(f"\nDetailed results of {compliance_framework[0].upper()} are in:")
print(f"\nDetailed results of {compliance_framework.upper()} are in:")
print(
f" - CSV: {output_directory}/{output_filename}_{compliance_framework[0]}.csv\n"
f" - CSV: {output_directory}/{output_filename}_{compliance_framework}.csv\n"
)
except Exception as error:
logger.critical(

View File

@@ -9,6 +9,12 @@ from prowler.config.config import (
timestamp,
)
from prowler.lib.logger import logger
from prowler.lib.outputs.models import (
get_check_compliance,
parse_html_string,
unroll_dict,
unroll_tags,
)
from prowler.lib.utils.utils import open_file
@@ -183,16 +189,16 @@ def add_html_header(file_descriptor, audit_info):
<tr>
<th scope="col">Status</th>
<th scope="col">Severity</th>
<th style="width:5%" scope="col">Service Name</th>
<th scope="col">Service Name</th>
<th scope="col">Region</th>
<th style="width:20%" scope="col">Check ID</th>
<th style="width:20%" scope="col">Check Title</th>
<th scope="col">Resource ID</th>
<th style="width:15%" scope="col">Check Description</th>
<th scope="col">Check ID</th>
<th scope="col">Resource Tags</th>
<th scope="col">Status Extended</th>
<th scope="col">Risk</th>
<th scope="col">Recomendation</th>
<th style="width:5%" scope="col">Recomendation URL</th>
<th scope="col">Compliance</th>
</tr>
</thead>
<tbody>
@@ -204,7 +210,7 @@ def add_html_header(file_descriptor, audit_info):
)
def fill_html(file_descriptor, finding):
def fill_html(file_descriptor, finding, output_options):
row_class = "p-3 mb-2 bg-success-custom"
if finding.status == "INFO":
row_class = "table-info"
@@ -219,14 +225,14 @@ def fill_html(file_descriptor, finding):
<td>{finding.check_metadata.Severity}</td>
<td>{finding.check_metadata.ServiceName}</td>
<td>{finding.region}</td>
<td>{finding.check_metadata.CheckID.replace("_", "<wbr>_")}</td>
<td>{finding.check_metadata.CheckTitle}</td>
<td>{finding.resource_id.replace("<", "&lt;").replace(">", "&gt;").replace("_", "<wbr>_")}</td>
<td>{finding.check_metadata.Description}</td>
<td>{finding.check_metadata.CheckID.replace("_", "<wbr>_")}</td>
<td>{parse_html_string(unroll_tags(finding.resource_tags))}</td>
<td>{finding.status_extended.replace("<", "&lt;").replace(">", "&gt;").replace("_", "<wbr>_")}</td>
<td><p class="show-read-more">{finding.check_metadata.Risk}</p></td>
<td><p class="show-read-more">{finding.check_metadata.Remediation.Recommendation.Text}</p></td>
<td><a class="read-more" href="{finding.check_metadata.Remediation.Recommendation.Url}"><i class="fas fa-external-link-alt"></i></a></td>
<td><p class="show-read-more">{finding.check_metadata.Remediation.Recommendation.Text}</p> <a class="read-more" href="{finding.check_metadata.Remediation.Recommendation.Url}"><i class="fas fa-external-link-alt"></i></a></td>
<td><p class="show-read-more">{parse_html_string(unroll_dict(get_check_compliance(finding, finding.check_metadata.Provider, output_options)))}</p></td>
</tr>
"""
)

View File

@@ -8,11 +8,17 @@ from prowler.config.config import (
timestamp_utc,
)
from prowler.lib.logger import logger
from prowler.lib.outputs.models import Compliance, ProductFields, Resource, Severity
from prowler.lib.outputs.models import (
Compliance,
ProductFields,
Resource,
Severity,
get_check_compliance,
)
from prowler.lib.utils.utils import hash_sha512, open_file
def fill_json_asff(finding_output, audit_info, finding):
def fill_json_asff(finding_output, audit_info, finding, output_options):
# Check if there are no resources in the finding
if finding.resource_arn == "":
if finding.resource_id == "":
@@ -31,7 +37,7 @@ def fill_json_asff(finding_output, audit_info, finding):
) = finding_output.CreatedAt = timestamp_utc.strftime("%Y-%m-%dT%H:%M:%SZ")
finding_output.Severity = Severity(Label=finding.check_metadata.Severity.upper())
finding_output.Title = finding.check_metadata.CheckTitle
finding_output.Description = finding.check_metadata.Description
finding_output.Description = finding.status_extended
finding_output.Resources = [
Resource(
Id=finding.resource_arn,
@@ -40,14 +46,22 @@ def fill_json_asff(finding_output, audit_info, finding):
Region=finding.region,
)
]
# Check if any Requirement has > 64 characters
check_types = []
for type in finding.check_metadata.CheckType:
check_types.extend(type.split("/"))
# Iterate for each compliance framework
compliance_summary = []
associated_standards = []
check_compliance = get_check_compliance(finding, "aws", output_options)
for key, value in check_compliance.items():
associated_standards.append({"StandardsId": key})
item = f"{key} {' '.join(value)}"
if len(item) > 64:
item = item[0:63]
compliance_summary.append(item)
# Add ED to PASS or FAIL (PASSED/FAILED)
finding_output.Compliance = Compliance(
Status=finding.status + "ED",
RelatedRequirements=check_types,
AssociatedStandards=associated_standards,
RelatedRequirements=compliance_summary,
)
finding_output.Remediation = {
"Recommendation": finding.check_metadata.Remediation.Recommendation

View File

@@ -11,7 +11,26 @@ from prowler.lib.logger import logger
from prowler.providers.aws.lib.audit_info.models import AWS_Organizations_Info
def generate_provider_output_csv(provider: str, finding, audit_info, mode: str, fd):
def get_check_compliance(finding, provider, output_options):
check_compliance = {}
# We have to retrieve all the check's compliance requirements
for compliance in output_options.bulk_checks_metadata[
finding.check_metadata.CheckID
].Compliance:
compliance_fw = compliance.Framework
if compliance.Version:
compliance_fw = f"{compliance_fw}-{compliance.Version}"
if compliance.Provider == provider.upper():
if compliance_fw not in check_compliance:
check_compliance[compliance_fw] = []
for requirement in compliance.Requirements:
check_compliance[compliance_fw].append(requirement.Id)
return check_compliance
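# Example (illustrative): for a check mapped to CIS 1.4 and CIS 1.5 requirement 1.20,
# get_check_compliance returns {"CIS-1.4": ["1.20"], "CIS-1.5": ["1.20"]}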
def generate_provider_output_csv(
provider: str, finding, audit_info, mode: str, fd, output_options
):
"""
generate_provider_output_csv configures automatically the CSV output based on the selected provider and returns the corresponding output model.
"""
@@ -32,6 +51,9 @@ def generate_provider_output_csv(provider: str, finding, audit_info, mode: str,
data[
"finding_unique_id"
] = f"prowler-{provider}-{finding.check_metadata.CheckID}-{finding.subscription}-{finding.resource_id}"
data["compliance"] = unroll_dict(
get_check_compliance(finding, provider, output_options)
)
finding_output = output_model(**data)
if provider == "aws":
@@ -43,6 +65,9 @@ def generate_provider_output_csv(provider: str, finding, audit_info, mode: str,
data[
"finding_unique_id"
] = f"prowler-{provider}-{finding.check_metadata.CheckID}-{audit_info.audited_account}-{finding.region}-{finding.resource_id}"
data["compliance"] = unroll_dict(
get_check_compliance(finding, provider, output_options)
)
finding_output = output_model(**data)
if audit_info.organizations_metadata:
@@ -91,7 +116,7 @@ def fill_common_data_csv(finding: dict) -> dict:
"severity": finding.check_metadata.Severity,
"resource_type": finding.check_metadata.ResourceType,
"resource_details": finding.resource_details,
"resource_tags": finding.resource_tags,
"resource_tags": unroll_tags(finding.resource_tags),
"description": finding.check_metadata.Description,
"risk": finding.check_metadata.Risk,
"related_url": finding.check_metadata.RelatedUrl,
@@ -113,26 +138,99 @@ def fill_common_data_csv(finding: dict) -> dict:
"remediation_recommendation_code_other": (
finding.check_metadata.Remediation.Code.Other
),
"categories": __unroll_list__(finding.check_metadata.Categories),
"depends_on": __unroll_list__(finding.check_metadata.DependsOn),
"related_to": __unroll_list__(finding.check_metadata.RelatedTo),
"categories": unroll_list(finding.check_metadata.Categories),
"depends_on": unroll_list(finding.check_metadata.DependsOn),
"related_to": unroll_list(finding.check_metadata.RelatedTo),
"notes": finding.check_metadata.Notes,
}
return data
def __unroll_list__(listed_items: list):
def unroll_list(listed_items: list):
unrolled_items = ""
separator = "|"
for item in listed_items:
if not unrolled_items:
unrolled_items = f"{item}"
else:
unrolled_items = f"{unrolled_items}{separator}{item}"
if listed_items:
for item in listed_items:
if not unrolled_items:
unrolled_items = f"{item}"
else:
unrolled_items = f"{unrolled_items} {separator} {item}"
return unrolled_items
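Usage sketch (assumed inputs) for the unroll_list defined above; the new version pads the separator with spaces and tolerates empty lists:

assert unroll_list(["internet-exposed", "secrets"]) == "internet-exposed | secrets"
assert unroll_list([]) == ""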
def unroll_tags(tags: list):
unrolled_items = ""
separator = "|"
if tags:
for item in tags:
# Tag items may be dicts (AWS tag objects) or plain strings
if type(item) == dict:
for key, value in item.items():
if not unrolled_items:
# Distinguish plain {key: value} tags from AWS-style {"Key": key, "Value": value} pairs
if "Key" != key and "Value" != key:
unrolled_items = f"{key}={value}"
else:
if "Key" == key:
unrolled_items = f"{value}="
else:
unrolled_items = f"{value}"
else:
if "Key" != key and "Value" != key:
unrolled_items = (
f"{unrolled_items} {separator} {key}={value}"
)
else:
if "Key" == key:
unrolled_items = (
f"{unrolled_items} {separator} {value}="
)
else:
unrolled_items = f"{unrolled_items}{value}"
elif not unrolled_items:
unrolled_items = f"{item}"
else:
unrolled_items = f"{unrolled_items} {separator} {item}"
return unrolled_items
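The two tag shapes unroll_tags distinguishes, sketched with hypothetical tags (for AWS-style pairs, dict iteration order keeps "Key" before "Value"):

# AWS-style {"Key": ..., "Value": ...} pairs collapse to "key=value"
assert unroll_tags([{"Key": "env", "Value": "prod"}]) == "env=prod"
# Plain {key: value} tags are joined pairwise with " | "
assert unroll_tags([{"env": "prod", "team": "sec"}]) == "env=prod | team=sec"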
def unroll_dict(dict: dict):
unrolled_items = ""
separator = "|"
for key, value in dict.items():
if type(value) == list:
value = ", ".join(value)
if not unrolled_items:
unrolled_items = f"{key}: {value}"
else:
unrolled_items = f"{unrolled_items} {separator} {key}: {value}"
return unrolled_items
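For example (hypothetical frameworks), this is the string that ends up in the CSV compliance column:

compliance = {"CIS-1.4": ["1.12", "1.16"], "ENS-RD2022": ["op.acc.1"]}
assert unroll_dict(compliance) == "CIS-1.4: 1.12, 1.16 | ENS-RD2022: op.acc.1"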
def parse_html_string(str: str):
string = ""
for elem in str.split(" | "):
if elem:
string += f"\n&#x2022;{elem}\n"
return string
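Sketch of the transformation, assuming a pipe-separated string like those produced by the unroll helpers above:

assert parse_html_string("1.12 | 1.16") == "\n&#x2022;1.12\n\n&#x2022;1.16\n"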
def parse_json_tags(tags: list):
dict_tags = {}
if tags and tags != [{}] and tags != [None]:
for tag in tags:
if "Key" in tag and "Value" in tag:
dict_tags[tag["Key"]] = tag["Value"]
else:
dict_tags.update(tag)
return dict_tags
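Illustrative inputs (hypothetical tags) covering the three branches of parse_json_tags:

assert parse_json_tags([{"Key": "env", "Value": "prod"}]) == {"env": "prod"}
assert parse_json_tags([{"owner": "sec-team"}]) == {"owner": "sec-team"}
assert parse_json_tags([None]) == {}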
def generate_csv_fields(format: Any) -> list[str]:
"""Generates the CSV headers for the given class"""
csv_fields = []
@@ -162,7 +260,7 @@ class Check_Output_CSV(BaseModel):
severity: str
resource_type: str
resource_details: str
resource_tags: list
resource_tags: str
description: str
risk: str
related_url: str
@@ -172,6 +270,7 @@ class Check_Output_CSV(BaseModel):
remediation_recommendation_code_terraform: str
remediation_recommendation_code_cli: str
remediation_recommendation_code_other: str
compliance: str
categories: str
depends_on: str
related_to: str
@@ -206,7 +305,9 @@ class Azure_Check_Output_CSV(Check_Output_CSV):
resource_name: str = ""
def generate_provider_output_json(provider: str, finding, audit_info, mode: str, fd):
def generate_provider_output_json(
provider: str, finding, audit_info, mode: str, output_options
):
"""
generate_provider_output_json configures the JSON output automatically based on the selected provider and returns the provider-specific Check_Output_JSON object.
"""
@@ -228,6 +329,9 @@ def generate_provider_output_json(provider: str, finding, audit_info, mode: str,
finding_output.ResourceId = finding.resource_id
finding_output.ResourceName = finding.resource_name
finding_output.FindingUniqueId = f"prowler-{provider}-{finding.check_metadata.CheckID}-{finding.subscription}-{finding.resource_id}"
finding_output.Compliance = get_check_compliance(
finding, provider, output_options
)
if provider == "aws":
finding_output.Profile = audit_info.profile
@@ -235,7 +339,11 @@ def generate_provider_output_json(provider: str, finding, audit_info, mode: str,
finding_output.Region = finding.region
finding_output.ResourceId = finding.resource_id
finding_output.ResourceArn = finding.resource_arn
finding_output.ResourceTags = parse_json_tags(finding.resource_tags)
finding_output.FindingUniqueId = f"prowler-{provider}-{finding.check_metadata.CheckID}-{audit_info.audited_account}-{finding.region}-{finding.resource_id}"
finding_output.Compliance = get_check_compliance(
finding, provider, output_options
)
if audit_info.organizations_metadata:
finding_output.OrganizationsInfo = (
@@ -271,11 +379,11 @@ class Check_Output_JSON(BaseModel):
Severity: str
ResourceType: str
ResourceDetails: str = ""
Tags: dict
Description: str
Risk: str
RelatedUrl: str
Remediation: Remediation
Compliance: Optional[dict]
Categories: List[str]
DependsOn: List[str]
RelatedTo: List[str]
@@ -293,6 +401,7 @@ class Aws_Check_Output_JSON(Check_Output_JSON):
Region: str = ""
ResourceId: str = ""
ResourceArn: str = ""
ResourceTags: list = []
def __init__(self, **metadata):
super().__init__(**metadata)
@@ -300,7 +409,7 @@ class Aws_Check_Output_JSON(Check_Output_JSON):
class Azure_Check_Output_JSON(Check_Output_JSON):
"""
Aws_Check_Output_JSON generates a finding's output in JSON format for the AWS provider.
Azure_Check_Output_JSON generates a finding's output in JSON format for the Azure provider.
"""
Tenant_Domain: str = ""
@@ -409,6 +518,7 @@ class Resource(BaseModel):
class Compliance(BaseModel):
Status: str
RelatedRequirements: List[str]
AssociatedStandards: List[dict]
class Check_Output_JSON_ASFF(BaseModel):

View File

@@ -101,12 +101,16 @@ def report(check_findings, output_options, audit_info):
)
if "html" in file_descriptors:
fill_html(file_descriptors["html"], finding)
fill_html(
file_descriptors["html"], finding, output_options
)
file_descriptors["html"].write("")
if "json-asff" in file_descriptors:
finding_output = Check_Output_JSON_ASFF()
fill_json_asff(finding_output, audit_info, finding)
fill_json_asff(
finding_output, audit_info, finding, output_options
)
json.dump(
finding_output.dict(),
@@ -136,6 +140,7 @@ def report(check_findings, output_options, audit_info):
audit_info,
"csv",
file_descriptors["csv"],
output_options,
)
csv_writer.writerow(finding_output.__dict__)
@@ -145,7 +150,7 @@ def report(check_findings, output_options, audit_info):
finding,
audit_info,
"json",
file_descriptors["json"],
output_options,
)
json.dump(
finding_output.dict(),
@@ -203,6 +208,8 @@ def send_to_s3_bucket(
filename = f"{output_filename}{json_asff_file_suffix}"
elif output_mode == "html":
filename = f"{output_filename}{html_file_suffix}"
else: # Compliance output mode
filename = f"{output_filename}_{output_mode}{csv_file_suffix}"
logger.info(f"Sending outputs to S3 bucket {output_bucket}")
bucket_remote_dir = output_directory
while "prowler/" in bucket_remote_dir: # Check if it is not a custom directory

View File

@@ -1,16 +1,26 @@
import json
import os
import sys
import tempfile
from hashlib import sha512
from io import TextIOWrapper
from os.path import exists
from typing import Any
from detect_secrets import SecretsCollection
from detect_secrets.settings import default_settings
from prowler.lib.logger import logger
def open_file(input_file: str, mode: str = "r") -> TextIOWrapper:
try:
f = open(input_file, mode)
except OSError:
logger.critical(
"Ooops! You reached your user session maximum open files. To solve this issue, increase the shell session limit by running this command `ulimit -n 4096`. For more info visit https://docs.prowler.cloud/en/latest/troubleshooting/"
)
sys.exit(1)
except Exception as e:
logger.critical(
f"{input_file}: {e.__class__.__name__}[{e.__traceback__.tb_lineno}]"
@@ -49,3 +59,20 @@ def file_exists(filename: str):
# create sha512 hash for string
def hash_sha512(string: str) -> str:
return sha512(string.encode("utf-8")).hexdigest()[0:9]
def detect_secrets_scan(data):
temp_data_file = tempfile.NamedTemporaryFile(delete=False)
temp_data_file.write(bytes(data, encoding="raw_unicode_escape"))
temp_data_file.close()
secrets = SecretsCollection()
with default_settings():
secrets.scan_file(temp_data_file.name)
os.remove(temp_data_file.name)
detect_secrets_output = secrets.json()
if detect_secrets_output:
return detect_secrets_output[temp_data_file.name]
else:
return None
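Hedged usage sketch: when secrets are found, the return value is detect-secrets' per-file finding list, whose entries carry at least a type and a line number (the scanned string is made up):

findings = detect_secrets_scan("aws_secret_access_key = 'AKIAIOSFODNN7EXAMPLE'")
if findings:
    for secret in findings:
        print(f"{secret['type']} on line {secret['line_number']}")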

View File

@@ -7,6 +7,7 @@ from botocore.credentials import RefreshableCredentials
from botocore.session import get_session
from prowler.config.config import aws_services_json_file
from prowler.lib.check.check import list_modules, recover_checks_from_service
from prowler.lib.logger import logger
from prowler.lib.utils.utils import open_file, parse_json_file
from prowler.providers.aws.lib.audit_info.models import AWS_Assume_Role, AWS_Audit_Info
@@ -84,7 +85,7 @@ def assume_role(session: session.Session, assumed_role_info: AWS_Assume_Role) ->
if assumed_role_info.external_id:
assumed_credentials = sts_client.assume_role(
RoleArn=assumed_role_info.role_arn,
RoleSessionName="ProwlerProAsessmentSession",
RoleSessionName="ProwlerAsessmentSession",
DurationSeconds=assumed_role_info.session_duration,
ExternalId=assumed_role_info.external_id,
)
@@ -110,8 +111,8 @@ def generate_regional_clients(
regional_clients = {}
# Get json locally
actual_directory = pathlib.Path(os.path.dirname(os.path.realpath(__file__)))
f = open_file(f"{actual_directory}/{aws_services_json_file}")
data = parse_json_file(f)
with open_file(f"{actual_directory}/{aws_services_json_file}") as f:
data = parse_json_file(f)
# Check if it is a subservice
json_regions = data["services"][service]["regions"][
audit_info.audited_partition
@@ -144,8 +145,8 @@ def generate_regional_clients(
def get_aws_available_regions():
try:
actual_directory = pathlib.Path(os.path.dirname(os.path.realpath(__file__)))
f = open_file(f"{actual_directory}/{aws_services_json_file}")
data = parse_json_file(f)
with open_file(f"{actual_directory}/{aws_services_json_file}") as f:
data = parse_json_file(f)
regions = set()
for service in data["services"].values():
@@ -156,3 +157,73 @@ def get_aws_available_regions():
except Exception as error:
logger.error(f"{error.__class__.__name__}: {error}")
return []
def get_checks_from_input_arn(audit_resources: list, provider: str) -> list:
"""get_checks_from_input_arn returns the sorted list of checks that apply to the input ARNs"""
checks_from_arn = set()
# If audit resources were given, execute only the checks of their services
if audit_resources:
services_without_subservices = ["guardduty", "kms", "s3", "elb"]
service_list = set()
sub_service_list = set()
for resource in audit_resources:
service = resource.split(":")[2]
sub_service = resource.split(":")[5].split("/")[0].replace("-", "_")
# WAF services do not have checks
if service != "wafv2" and service != "waf":
# Parse services when they are different in the ARNs
if service == "lambda":
service = "awslambda"
if service == "elasticloadbalancing":
service = "elb"
elif service == "logs":
service = "cloudwatch"
# Check if Prowler has checks in service
try:
list_modules(provider, service)
except ModuleNotFoundError:
# Service is not supported
pass
else:
service_list.add(service)
# Get subservices to execute only applicable checks
if service not in services_without_subservices:
# Parse some specific subservices
if service == "ec2":
if sub_service == "security_group":
sub_service = "securitygroup"
if sub_service == "network_acl":
sub_service = "networkacl"
if sub_service == "image":
sub_service = "ami"
if service == "rds":
if sub_service == "cluster_snapshot":
sub_service = "snapshot"
sub_service_list.add(sub_service)
else:
sub_service_list.add(service)
checks = recover_checks_from_service(service_list, provider)
# Filter only checks with audited subservices
for check in checks:
if any(sub_service in check for sub_service in sub_service_list):
if not (sub_service == "policy" and "password_policy" in check):
checks_from_arn.add(check)
# Return final checks list
return sorted(checks_from_arn)
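For example, a hypothetical security-group ARN is parsed as follows before the service/subservice filtering above:

arn = "arn:aws:ec2:eu-west-1:123456789012:security-group/sg-0123456789abcdef0"
service = arn.split(":")[2]  # "ec2"
sub_service = arn.split(":")[5].split("/")[0].replace("-", "_")  # "security_group"
# "security_group" is then normalized to "securitygroup", so only the
# ec2_securitygroup_* checks are selected.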
def get_regions_from_audit_resources(audit_resources: list) -> list:
"""get_regions_from_audit_resources gets the regions from the audit resources arns"""
audited_regions = []
for resource in audit_resources:
region = resource.split(":")[3]
if region and region not in audited_regions: # Check if arn has a region
audited_regions.append(region)
if audited_regions:
return audited_regions
return None
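Short sketch with hypothetical ARNs; duplicates are dropped and global ARNs with an empty region field (e.g. IAM) are skipped:

arns = [
    "arn:aws:ec2:eu-west-1:123456789012:instance/i-1",
    "arn:aws:iam::123456789012:role/example",
    "arn:aws:ec2:eu-west-1:123456789012:instance/i-2",
]
assert get_regions_from_audit_resources(arns) == ["eu-west-1"]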

View File

@@ -457,23 +457,29 @@
"ap-east-1",
"ap-northeast-1",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"eu-west-1",
"eu-west-2",
"me-south-1",
"us-east-2",
"us-west-2",
"af-south-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-southeast-4",
"ca-central-1",
"eu-south-1",
"eu-central-2",
"me-central-1",
"sa-east-1",
"us-east-1",
"us-west-1",
"us-west-2",
"af-south-1",
"ap-southeast-2",
"ap-southeast-3",
"eu-central-1",
"eu-north-1",
"eu-south-1",
"eu-south-2",
"us-east-1",
"eu-west-3"
],
"aws-cn": [
@@ -481,8 +487,8 @@
"cn-northwest-1"
],
"aws-us-gov": [
"us-gov-west-1",
"us-gov-east-1"
"us-gov-east-1",
"us-gov-west-1"
]
}
},
@@ -646,6 +652,9 @@
"regions": {
"aws": [
"ap-northeast-1",
"ap-southeast-1",
"ap-southeast-2",
"eu-central-1",
"eu-west-1",
"us-east-1",
"us-east-2",
@@ -733,15 +742,17 @@
"regions": {
"aws": [
"ap-northeast-1",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"eu-central-1",
"eu-north-1",
"eu-west-1",
"us-east-1",
"us-east-2",
"us-west-2"
"us-west-2",
"ap-southeast-3",
"eu-west-2"
],
"aws-cn": [],
"aws-us-gov": []
@@ -1630,6 +1641,13 @@
]
}
},
"cloudtrail-data": {
"regions": {
"aws": [],
"aws-cn": [],
"aws-us-gov": []
}
},
"cloudwatch": {
"regions": {
"aws": [
@@ -1751,16 +1769,19 @@
"us-east-1",
"us-east-2",
"ap-east-1",
"ap-northeast-2",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"eu-central-1",
"eu-north-1",
"eu-south-1",
"me-central-1",
"sa-east-1",
"us-west-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"eu-north-1",
"us-west-2"
],
"aws-cn": [
@@ -2196,6 +2217,11 @@
"connectcases": {
"regions": {
"aws": [
"ap-southeast-1",
"ap-southeast-2",
"ca-central-1",
"eu-central-1",
"eu-west-2",
"us-east-1",
"us-west-2"
],
@@ -2572,17 +2598,17 @@
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-3",
"ap-southeast-1",
"ca-central-1",
"eu-south-2",
"eu-west-1",
"eu-west-3",
"us-east-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-4",
"eu-west-2",
"me-south-1",
"eu-central-2",
"sa-east-1",
"us-east-2",
"us-west-2",
@@ -2591,7 +2617,9 @@
"eu-central-1",
"eu-north-1",
"eu-south-1",
"eu-west-2",
"me-central-1",
"me-south-1",
"us-west-1"
],
"aws-cn": [
@@ -3003,6 +3031,7 @@
"regions": {
"aws": [
"ap-south-1",
"ap-south-2",
"ca-central-1",
"eu-west-1",
"eu-west-2",
@@ -3012,7 +3041,6 @@
"af-south-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
"eu-central-2",
"eu-south-1",
"eu-south-2",
@@ -3020,6 +3048,7 @@
"me-central-1",
"sa-east-1",
"ap-east-1",
"ap-northeast-3",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
@@ -3029,8 +3058,8 @@
"us-west-2"
],
"aws-cn": [
"cn-north-1",
"cn-northwest-1"
"cn-northwest-1",
"cn-north-1"
],
"aws-us-gov": [
"us-gov-east-1",
@@ -3060,13 +3089,14 @@
"us-west-2",
"ap-northeast-1",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-4",
"ca-central-1",
"eu-central-2",
"eu-west-3",
"me-central-1",
"us-east-1"
"us-east-1",
"ap-south-2"
],
"aws-cn": [
"cn-northwest-1",
@@ -3319,20 +3349,23 @@
"sa-east-1",
"us-east-1",
"us-east-2",
"us-west-2",
"ap-northeast-1",
"ap-northeast-2",
"ap-southeast-2",
"ca-central-1",
"eu-central-1",
"eu-west-2",
"us-west-1"
"us-west-1",
"us-west-2"
],
"aws-cn": [
"cn-northwest-1",
"cn-north-1"
],
"aws-us-gov": []
"aws-us-gov": [
"us-gov-east-1",
"us-gov-west-1"
]
}
},
"emr-serverless": {
@@ -3595,9 +3628,9 @@
"ap-southeast-3",
"eu-central-2",
"eu-west-3",
"af-south-1",
"ap-east-1",
"ap-southeast-2",
"ap-southeast-4",
"eu-central-1",
"eu-north-1",
"eu-west-1",
@@ -3605,14 +3638,15 @@
"me-central-1",
"us-east-2",
"us-west-1",
"af-south-1",
"ap-southeast-1",
"ca-central-1",
"eu-south-1",
"eu-south-2",
"me-south-1",
"sa-east-1",
"us-east-1",
"us-west-2"
"us-west-2",
"sa-east-1"
],
"aws-cn": [
"cn-northwest-1",
@@ -3864,9 +3898,10 @@
"eu-south-1",
"eu-west-2",
"eu-west-3",
"sa-east-1",
"me-central-1",
"us-east-1",
"eu-north-1"
"eu-north-1",
"sa-east-1"
],
"aws-cn": [],
"aws-us-gov": [
@@ -4155,12 +4190,12 @@
"us-west-2",
"ap-east-1",
"ap-northeast-1",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ca-central-1",
"eu-south-1",
"eu-west-1",
"me-central-1",
"me-south-1",
"us-east-2",
"ap-northeast-2",
@@ -4168,6 +4203,7 @@
"eu-central-2",
"eu-north-1",
"eu-west-3",
"me-central-1",
"us-west-1"
],
"aws-cn": [
@@ -4414,6 +4450,34 @@
]
}
},
"internetmonitor": {
"regions": {
"aws": [
"af-south-1",
"ap-northeast-1",
"ap-south-1",
"ap-southeast-1",
"eu-central-1",
"eu-south-1",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"sa-east-1",
"ap-east-1",
"ap-northeast-2",
"ap-southeast-2",
"ca-central-1",
"eu-north-1",
"me-south-1",
"us-east-1",
"us-east-2",
"us-west-1",
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
}
},
"iot": {
"regions": {
"aws": [
@@ -4753,7 +4817,9 @@
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
"aws-us-gov": [
"us-gov-west-1"
]
}
},
"iotwireless": {
@@ -4809,7 +4875,7 @@
"ap-southeast-2",
"ap-southeast-3",
"eu-west-3",
"us-east-1",
"me-central-1",
"us-west-2",
"ap-east-1",
"ap-northeast-1",
@@ -4819,6 +4885,7 @@
"eu-south-1",
"eu-west-2",
"me-south-1",
"us-east-1",
"us-east-2",
"ap-southeast-1",
"ca-central-1",
@@ -4883,6 +4950,8 @@
"kendra-ranking": {
"regions": {
"aws": [
"ap-northeast-1",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ca-central-1",
@@ -4953,14 +5022,15 @@
"eu-north-1",
"eu-south-1",
"eu-west-1",
"eu-west-3",
"me-central-1",
"us-east-1",
"us-west-1",
"af-south-1",
"ap-northeast-3",
"ap-south-1",
"eu-central-1",
"eu-west-2"
"eu-west-2",
"eu-west-3"
],
"aws-cn": [
"cn-north-1",
@@ -5043,8 +5113,8 @@
"regions": {
"aws": [
"af-south-1",
"ap-northeast-2",
"ap-southeast-1",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-north-1",
@@ -5053,14 +5123,15 @@
"us-west-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"me-south-1",
"us-west-2",
"ap-south-1",
"ap-southeast-2",
"eu-west-2",
"sa-east-1",
"us-east-2"
],
@@ -5115,6 +5186,38 @@
]
}
},
"launchwizard": {
"regions": {
"aws": [
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-3",
"eu-central-1",
"eu-north-1",
"eu-west-2",
"us-east-2",
"us-west-1",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"eu-south-1",
"eu-west-1",
"eu-west-3",
"me-south-1",
"sa-east-1",
"us-west-2",
"ap-northeast-2",
"ca-central-1",
"us-east-1"
],
"aws-cn": [],
"aws-us-gov": [
"us-gov-west-1",
"us-gov-east-1"
]
}
},
"lex-models": {
"regions": {
"aws": [
@@ -5909,6 +6012,7 @@
"sa-east-1",
"us-east-1",
"eu-central-1",
"me-central-1",
"us-west-2"
],
"aws-cn": [],
@@ -6096,9 +6200,10 @@
"eu-south-1",
"eu-west-1",
"eu-west-2",
"us-east-1",
"me-central-1",
"eu-central-1",
"sa-east-1",
"us-east-1",
"us-west-2"
],
"aws-cn": [],
@@ -6175,7 +6280,6 @@
"eu-central-2",
"eu-south-1",
"me-south-1",
"sa-east-1",
"us-east-2",
"ap-northeast-1",
"ap-south-2",
@@ -6183,9 +6287,13 @@
"ap-southeast-4",
"eu-north-1",
"eu-west-3",
"me-central-1"
"me-central-1",
"sa-east-1"
],
"aws-cn": [
"cn-north-1",
"cn-northwest-1"
],
"aws-cn": [],
"aws-us-gov": []
}
},
@@ -6823,13 +6931,14 @@
"us-west-1",
"ap-east-1",
"ap-northeast-1",
"ap-southeast-4",
"ca-central-1",
"eu-central-1",
"eu-west-2",
"me-central-1",
"me-south-1",
"sa-east-1",
"us-west-2"
"us-west-2",
"eu-central-1"
],
"aws-cn": [
"cn-north-1",
@@ -7010,13 +7119,17 @@
"ap-east-1",
"ap-southeast-2",
"ca-central-1",
"eu-central-1",
"eu-north-1",
"eu-west-3",
"me-central-1",
"sa-east-1",
"us-west-1",
"ap-northeast-1",
"ap-south-1",
"ap-south-2",
"eu-central-1",
"eu-central-2",
"eu-south-2",
"eu-west-2",
"us-east-1",
"us-east-2"
@@ -7215,24 +7328,27 @@
"ap-south-1",
"ap-southeast-3",
"ca-central-1",
"eu-west-2",
"me-south-1",
"us-east-1",
"us-east-2",
"ap-northeast-1",
"ap-northeast-2",
"ap-southeast-1",
"ap-southeast-2",
"eu-central-1",
"eu-north-1",
"eu-south-1",
"eu-west-1",
"eu-west-3",
"eu-west-2",
"us-west-1",
"ap-southeast-2",
"eu-west-3",
"sa-east-1",
"us-west-2"
],
"aws-cn": [],
"aws-cn": [
"cn-northwest-1",
"cn-north-1"
],
"aws-us-gov": []
}
},
@@ -7304,6 +7420,15 @@
"aws-us-gov": []
}
},
"route53-recovery-readiness": {
"regions": {
"aws": [
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
}
},
"route53domains": {
"regions": {
"aws": [
@@ -7326,7 +7451,7 @@
"us-east-2",
"ap-southeast-3",
"eu-central-1",
"eu-west-2",
"eu-central-2",
"eu-west-3",
"me-central-1",
"me-south-1",
@@ -7338,6 +7463,7 @@
"ca-central-1",
"eu-north-1",
"eu-west-1",
"eu-west-2",
"us-west-1"
],
"aws-cn": [
@@ -7485,9 +7611,9 @@
"us-west-1",
"us-west-2",
"af-south-1",
"ap-northeast-2",
"ap-southeast-3",
"eu-central-1",
"eu-central-2",
"eu-south-1",
"eu-west-1",
"eu-west-3",
@@ -7495,6 +7621,7 @@
"me-south-1",
"us-east-2",
"ap-east-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-1",
"ca-central-1",
@@ -7553,9 +7680,40 @@
},
"sagemaker-metrics": {
"regions": {
"aws": [],
"aws-cn": [],
"aws-us-gov": []
"aws": [
"af-south-1",
"ap-east-1",
"ap-northeast-3",
"ap-southeast-1",
"ap-southeast-2",
"eu-south-2",
"sa-east-1",
"us-west-1",
"ap-northeast-1",
"ap-south-1",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-central-2",
"eu-west-1",
"me-south-1",
"us-east-1",
"ap-northeast-2",
"eu-north-1",
"eu-south-1",
"eu-west-2",
"me-central-1",
"us-east-2",
"us-west-2"
],
"aws-cn": [
"cn-north-1",
"cn-northwest-1"
],
"aws-us-gov": [
"us-gov-west-1",
"us-gov-east-1"
]
}
},
"sagemaker-runtime": {
@@ -7571,15 +7729,17 @@
"us-east-2",
"af-south-1",
"ap-northeast-1",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-3",
"ca-central-1",
"eu-south-2",
"eu-west-2",
"me-central-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-2",
"eu-central-2",
"eu-north-1",
"sa-east-1",
"us-west-1",
@@ -7727,26 +7887,29 @@
"aws": [
"ap-east-1",
"ap-northeast-2",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ca-central-1",
"eu-north-1",
"eu-west-2",
"us-east-2",
"us-west-2",
"af-south-1",
"ap-northeast-3",
"ap-southeast-3",
"eu-north-1",
"eu-south-1",
"eu-west-1",
"eu-west-3",
"me-central-1",
"me-south-1",
"sa-east-1",
"us-east-1",
"ap-northeast-1",
"ap-south-1",
"eu-central-1",
"eu-central-2",
"eu-south-2",
"me-south-1",
"us-west-1"
],
"aws-cn": [
@@ -7763,9 +7926,12 @@
"regions": {
"aws": [
"ap-northeast-1",
"ap-southeast-1",
"ap-southeast-2",
"eu-central-1",
"eu-west-1",
"eu-west-2",
"sa-east-1",
"us-east-1",
"us-east-2",
"us-west-2"
@@ -7856,16 +8022,20 @@
"af-south-1",
"ap-east-1",
"ap-northeast-3",
"ap-southeast-1",
"ap-south-2",
"ap-southeast-2",
"ap-southeast-3",
"eu-central-1",
"me-central-1",
"me-south-1",
"sa-east-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-1",
"eu-central-1",
"eu-central-2",
"eu-south-1",
"eu-south-2",
"eu-west-1",
"us-west-1"
],
@@ -7919,9 +8089,9 @@
"ap-northeast-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-4",
"eu-south-1",
"eu-south-2",
"eu-west-2",
"eu-west-3",
"us-west-1",
"af-south-1",
@@ -7929,7 +8099,7 @@
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-north-1",
"eu-west-2",
"us-east-2",
"us-west-2",
"ap-east-1",
@@ -7937,11 +8107,12 @@
"ap-south-1",
"ap-south-2",
"eu-central-2",
"eu-north-1",
"eu-west-1",
"me-central-1",
"me-south-1",
"sa-east-1",
"us-east-1"
"us-east-1",
"me-central-1"
],
"aws-cn": [
"cn-north-1",
@@ -8494,22 +8665,24 @@
"me-central-1",
"us-west-2",
"af-south-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-2",
"ap-southeast-1",
"ca-central-1",
"eu-central-2",
"eu-west-2",
"us-west-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-2",
"ap-southeast-4",
"eu-central-1",
"eu-north-1",
"eu-south-1",
"me-south-1",
"sa-east-1",
"us-east-1",
"us-east-2"
"us-east-2",
"eu-south-1"
],
"aws-cn": [
"cn-northwest-1",
@@ -8829,8 +9002,7 @@
"eu-west-2",
"me-south-1",
"us-east-1",
"us-east-2",
"us-west-2"
"us-east-2"
],
"aws-cn": [],
"aws-us-gov": []
@@ -8851,14 +9023,15 @@
"us-west-1",
"ap-northeast-2",
"ap-northeast-3",
"eu-north-1",
"eu-west-2",
"eu-west-3",
"me-central-1",
"me-south-1",
"us-east-1",
"us-west-2",
"ap-northeast-1",
"ca-central-1",
"eu-north-1",
"eu-south-1",
"eu-west-1"
],
@@ -8889,12 +9062,13 @@
"ap-south-1",
"ap-southeast-1",
"ca-central-1",
"eu-north-1",
"eu-central-2",
"eu-west-1",
"us-east-2",
"us-west-1",
"ap-southeast-2",
"ap-southeast-3",
"eu-north-1",
"eu-south-1",
"me-south-1",
"us-west-2"
@@ -8985,19 +9159,20 @@
"eu-central-1",
"eu-south-1",
"eu-west-2",
"me-south-1",
"us-east-1",
"us-east-2",
"us-west-2",
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-2",
"eu-north-1",
"eu-west-1",
"eu-west-3",
"sa-east-1",
"us-west-2",
"ap-southeast-2",
"us-west-1"
],
"aws-cn": [],

View File

@@ -1,7 +1,9 @@
import csv
import json
from copy import deepcopy
from alive_progress import alive_bar
from botocore.client import ClientError
from colorama import Fore, Style
from tabulate import tabulate
@@ -16,10 +18,10 @@ from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info
def quick_inventory(audit_info: AWS_Audit_Info, output_directory: str):
print(
f"-=- Running Quick Inventory for AWS Account {Fore.YELLOW}{audit_info.audited_account}{Style.RESET_ALL} -=-\n"
)
resources = []
global_resources = []
total_resources_per_region = {}
iam_was_scanned = False
# If no regions were given as input, check all of them
if not audit_info.audited_regions:
# EC2 client for describing all regions
@@ -40,43 +42,20 @@ def quick_inventory(audit_info: AWS_Audit_Info, output_directory: str):
enrich_print=False,
) as bar:
for region in sorted(audit_info.audited_regions):
bar.title = f"-> Scanning {orange_color}{region}{Style.RESET_ALL} region"
bar.title = f"Inventorying AWS Account {orange_color}{audit_info.audited_account}{Style.RESET_ALL}"
resources_in_region = []
# {
# eu-west-1: 100,...
# }
try:
# If us-east-1 get IAM resources from there otherwise see if it is US GovCloud or China
iam_client = audit_info.audit_session.client("iam")
if (
region == "us-east-1"
or region == "us-gov-west-1"
or region == "cn-north-1"
):
# Scan IAM only once
if not iam_was_scanned:
global_resources.extend(get_iam_resources(audit_info.audit_session))
iam_was_scanned = True
get_roles_paginator = iam_client.get_paginator("list_roles")
for page in get_roles_paginator.paginate():
for role in page["Roles"]:
# Avoid aws-service-role roles
if "aws-service-role" not in role["Arn"]:
resources_in_region.append(role["Arn"])
get_users_paginator = iam_client.get_paginator("list_users")
for page in get_users_paginator.paginate():
for user in page["Users"]:
resources_in_region.append(user["Arn"])
get_groups_paginator = iam_client.get_paginator("list_groups")
for page in get_groups_paginator.paginate():
for group in page["Groups"]:
resources_in_region.append(group["Arn"])
get_policies_paginator = iam_client.get_paginator("list_policies")
for page in get_policies_paginator.paginate(Scope="Local"):
for policy in page["Policies"]:
resources_in_region.append(policy["Arn"])
for saml_provider in iam_client.list_saml_providers()[
"SAMLProviderList"
]:
resources_in_region.append(saml_provider["Arn"])
# Get regional S3 buckets, since untagged buckets are not returned by the resourcegroupstaggingapi
resources_in_region.extend(get_regional_buckets(audit_info, region))
client = audit_info.audit_session.client(
"resourcegroupstaggingapi", region_name=region
@@ -87,12 +66,27 @@ def quick_inventory(audit_info: AWS_Audit_Info, output_directory: str):
for page in get_resources_paginator.paginate():
resources_count += len(page["ResourceTagMappingList"])
for resource in page["ResourceTagMappingList"]:
resources_in_region.append(resource["ResourceARN"])
# Avoid adding S3 buckets again:
if resource["ResourceARN"].split(":")[2] != "s3":
# Check if region is not in ARN --> Global service
if not resource["ResourceARN"].split(":")[3]:
global_resources.append(
{
"arn": resource["ResourceARN"],
"tags": resource["Tags"],
}
)
else:
resources_in_region.append(
{
"arn": resource["ResourceARN"],
"tags": resource["Tags"],
}
)
bar()
print(
f"Found {Fore.GREEN}{len(resources_in_region)}{Style.RESET_ALL} resources in region {Fore.YELLOW}{region}{Style.RESET_ALL}"
)
print("\n")
if len(resources_in_region) > 0:
total_resources_per_region[region] = len(resources_in_region)
bar.text = f"-> Found {Fore.GREEN}{len(resources_in_region)}{Style.RESET_ALL} resources in {region}"
except Exception as error:
logger.error(
@@ -103,21 +97,26 @@ def quick_inventory(audit_info: AWS_Audit_Info, output_directory: str):
resources.extend(resources_in_region)
bar.title = f"-> {Fore.GREEN}Quick Inventory completed!{Style.RESET_ALL}"
inventory_table = create_inventory_table(resources)
resources.extend(global_resources)
total_resources_per_region["global"] = len(global_resources)
inventory_table = create_inventory_table(resources, total_resources_per_region)
print(
f"\nQuick Inventory of AWS Account {Fore.YELLOW}{audit_info.audited_account}{Style.RESET_ALL}:"
)
print(tabulate(inventory_table, headers="keys", tablefmt="rounded_grid"))
print(
tabulate(
inventory_table, headers="keys", tablefmt="rounded_grid", stralign="left"
)
)
print(f"\nTotal resources found: {Fore.GREEN}{len(resources)}{Style.RESET_ALL}")
create_output(resources, audit_info, output_directory)
def create_inventory_table(resources: list) -> dict:
def create_inventory_table(resources: list, resources_in_region: dict) -> dict:
regions_with_resources = list(resources_in_region.keys())
services = {}
# { "S3":
# 123,
@@ -126,13 +125,30 @@ def create_inventory_table(resources: list) -> dict:
# }
resources_type = {}
# { "S3":
# "Buckets": 13,
# "Buckets":
# eu-west-1: 10,
# eu-west-2: 3,
# "IAM":
# "Roles": 143,
# "Users": 22,
# "Roles":
# us-east-1: 143,
# "Users":
# us-west-2: 22,
# }
for resource in sorted(resources):
service = resource.split(":")[2]
inventory_table = {
"Service": [],
f"Total\n({Fore.GREEN}{str(len(resources))}{Style.RESET_ALL})": [],
"Total per\nresource type": [],
}
for region, count in resources_in_region.items():
inventory_table[f"{region}\n({Fore.GREEN}{str(count)}{Style.RESET_ALL})"] = []
for resource in sorted(resources, key=lambda d: d["arn"]):
service = resource["arn"].split(":")[2]
region = resource["arn"].split(":")[3]
if not region:
region = "global"
if service not in services:
services[service] = 0
services[service] += 1
@@ -143,62 +159,98 @@ def create_inventory_table(resources: list) -> dict:
resource_type = "topic"
elif service == "sqs":
resource_type = "queue"
elif service == "apigateway":
split_parts = resource["arn"].split(":")[5].split("/")
if "integration" in split_parts and "responses" in split_parts:
resource_type = "restapis-resources-methods-integration-response"
elif "documentation" in split_parts and "parts" in split_parts:
resource_type = "restapis-documentation-parts"
else:
resource_type = resource["arn"].split(":")[5].split("/")[1]
else:
resource_type = resource.split(":")[5].split("/")[0]
resource_type = resource["arn"].split(":")[5].split("/")[0]
if service not in resources_type:
resources_type[service] = {}
if resource_type not in resources_type[service]:
resources_type[service][resource_type] = 0
resources_type[service][resource_type] += 1
resources_type[service][resource_type] = {}
if region not in resources_type[service][resource_type]:
resources_type[service][resource_type][region] = 0
resources_type[service][resource_type][region] += 1
# Add results to inventory table
inventory_table = {
"Service": [],
"Total": [],
"Count per resource types": [],
}
for service in services:
pending_regions = deepcopy(regions_with_resources)
aux = {}
# {
# "region": summary,
# }
summary = ""
inventory_table["Service"].append(f"{service}")
inventory_table["Total"].append(
f"{Fore.GREEN}{services[service]}{Style.RESET_ALL}"
)
for resource_type in resources_type[service]:
summary += f"{resource_type} {Fore.GREEN}{resources_type[service][resource_type]}{Style.RESET_ALL}\n"
inventory_table["Count per resource types"].append(summary)
inventory_table[
f"Total\n({Fore.GREEN}{str(len(resources))}{Style.RESET_ALL})"
].append(f"{Fore.GREEN}{services[service]}{Style.RESET_ALL}")
for resource_type, regions in resources_type[service].items():
summary += f"{resource_type} {Fore.GREEN}{str(sum(regions.values()))}{Style.RESET_ALL}\n"
# Check if region does not have resource type
for region in pending_regions:
if region not in aux:
aux[region] = ""
if region not in regions:
aux[region] += "-\n"
for region, count in regions.items():
aux[region] += f"{Fore.GREEN}{str(count)}{Style.RESET_ALL}\n"
# Add Total per resource type
inventory_table["Total per\nresource type"].append(summary)
# Add Total per region
for region, text in aux.items():
inventory_table[
f"{region}\n({Fore.GREEN}{str(resources_in_region[region])}{Style.RESET_ALL})"
].append(text)
if region in pending_regions:
pending_regions.remove(region)
for region_without_resource in pending_regions:
inventory_table[
f"{region_without_resource}\n ({Fore.GREEN}{str(resources_in_region[region_without_resource])}{Style.RESET_ALL})"
].append("-")
return inventory_table
def create_output(resources: list, audit_info: AWS_Audit_Info, output_directory: str):
json_output = []
output_file = f"{output_directory}/prowler-inventory-{audit_info.audited_account}-{output_file_timestamp}"
for item in sorted(resources):
for item in sorted(resources, key=lambda d: d["arn"]):
resource = {}
resource["AWS_AccountID"] = audit_info.audited_account
resource["AWS_Region"] = item.split(":")[3]
resource["AWS_Partition"] = item.split(":")[1]
resource["AWS_Service"] = item.split(":")[2]
resource["AWS_ResourceType"] = item.split(":")[5].split("/")[0]
resource["AWS_Region"] = item["arn"].split(":")[3]
resource["AWS_Partition"] = item["arn"].split(":")[1]
resource["AWS_Service"] = item["arn"].split(":")[2]
resource["AWS_ResourceType"] = item["arn"].split(":")[5].split("/")[0]
resource["AWS_ResourceID"] = ""
if len(item.split("/")) > 1:
resource["AWS_ResourceID"] = item.split("/")[-1]
elif len(item.split(":")) > 6:
resource["AWS_ResourceID"] = item.split(":")[-1]
resource["AWS_ResourceARN"] = item
if len(item["arn"].split("/")) > 1:
resource["AWS_ResourceID"] = item["arn"].split("/")[-1]
elif len(item["arn"].split(":")) > 6:
resource["AWS_ResourceID"] = item["arn"].split(":")[-1]
resource["AWS_ResourceARN"] = item["arn"]
# Cover S3 case
if resource["AWS_Service"] == "s3":
resource["AWS_ResourceType"] = "bucket"
resource["AWS_ResourceID"] = item.split(":")[-1]
resource["AWS_ResourceID"] = item["arn"].split(":")[-1]
# Cover WAFv2 case
if resource["AWS_Service"] == "wafv2":
resource["AWS_ResourceType"] = "/".join(item.split(":")[-1].split("/")[:-2])
resource["AWS_ResourceID"] = "/".join(item.split(":")[-1].split("/")[2:])
resource["AWS_ResourceType"] = "/".join(
item["arn"].split(":")[-1].split("/")[:-2]
)
resource["AWS_ResourceID"] = "/".join(
item["arn"].split(":")[-1].split("/")[2:]
)
# Cover Config case
if resource["AWS_Service"] == "config":
resource["AWS_ResourceID"] = "/".join(item.split(":")[-1].split("/")[1:])
resource["AWS_ResourceID"] = "/".join(
item["arn"].split(":")[-1].split("/")[1:]
)
resource["AWS_Tags"] = item["tags"]
json_output.append(resource)
# Serializing json
@@ -224,3 +276,64 @@ def create_output(resources: list, audit_info: AWS_Audit_Info, output_directory:
print("\nMore details in files:")
print(f" - CSV: {output_file+csv_file_suffix}")
print(f" - JSON: {output_file+json_file_suffix}")
def get_regional_buckets(audit_info: AWS_Audit_Info, region: str) -> list:
regional_buckets = []
s3_client = audit_info.audit_session.client("s3", region_name=region)
buckets = s3_client.list_buckets()
for bucket in buckets["Buckets"]:
bucket_region = s3_client.get_bucket_location(Bucket=bucket["Name"])[
"LocationConstraint"
]
if bucket_region == "EU": # If EU, bucket_region is eu-west-1
bucket_region = "eu-west-1"
if not bucket_region: # If None, bucket_region is us-east-1
bucket_region = "us-east-1"
if bucket_region == region: # Only add the bucket if it is in the current region
try:
bucket_tags = s3_client.get_bucket_tagging(Bucket=bucket["Name"])[
"TagSet"
]
except ClientError as error:
bucket_tags = []
if error.response["Error"]["Code"] != "NoSuchTagSet":
logger.error(
f"{region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
bucket_arn = (
f"arn:{audit_info.audited_partition}:s3:{region}::{bucket['Name']}"
)
regional_buckets.append({"arn": bucket_arn, "tags": bucket_tags})
return regional_buckets
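The LocationConstraint quirks handled above, extracted into a minimal standalone helper (illustrative only, not part of the diff):

def normalize_bucket_region(location_constraint):
    # Legacy "EU" means eu-west-1; None (no constraint) means us-east-1
    if location_constraint == "EU":
        return "eu-west-1"
    return location_constraint or "us-east-1"

assert normalize_bucket_region(None) == "us-east-1"
assert normalize_bucket_region("EU") == "eu-west-1"
assert normalize_bucket_region("ap-south-1") == "ap-south-1"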
def get_iam_resources(session) -> list:
iam_resources = []
iam_client = session.client("iam")
get_roles_paginator = iam_client.get_paginator("list_roles")
for page in get_roles_paginator.paginate():
for role in page["Roles"]:
# Avoid aws-service-role roles
if "aws-service-role" not in role["Arn"]:
iam_resources.append({"arn": role["Arn"], "tags": role.get("Tags")})
get_users_paginator = iam_client.get_paginator("list_users")
for page in get_users_paginator.paginate():
for user in page["Users"]:
iam_resources.append({"arn": user["Arn"], "tags": user.get("Tags")})
get_groups_paginator = iam_client.get_paginator("list_groups")
for page in get_groups_paginator.paginate():
for group in page["Groups"]:
iam_resources.append({"arn": group["Arn"], "tags": []})
get_policies_paginator = iam_client.get_paginator("list_policies")
for page in get_policies_paginator.paginate(Scope="Local"):
for policy in page["Policies"]:
iam_resources.append({"arn": policy["Arn"], "tags": policy.get("Tags")})
for saml_provider in iam_client.list_saml_providers()["SAMLProviderList"]:
iam_resources.append({"arn": saml_provider["Arn"], "tags": []})
return iam_resources

View File

@@ -0,0 +1,38 @@
import sys
from prowler.lib.logger import logger
from prowler.providers.aws.aws_provider import generate_regional_clients
from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info
def get_tagged_resources(input_resource_tags: list, current_audit_info: AWS_Audit_Info):
"""
get_tagged_resources returns a list of the resources that are going to be scanned based on the given input tags
"""
try:
resource_tags = []
tagged_resources = []
for tag in input_resource_tags:
key = tag.split("=")[0]
value = tag.split("=")[1]
resource_tags.append({"Key": key, "Values": [value]})
# Get Resources with resource_tags for all regions
for regional_client in generate_regional_clients(
"resourcegroupstaggingapi", current_audit_info
).values():
try:
get_resources_paginator = regional_client.get_paginator("get_resources")
for page in get_resources_paginator.paginate(TagFilters=resource_tags):
for resource in page["ResourceTagMappingList"]:
tagged_resources.append(resource["ResourceARN"])
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.critical(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
sys.exit(1)
else:
return tagged_resources
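Illustrative sketch (hypothetical values) of the TagFilters built from the key=value input pairs:

input_resource_tags = ["env=prod", "owner=security"]
resource_tags = [
    {"Key": tag.split("=")[0], "Values": [tag.split("=")[1]]}
    for tag in input_resource_tags
]
# [{'Key': 'env', 'Values': ['prod']}, {'Key': 'owner', 'Values': ['security']}]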

View File

@@ -26,10 +26,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -17,6 +17,7 @@ class accessanalyzer_enabled(Check):
)
report.resource_id = analyzer.name
report.resource_arn = analyzer.arn
report.resource_tags = analyzer.tags
elif analyzer.status == "NOT_AVAILABLE":
report.status = "FAIL"
@@ -31,6 +32,7 @@ class accessanalyzer_enabled(Check):
)
report.resource_id = analyzer.name
report.resource_arn = analyzer.arn
report.resource_tags = analyzer.tags
findings.append(report)
return findings

View File

@@ -26,10 +26,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -17,6 +17,7 @@ class accessanalyzer_enabled_without_findings(Check):
)
report.resource_id = analyzer.name
report.resource_arn = analyzer.arn
report.resource_tags = analyzer.tags
if len(analyzer.findings) != 0:
active_finding_counter = 0
for finding in analyzer.findings:
@@ -28,6 +29,7 @@ class accessanalyzer_enabled_without_findings(Check):
report.status_extended = f"IAM Access Analyzer {analyzer.name} has {active_finding_counter} active findings"
report.resource_id = analyzer.name
report.resource_arn = analyzer.arn
report.resource_tags = analyzer.tags
elif analyzer.status == "NOT_AVAILABLE":
report.status = "FAIL"
report.status_extended = (
@@ -41,6 +43,7 @@ class accessanalyzer_enabled_without_findings(Check):
)
report.resource_id = analyzer.name
report.resource_arn = analyzer.arn
report.resource_tags = analyzer.tags
findings.append(report)
return findings

View File

@@ -1,4 +1,5 @@
import threading
from typing import Optional
from pydantic import BaseModel
@@ -48,7 +49,7 @@ class AccessAnalyzer:
arn=analyzer["arn"],
name=analyzer["name"],
status=analyzer["status"],
tags=str(analyzer["tags"]),
tags=[analyzer.get("tags")],
type=analyzer["type"],
region=regional_client.region,
)
@@ -60,7 +61,7 @@ class AccessAnalyzer:
arn="",
name=self.audited_account,
status="NOT_AVAILABLE",
tags="",
tags=[],
type="",
region=regional_client.region,
)
@@ -119,6 +120,6 @@ class Analyzer(BaseModel):
name: str
status: str
findings: list[Finding] = []
tags: str
tags: Optional[list] = []
type: str
region: str

View File

@@ -26,10 +26,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -26,10 +26,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -26,10 +26,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -26,10 +26,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -15,11 +15,13 @@ class acm_certificates_expiration_check(Check):
report.status_extended = f"ACM Certificate for {certificate.name} expires in {certificate.expiration_days} days."
report.resource_id = certificate.name
report.resource_arn = certificate.arn
report.resource_tags = certificate.tags
else:
report.status = "FAIL"
report.status_extended = f"ACM Certificate for {certificate.name} is about to expire in {DAYS_TO_EXPIRE_THRESHOLD} days."
report.resource_id = certificate.name
report.resource_arn = certificate.arn
report.resource_tags = certificate.tags
findings.append(report)
return findings

View File

@@ -26,10 +26,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -15,16 +15,19 @@ class acm_certificates_transparency_logs_enabled(Check):
)
report.resource_id = certificate.name
report.resource_arn = certificate.arn
report.resource_tags = certificate.tags
else:
if not certificate.transparency_logging:
report.status = "FAIL"
report.status_extended = f"ACM Certificate for {certificate.name} has Certificate Transparency logging disabled."
report.resource_id = certificate.name
report.resource_arn = certificate.arn
report.resource_tags = certificate.tags
else:
report.status = "PASS"
report.status_extended = f"ACM Certificate for {certificate.name} has Certificate Transparency logging enabled."
report.resource_id = certificate.name
report.resource_arn = certificate.arn
report.resource_tags = certificate.tags
findings.append(report)
return findings

View File

@@ -1,7 +1,9 @@
import threading
from dataclasses import dataclass
from datetime import datetime
from typing import Optional
from pydantic import BaseModel
from prowler.config.config import timestamp_utc
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
from prowler.providers.aws.aws_provider import generate_regional_clients
@@ -18,6 +20,7 @@ class ACM:
self.certificates = []
self.__threading_call__(self.__list_certificates__)
self.__describe_certificates__()
self.__list_tags_for_certificate__()
def __get_session__(self):
return self.session
@@ -44,12 +47,26 @@ class ACM:
certificate["CertificateArn"], self.audit_resources
)
):
if "NotAfter" in certificate:
# We need to get the TZ info to be able to do the math
certificate_expiration_time = (
certificate["NotAfter"]
- datetime.now(
certificate["NotAfter"].tzinfo
if hasattr(certificate["NotAfter"], "tzinfo")
else None
)
).days
else:
certificate_expiration_time = 0
self.certificates.append(
Certificate(
certificate["CertificateArn"],
certificate["DomainName"],
False,
regional_client.region,
arn=certificate["CertificateArn"],
name=certificate["DomainName"],
type=certificate["Type"],
expiration_days=certificate_expiration_time,
transparency_logging=False,
region=regional_client.region,
)
)
except Exception as error:
@@ -65,13 +82,6 @@ class ACM:
response = regional_client.describe_certificate(
CertificateArn=certificate.arn
)["Certificate"]
certificate.type = response["Type"]
if "NotAfter" in response:
certificate.expiration_days = (
response["NotAfter"] - timestamp_utc
).days
else:
certificate.expiration_days = 0
if (
response["Options"]["CertificateTransparencyLoggingPreference"]
== "ENABLED"
@@ -82,24 +92,26 @@ class ACM:
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __list_tags_for_certificate__(self):
logger.info("ACM - List Tags...")
try:
for certificate in self.certificates:
regional_client = self.regional_clients[certificate.region]
response = regional_client.list_tags_for_certificate(
CertificateArn=certificate.arn
)["Tags"]
certificate.tags = response
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
@dataclass
class Certificate:
class Certificate(BaseModel):
arn: str
name: str
type: str
tags: Optional[list] = []
expiration_days: int
transparency_logging: bool
transparency_logging: Optional[bool]
region: str
def __init__(
self,
arn,
name,
transparency_logging,
region,
):
self.arn = arn
self.name = name
self.transparency_logging = transparency_logging
self.region = region

View File

@@ -26,10 +26,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -12,6 +12,7 @@ class apigateway_authorizers_enabled(Check):
report.region = rest_api.region
report.resource_id = rest_api.name
report.resource_arn = rest_api.arn
report.resource_tags = rest_api.tags
if rest_api.authorizer:
report.status = "PASS"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} has authorizer configured."

View File

@@ -26,10 +26,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -13,6 +13,7 @@ class apigateway_client_certificate_enabled(Check):
report.resource_id = rest_api.name
report.region = rest_api.region
report.resource_arn = stage.arn
report.resource_tags = stage.tags
if stage.client_certificate:
report.status = "PASS"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} in stage {stage.name} has client certificate enabled."

View File

@@ -28,10 +28,6 @@
"Categories": [
"internet-exposed"
],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -12,6 +12,7 @@ class apigateway_endpoint_public(Check):
report.region = rest_api.region
report.resource_id = rest_api.name
report.resource_arn = rest_api.arn
report.resource_tags = rest_api.tags
if rest_api.public_endpoint:
report.status = "FAIL"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} is internet accesible."

View File

@@ -28,10 +28,6 @@
"Categories": [
"forensics-ready"
],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -13,6 +13,7 @@ class apigateway_logging_enabled(Check):
report.region = rest_api.region
report.resource_id = rest_api.name
report.resource_arn = stage.arn
report.resource_tags = stage.tags
if stage.logging:
report.status = "PASS"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} in stage {stage.name} has logging enabled."

View File

@@ -1,5 +1,7 @@
import threading
from dataclasses import dataclass
from typing import Optional
from pydantic import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -45,10 +47,11 @@ class APIGateway:
):
self.rest_apis.append(
RestAPI(
apigw["id"],
arn,
regional_client.region,
apigw["name"],
id=apigw["id"],
arn=arn,
region=regional_client.region,
name=apigw["name"],
tags=[apigw.get("tags")],
)
)
except Exception as error:
@@ -100,61 +103,33 @@ class APIGateway:
arn = f"arn:{self.audited_partition}:apigateway:{regional_client.region}::/apis/{rest_api.id}/stages/{stage['stageName']}"
rest_api.stages.append(
Stage(
stage["stageName"],
arn,
logging,
client_certificate,
waf,
name=stage["stageName"],
arn=arn,
logging=logging,
client_certificate=client_certificate,
waf=waf,
tags=[stage.get("tags")],
)
)
except Exception as error:
logger.error(f"{error.__class__.__name__}: {error}")
@dataclass
class Stage:
class Stage(BaseModel):
name: str
arn: str
logging: bool
client_certificate: bool
waf: str
def __init__(
self,
name,
arn,
logging,
client_certificate,
waf,
):
self.name = name
self.arn = arn
self.logging = logging
self.client_certificate = client_certificate
self.waf = waf
waf: Optional[str]
tags: Optional[list] = []
@dataclass
class RestAPI:
class RestAPI(BaseModel):
id: str
arn: str
region: str
name: str
authorizer: bool
public_endpoint: bool
stages: list[Stage]
def __init__(
self,
id,
arn,
region,
name,
):
self.id = id
self.arn = arn
self.region = region
self.name = name
self.authorizer = False
self.public_endpoint = True
self.stages = []
authorizer: bool = False
public_endpoint: bool = True
stages: list[Stage] = []
tags: Optional[list] = []

View File

@@ -26,10 +26,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -13,6 +13,7 @@ class apigateway_waf_acl_attached(Check):
report.region = rest_api.region
report.resource_id = rest_api.name
report.resource_arn = stage.arn
report.resource_tags = stage.tags
if stage.waf:
report.status = "PASS"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} in stage {stage.name} has {stage.waf} WAF ACL attached."

View File

@@ -26,10 +26,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -15,10 +15,12 @@ class apigatewayv2_access_logging_enabled(Check):
report.status = "PASS"
report.status_extended = f"API Gateway V2 {api.name} ID {api.id} in stage {stage.name} has access logging enabled."
report.resource_id = api.name
report.resource_tags = api.tags
else:
report.status = "FAIL"
report.status_extended = f"API Gateway V2 {api.name} ID {api.id} in stage {stage.name} has access logging disabled."
report.resource_id = api.name
report.resource_tags = api.tags
findings.append(report)
return findings

View File

@@ -26,10 +26,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -16,10 +16,12 @@ class apigatewayv2_authorizers_enabled(Check):
f"API Gateway V2 {api.name} ID {api.id} has authorizer configured."
)
report.resource_id = api.name
report.resource_tags = api.tags
else:
report.status = "FAIL"
report.status_extended = f"API Gateway V2 {api.name} ID {api.id} has not authorizer configured."
report.resource_id = api.name
report.resource_tags = api.tags
findings.append(report)
return findings

View File

@@ -1,5 +1,7 @@
import threading
from dataclasses import dataclass
from typing import Optional
from pydantic import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -42,9 +44,10 @@ class ApiGatewayV2:
):
self.apis.append(
API(
apigw["ApiId"],
regional_client.region,
apigw["Name"],
id=apigw["ApiId"],
region=regional_client.region,
name=apigw["Name"],
tags=[apigw.get("Tags")],
)
)
except Exception as error:
@@ -77,8 +80,9 @@ class ApiGatewayV2:
logging = True
api.stages.append(
Stage(
stage["StageName"],
logging,
name=stage["StageName"],
logging=logging,
tags=[stage.get("Tags")],
)
)
except Exception as error:
@@ -87,36 +91,16 @@ class ApiGatewayV2:
)
@dataclass
class Stage:
class Stage(BaseModel):
name: str
logging: bool
def __init__(
self,
name,
logging,
):
self.name = name
self.logging = logging
tags: Optional[list] = []
@dataclass
class API:
class API(BaseModel):
id: str
region: str
name: str
authorizer: bool
stages: list[Stage]
def __init__(
self,
id,
region,
name,
):
self.id = id
self.region = region
self.name = name
self.authorizer = False
self.stages = []
authorizer: bool = False
stages: list[Stage] = []
tags: Optional[list] = []

View File

@@ -28,10 +28,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": "Infrastructure Security"

View File

@@ -14,6 +14,7 @@ class appstream_fleet_default_internet_access_disabled(Check):
report.region = fleet.region
report.resource_id = fleet.name
report.resource_arn = fleet.arn
report.resource_tags = fleet.tags
if fleet.enable_default_internet_access:
report.status = "FAIL"

View File

@@ -26,10 +26,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": "Infrastructure Security"

View File

@@ -17,6 +17,7 @@ class appstream_fleet_maximum_session_duration(Check):
report.region = fleet.region
report.resource_id = fleet.name
report.resource_arn = fleet.arn
report.resource_tags = fleet.tags
if fleet.max_user_duration_in_seconds < max_session_duration_seconds:
report.status = "PASS"

View File

@@ -28,10 +28,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": "Infrastructure Security"

View File

@@ -17,6 +17,7 @@ class appstream_fleet_session_disconnect_timeout(Check):
report.region = fleet.region
report.resource_id = fleet.name
report.resource_arn = fleet.arn
report.resource_tags = fleet.tags
if fleet.disconnect_timeout_in_seconds <= max_disconnect_timeout_in_seconds:
report.status = "PASS"

View File

@@ -28,10 +28,6 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": "Infrastructure Security"

View File

@@ -19,6 +19,7 @@ class appstream_fleet_session_idle_disconnect_timeout(Check):
report.region = fleet.region
report.resource_id = fleet.name
report.resource_arn = fleet.arn
report.resource_tags = fleet.tags
if (
fleet.idle_disconnect_timeout_in_seconds

View File

@@ -18,6 +18,7 @@ class AppStream:
self.regional_clients = generate_regional_clients(self.service, audit_info)
self.fleets = []
self.__threading_call__(self.__describe_fleets__)
self.__list_tags_for_resource__()
def __get_session__(self):
return self.session
@@ -65,6 +66,20 @@ class AppStream:
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __list_tags_for_resource__(self):
logger.info("AppStream - List Tags...")
try:
for fleet in self.fleets:
regional_client = self.regional_clients[fleet.region]
response = regional_client.list_tags_for_resource(
ResourceArn=fleet.arn
)["Tags"]
fleet.tags = [response]
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
class Fleet(BaseModel):
arn: str
@@ -74,3 +89,4 @@ class Fleet(BaseModel):
idle_disconnect_timeout_in_seconds: Optional[int]
enable_default_internet_access: bool
region: str
tags: Optional[list] = []

View File

@@ -28,10 +28,6 @@
"Categories": [
"secrets"
],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

View File

@@ -1,5 +1,6 @@
 import threading
-from dataclasses import dataclass
+
+from pydantic import BaseModel

 from prowler.lib.logger import logger
 from prowler.lib.scan_filters.scan_filters import is_resource_filtered

@@ -45,11 +46,11 @@ class AutoScaling:
                 ):
                     self.launch_configurations.append(
                         LaunchConfiguration(
-                            configuration["LaunchConfigurationARN"],
-                            configuration["LaunchConfigurationName"],
-                            configuration["UserData"],
-                            configuration["ImageId"],
-                            regional_client.region,
+                            arn=configuration["LaunchConfigurationARN"],
+                            name=configuration["LaunchConfigurationName"],
+                            user_data=configuration["UserData"],
+                            image_id=configuration["ImageId"],
+                            region=regional_client.region,
                         )
                     )

@@ -59,24 +60,9 @@ class AutoScaling:
             )

-@dataclass
-class LaunchConfiguration:
+class LaunchConfiguration(BaseModel):
     arn: str
     name: str
     user_data: str
-    image_id: int
+    image_id: str
     region: str
-
-    def __init__(
-        self,
-        arn,
-        name,
-        user_data,
-        image_id,
-        region,
-    ):
-        self.arn = arn
-        self.name = name
-        self.image_id = image_id
-        self.user_data = user_data
-        self.region = region
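
(A minimal standalone sketch of what the switch from a hand-rolled dataclass to pydantic's BaseModel buys: declared field types are enforced at construction, keyword arguments make each value's destination explicit, and missing fields fail loudly. All values below are placeholders.)

```python
from pydantic import BaseModel, ValidationError

class LaunchConfiguration(BaseModel):
    arn: str
    name: str
    user_data: str
    image_id: str  # the old dataclass mistyped this as int
    region: str

# Keyword construction, as in the updated service code.
lc = LaunchConfiguration(
    arn="arn:aws:autoscaling:eu-west-1:123456789012:launchConfiguration:example",
    name="example-lc",
    user_data="",
    image_id="ami-0123456789abcdef0",
    region="eu-west-1",
)
print(lc.image_id)

# Unlike the old positional __init__, omitting a field raises a clear error.
try:
    LaunchConfiguration(arn="arn:aws:autoscaling:::", name="example-lc")
except ValidationError as error:
    print(error)  # user_data, image_id and region reported as missing
```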

View File

@@ -26,10 +26,6 @@
   "Categories": [
     "forensics-ready"
   ],
-  "Tags": {
-    "Tag1Key": "value",
-    "Tag2Key": "value"
-  },
   "DependsOn": [],
   "RelatedTo": [],
   "Notes": ""

View File

@@ -13,6 +13,7 @@ class awslambda_function_invoke_api_operations_cloudtrail_logging_enabled(Check):
             report.region = function.region
             report.resource_id = function.name
             report.resource_arn = function.arn
+            report.resource_tags = function.tags
             report.status = "FAIL"
             report.status_extended = (
@@ -21,20 +22,31 @@ class awslambda_function_invoke_api_operations_cloudtrail_logging_enabled(Check):
             lambda_recorded_cloudtrail = False
             for trail in cloudtrail_client.trails:
                 for data_event in trail.data_events:
-                    if "DataResources" in data_event.event_selector:
-                        for resource in data_event.event_selector["DataResources"]:
+                    # classic event selectors
+                    if not data_event.is_advanced:
+                        if "DataResources" in data_event.event_selector:
+                            for resource in data_event.event_selector["DataResources"]:
+                                if resource["Type"] == "AWS::Lambda::Function" and (
+                                    function.arn in resource["Values"]
+                                    or "arn:aws:lambda" in resource["Values"]
+                                ):
+                                    lambda_recorded_cloudtrail = True
+                                    break
+                    elif data_event.is_advanced:
+                        for field_selector in data_event.event_selector[
+                            "FieldSelectors"
+                        ]:
                             if (
-                                resource["Type"] == "AWS::Lambda::Function"
-                                and function.arn in resource["Values"]
+                                field_selector["Field"] == "resources.type"
+                                and "AWS::Lambda::Function" in field_selector["Equals"]
                             ):
                                 lambda_recorded_cloudtrail = True
                                 break
                     if lambda_recorded_cloudtrail:
                         break
                 if lambda_recorded_cloudtrail:
                     report.status = "PASS"
-                    report.status_extended = f"Lambda function {function.name} is recorded by CloudTrail {trail.name}"
+                    report.status_extended = f"Lambda function {function.name} is recorded by CloudTrail trail {trail.name}"
                     break

             findings.append(report)
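
(For context, illustrative payloads of the two CloudTrail event-selector shapes the check now distinguishes. These mirror the CloudTrail API's classic EventSelectors and AdvancedEventSelectors formats; the values are examples, not taken from this diff.)

```python
# Classic event selector: Lambda data events are listed under "DataResources";
# a bare "arn:aws:lambda" in Values means "all Lambda functions".
classic_event_selector = {
    "ReadWriteType": "All",
    "IncludeManagementEvents": True,
    "DataResources": [
        {"Type": "AWS::Lambda::Function", "Values": ["arn:aws:lambda"]},
    ],
}

# Advanced event selector: the same intent expressed as field selectors,
# which is what the new elif branch inspects.
advanced_event_selector = {
    "Name": "Log all Lambda data events",
    "FieldSelectors": [
        {"Field": "eventCategory", "Equals": ["Data"]},
        {"Field": "resources.type", "Equals": ["AWS::Lambda::Function"]},
    ],
}

print(classic_event_selector["DataResources"][0]["Type"])
print(advanced_event_selector["FieldSelectors"][1]["Equals"])
```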

View File

@@ -26,10 +26,6 @@
   "Categories": [
     "secrets"
   ],
-  "Tags": {
-    "Tag1Key": "value",
-    "Tag2Key": "value"
-  },
   "DependsOn": [],
   "RelatedTo": [],
   "Notes": ""

View File

@@ -17,6 +17,7 @@ class awslambda_function_no_secrets_in_code(Check):
             report.region = function.region
             report.resource_id = function.name
             report.resource_arn = function.arn
+            report.resource_tags = function.tags
             report.status = "PASS"
             report.status_extended = (
@@ -26,15 +27,40 @@ class awslambda_function_no_secrets_in_code(Check):
                     function.code.code_zip.extractall(tmp_dir_name)
                     # List all files
                     files_in_zip = next(os.walk(tmp_dir_name))[2]
+                    secrets_findings = []
                     for file in files_in_zip:
                         secrets = SecretsCollection()
                         with default_settings():
                             secrets.scan_file(f"{tmp_dir_name}/{file}")
+                        detect_secrets_output = secrets.json()
+                        if detect_secrets_output:
+                            for (
+                                file_name
+                            ) in (
+                                detect_secrets_output.keys()
+                            ):  # Appears that only 1 file is being scanned at a time, so could rework this
+                                output_file_name = file_name.replace(
+                                    f"{tmp_dir_name}/", ""
+                                )
+                                secrets_string = ", ".join(
+                                    [
+                                        f"{secret['type']} on line {secret['line_number']}"
+                                        for secret in detect_secrets_output[file_name]
+                                    ]
+                                )
+                                secrets_findings.append(
+                                    f"{output_file_name}: {secrets_string}"
+                                )
-                        if secrets.json():
-                            report.status = "FAIL"
-                            report.status_extended = f"Potential secret found in Lambda function {function.name} code"
-                            break
+                    if secrets_findings:
+                        final_output_string = "; ".join(secrets_findings)
+                        report.status = "FAIL"
+                        # report.status_extended = f"Potential {'secrets' if len(secrets_findings)>1 else 'secret'} found in Lambda function {function.name} code. {final_output_string}"
+                        if len(secrets_findings) > 1:
+                            report.status_extended = f"Potential secrets found in Lambda function {function.name} code -> {final_output_string}"
+                        else:
+                            report.status_extended = f"Potential secret found in Lambda function {function.name} code -> {final_output_string}"
+                        # break // Don't break as there may be additional findings
             findings.append(report)
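
(A standalone sketch of the detect_secrets pattern the reworked check relies on to build its per-file findings; the scanned path is a placeholder standing in for one file from the unzipped function code.)

```python
from detect_secrets import SecretsCollection
from detect_secrets.settings import default_settings

secrets = SecretsCollection()
with default_settings():
    secrets.scan_file("handler.py")  # placeholder path

# .json() maps each scanned file to a list of findings whose entries carry,
# among other keys, "type" and "line_number" -- the two fields the check reports.
secrets_findings = []
for file_name, found in secrets.json().items():
    secrets_string = ", ".join(
        f"{secret['type']} on line {secret['line_number']}" for secret in found
    )
    secrets_findings.append(f"{file_name}: {secrets_string}")

if secrets_findings:
    print("; ".join(secrets_findings))
```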

View File

@@ -26,10 +26,6 @@
   "Categories": [
     "secrets"
   ],
-  "Tags": {
-    "Tag1Key": "value",
-    "Tag2Key": "value"
-  },
   "DependsOn": [],
   "RelatedTo": [],
   "Notes": ""

Some files were not shown because too many files have changed in this diff.