Compare commits


106 Commits

Author SHA1 Message Date
Sergio Garcia
e9b09790da feat(release): 2.12.1 2022-12-19 17:59:04 +01:00
Acknosyn
c74b4adf27 fix(): Fix CloudTrail trail S3 logging public bucket false positive result when trail bucket doesn't exist (#1505)
Co-authored-by: Francesco Badraun <francesco.badraun@zxsecurity.co.nz>
2022-12-14 16:32:38 +01:00
Kay Agahd
a769bb86d3 fix(check_extra723): Corrected some typos (#1511) 2022-11-22 08:55:38 +01:00
Nacho Rivera
f8a2527429 fix(README): include more details about db connector (#1507) 2022-11-21 09:38:04 +01:00
laura franzese
ae645718ad new copy pointing to prowlerpro (#1488) 2022-11-17 09:40:16 +01:00
Nacho Rivera
a0625dff2f fix(extra71): Modified wrong remediation (#1445) 2022-11-02 10:00:25 +01:00
Fennerr
37e9cbbabd fix(extra7195): Update title (#1440) 2022-10-31 14:33:25 +01:00
Pepe Fagoaga
8818f47333 fix(ecr): typo (#1438) 2022-10-27 19:47:06 +02:00
Pepe Fagoaga
3cffe72273 fix(ecr): Platform (#1437) 2022-10-27 19:30:59 +02:00
Nacho Rivera
135aaca851 fix(): delete old commented versions (#1436) 2022-10-27 16:19:33 +02:00
Nacho Rivera
cf8df051de feat(README): Include versions info (#1435)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2022-10-27 12:51:53 +02:00
Pepe Fagoaga
bef42f3f2d feat(release): Prowler 2.12.0 (#1434) 2022-10-27 12:10:37 +02:00
Nacho Rivera
3d86bf1705 fix(): Cloudtrail checks (#1433) 2022-10-27 11:47:00 +02:00
Olivier Gendron
5a43ec951a docs(spelling): Typo corrections (#1394) 2022-10-24 12:58:44 +02:00
Nacho Rivera
b0e6ab6e31 feat(stable tag): Inclusion of stable tag point to last release (#1419) 2022-10-20 08:01:00 +02:00
Nacho Rivera
b7fb38cc9e fix(extra7184): Error handling GetSnapshotLimits api call (#1411) 2022-10-17 14:03:55 +02:00
Nacho Rivera
f29f7fc239 fix(extra7183): Exception handling error UnsupportedOperationException (#1410) 2022-10-17 13:39:17 +02:00
Nacho Rivera
2997ff0f1c fix(extra77): Deleted resource id from exception results (#1409) 2022-10-17 13:17:51 +02:00
Nacho Rivera
11dc0aa5b2 feat(extra7111): Exception handling (#1408) 2022-10-17 12:51:09 +02:00
Sergio Garcia
8bddb9b265 fix(extra740): Remove additional info and fix max_items (#1405) 2022-10-14 11:37:31 +02:00
Sergio Garcia
689e292585 fix(region_bugs): Remove duplicate outputs (#1390) 2022-10-13 13:18:37 +02:00
Sergio Garcia
bff2aabda6 fix(missing permissions): Add missing permissions of checks (#1403) 2022-10-13 12:59:48 +02:00
Kay Agahd
4b29293362 fix(check_extra77): Add missing check_resource_id to the report (#1402) 2022-10-13 09:53:31 +02:00
Sergio Garcia
4e24103dc6 feat(slack): add Slack badge to README (#1401) 2022-10-13 09:42:06 +02:00
Sergio Garcia
3b90347849 fix(inventory): quick inventory input fixed (#1397)
Co-authored-by: sergargar <sergio@verica.io>
2022-10-10 17:21:46 +02:00
Pepe Fagoaga
6a7a037cec delete(shortcut.sh): Remove ScoutSuite (#1388) 2022-10-06 16:42:09 +02:00
Gábor Lipták
927c13b9c6 chore(actions): Bump Trufflehog to v3.13.0 (#1382) 2022-10-06 09:24:54 +02:00
Nacho Rivera
11cc8e998b fix(checks): Handle checks not returning result (#1383)
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2022-10-05 13:50:49 +02:00
Nacho Rivera
4a71739c56 Prwlr 879 fix prowler 2 x checks (#1380)
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2022-10-03 10:19:02 +02:00
Pepe Fagoaga
aedc5cd0ad fix(postgresql): Missing space (#1374) 2022-09-22 15:25:32 +02:00
Pepe Fagoaga
3d81307e56 fix(postgresql): Connector field (#1372) 2022-09-20 10:26:20 +02:00
Andrew Walker
918661bd7a Dockerfile build instructions (#1370) 2022-09-16 11:14:37 +02:00
Nacho Rivera
99f9abe3f6 feat(db-connector): Include UUID for findings ID (#1368) 2022-09-14 17:23:38 +02:00
Sergio Garcia
f2950764f0 feat(audit_id): add optional audit_id field to postgres connector (#1362)
Co-authored-by: sergargar <sergio@verica.io>
2022-09-13 13:29:19 +02:00
Pepe Fagoaga
d9777a68c7 chore(lint&test): Prowler 3.0 (#1357) 2022-09-01 16:37:10 +02:00
Richard Carpenter
2a4cc9a5f8 feat(group): CIS Critical Security Controls v8 (#1347)
Co-authored-by: sergargar <sergio@verica.io>
2022-08-31 15:14:04 +02:00
Ignacio Dominguez
1f0c210926 feat(extra7195): Added check for dependency confusion in codeartifact (#1329)
Co-authored-by: sergargar <sergio@verica.io>
2022-08-31 09:49:50 +02:00
JArmandoG
dd64c7d226 fix(check120): correct AWS support policy name (#1328)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2022-08-23 11:34:25 +01:00
JArmandoG
865f79f5b3 fix(quick_inventory): Handle math expression (#1283) 2022-08-05 12:55:07 +02:00
Pepe Fagoaga
1f8a4c1022 fix(credential_report): Do not generate for 117 and 118 (#1322) 2022-08-05 11:03:59 +02:00
Pepe Fagoaga
1e422f20aa fix(security-groups): Include TCP as the IpProtocol (#1323) 2022-08-05 11:02:35 +02:00
Pepe Fagoaga
29eda28bf3 docs(outputs): structure (#1313) 2022-08-04 10:05:08 +02:00
Pepe Fagoaga
f67f0cc66d chore(issues): Link Q&A (#1305) 2022-08-03 12:46:51 +02:00
Pepe Fagoaga
721cafa0cd fix(appstream): Handle timeout errors (#1296) 2022-08-02 12:30:53 +02:00
Kay Agahd
c1d60054e9 feat(extra780): Check for Cognito or SAML authentication on OpenSearch (#1291)
* extend check_extra780 to check for cognito or SAML authentication on opensearch

* chore(extra780): Error handling

Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2022-08-02 09:51:38 +02:00
Pepe Fagoaga
b95b3f68d3 fix(permissions): Include missing appstream:DescribeFleets permission (#1278)
* fix(permissions): AWS AppStream

Include missing appstream:DescribeFleets permission

* fix(permissions): AWS AppStream
2022-08-02 09:47:04 +02:00
Jonathan Jenkyn
81b6e27eb8 feat(checks): Adding commands for checks 117 and 118 (#1289)
* Adding commands for checks 117 and 118

* fix(check118): Minor fixes and error handling

* fix(check117): Minor fixes and error handling

Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2022-08-02 09:18:46 +02:00
William
d69678424b fix(extra712): changed Macie service detection (#1286)
* changed Macie service detection

* fix(regions): add region context and more.

Co-authored-by: sergargar <sergio@verica.io>
2022-07-28 13:53:54 -04:00
Pepe Fagoaga
a43c1aceec fix(check12): Improve remediation (#1281) 2022-07-26 14:37:35 -04:00
Pepe Fagoaga
f70cf8d81e fix(ci): Release edited (#1276) 2022-07-21 17:44:26 +02:00
Pepe Fagoaga
83b6c79203 fix(ci): Remove check-update (#1275) 2022-07-21 17:33:28 +02:00
Andrew
1192c038b2 docs(readme): Fix spelling errors (#1274) 2022-07-21 17:06:03 +02:00
Pepe Fagoaga
4ebbf6553e chore(release): 2.11.0 (#1272) 2022-07-21 10:48:32 +02:00
r8bhavneet
c501d63382 docs(readme): Fix spelling (#1271) 2022-07-21 10:42:40 +02:00
Toni de la Fuente
72d6d3f535 feat(inventory): Prowler quick inventory including IAM resources (#1258)
* chore(inventory): option included in main

* chore(inventory): quick inventory

* chore(inventory): functional version

* chore(inventory): functional version without echo

* Update include/quick_inventory

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* Update prowler

Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>

* Added new line at report line

* Added more information from IAM

Co-authored-by: Pepe Fagoaga <pepe@verica.io>
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2022-07-21 10:37:28 +02:00
Mitch
ddd34dc9cc fix(extra7173): Correct check and alternative name (#1270) 2022-07-20 08:36:34 +02:00
Sergio Garcia
03b1c10d13 fix(codebuild): expired token error using Instance Metadata
Co-authored-by: sergargar <sergio@verica.io>
2022-07-14 07:32:01 +02:00
Sergio Garcia
4cd5b8fd04 fix(codebuild): expired token error (#1262) 2022-07-12 07:38:44 +02:00
Phil Massyn
f0ce17182b feat(ecr_lifecycle): Check Lifecycle policy (#1260)
* Create checks_7194

ECR repositories contain Docker container images. When automated processes build new images, the old ones tend to take up space. With many images in the registry, the account owner pays additional fees for images that are no longer in use. Defining a lifecycle policy follows best practice by reducing the total volume of data being stored.

* Minor changes

* fix: Include bash header

Co-authored-by: n4ch04 <nachor1992@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2022-07-11 13:03:31 +02:00
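(Illustrative note: a minimal sketch, assuming the AWS CLI is configured, of what an ECR lifecycle-policy check like the one introduced above could do; the loop and messages are assumptions for illustration, not the actual check_extra7194 code.)

    # Flag repositories that have no lifecycle policy attached
    for repo in $(aws ecr describe-repositories --query 'repositories[].repositoryName' --output text); do
      if aws ecr get-lifecycle-policy --repository-name "$repo" >/dev/null 2>&1; then
        echo "PASS: $repo has a lifecycle policy"
      else
        echo "FAIL: $repo has no lifecycle policy"
      fi
    done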
Sergio Garcia
2a8a7d844b fix(apigatewayv2): handle BadRequestException (#1261)
Co-authored-by: sergargar <sergio@verica.io>
2022-07-11 12:21:39 +02:00
Pepe Fagoaga
ff33f426e5 docs(readme): Update inventory and checks (#1257)
* docs(readme): Update inventory and checks

* docs(readme): inventory path

Co-authored-by: Toni de la Fuente <toni@blyx.com>

* Update README.md

Co-authored-by: Toni de la Fuente <toni@blyx.com>

Co-authored-by: Toni de la Fuente <toni@blyx.com>
2022-07-08 12:42:46 +02:00
Toni de la Fuente
f691046c1f feat(inventory): Prowler quick inventory (#1245)
* chore(inventory): option included in main

* chore(inventory): quick inventory

* chore(inventory): functional version

* chore(inventory): functional version without echo

* Update include/quick_inventory

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* Update prowler

Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>

* Added new line at report line

Co-authored-by: Pepe Fagoaga <pepe@verica.io>
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2022-07-08 12:41:54 +02:00
Pepe Fagoaga
9fad8735b8 fix(Dockerfile): Prowler path (#1254) 2022-07-07 10:03:07 +02:00
Pepe Fagoaga
c632055517 fix(dockerfile): Python path (#1250) 2022-07-06 07:54:37 +02:00
Sergio Garcia
fd850790d5 fix(add-checks-regions): Missing regions in checks (#1247)
* add regions to checks

* add root as resource

Co-authored-by: sergargar <sergio@verica.io>
2022-07-04 09:46:08 +02:00
Sergio Garcia
912d5d7f8c fix(postgres): Fix postgres connector issues. (#1244)
* fix(postgres): Fix postgres connector issues.

* fix(postgres): Update documentation

Co-authored-by: sergargar <sergio@verica.io>
2022-06-30 18:12:33 +02:00
Pepe Fagoaga
d88a136ac3 feat(db-connector): Include env variables (#1236)
* feat(db-connector): Include env variables

* fix(typo)

* fix(psql-test): Remove PGPASSWORD
2022-06-30 08:43:41 +02:00
Pepe Fagoaga
172484cf08 feat(dockerfile): Include psql client in the Prowler scanner image (#1238)
* fix(dockerignore): Include files

* fix(dockerfile): Keep python2 and organize

* feat(db-connector): Include postgres dependencies

* feat(dockerfile): Include hadolint pre-commit
2022-06-30 08:28:29 +02:00
Pepe Fagoaga
821083639a fix(bckCredentials): Do nothing if no initial creds (#1239) 2022-06-29 16:52:08 +02:00
rajarshidas
e4f0f3ec87 feat(check): Ensure default internet access from Amazon AppStream fleet should be disabled. (#1233)
Co-authored-by: sergargar <sergio@verica.io>
2022-06-29 12:51:58 +02:00
rajarshidas
cc6302f7b8 feat(checks): Amazon AppStream checks (#1216)
Co-authored-by: sergargar <sergio@verica.io>
2022-06-29 12:31:42 +02:00
Bayron Carranza
c89fd82856 feat(check7164): 365 days or more of CloudWatch log retention should be considered PASS (#1240)
* 365 days or more retention for log groups in CloudWatch

* fix(extra7162): Fix comparison errors

Also include minor changes to texts

* fix(extra7162): Set as Pass log groups that never expires

* fix(typo)

Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2022-06-28 08:58:41 +02:00
Pepe Fagoaga
0e29a92d42 fix(extra7162): Query AWS log groups using LOG_GROUP_RETENTION_PERIOD_DAYS (#1232) 2022-06-27 09:18:39 +02:00
Sergio Garcia
835d8ffe5d feat(Actions): Update refresh_aws_services_regions.yml (#1227) 2022-06-23 11:21:50 +02:00
Sergio Garcia
21ee2068a6 feat(actions): Create refresh_aws_services_regions.yml (#1225) 2022-06-23 11:07:26 +02:00
Sergio Garcia
0ad149942b fix(security_hub_integration): Treat failed findings as failed in Security Hub (#1219)
Co-authored-by: sergargar <sergio@verica.io>
2022-06-22 14:03:16 +02:00
Nacho Rivera
66305768c0 fix(instance metadata): Missing raw flag in JQ parser (#1214) 2022-06-21 10:14:12 +02:00
Sergio Garcia
05f98fe993 fix(junit_xml output): Fix XML output integration (#1210)
Co-authored-by: sergargar <sergio@verica.io>
2022-06-20 13:27:54 +02:00
rajarshidas
89416f37af feat(check): Directory Service - Ensure Radius server is using the recommended security protocol (#1203)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2022-06-20 11:37:02 +02:00
Pepe Fagoaga
7285ddcb4e feat(actions): Trigger (#1209) 2022-06-20 10:38:19 +02:00
Pepe Fagoaga
8993a4f707 fix(actions): Dockerfile path (#1208) 2022-06-20 09:22:40 +02:00
Sergio Garcia
633d7bd8a8 fix(instance-metadata): Credentials recovering (#1207)
* fix(instance-metadata): Credentials recovering

* fix(expr): Dockerfile to root and expr in SESSION_TIME_REMAINING.

Co-authored-by: Pepe Fagoaga <pepe@verica.io>
Co-authored-by: sergargar <sergio@verica.io>
2022-06-17 14:23:56 +02:00
Pepe Fagoaga
3944ea2055 fix(session_duration): Use jq with TZ=UTC (#1195) 2022-06-15 13:25:43 +02:00
zsecducna
d85d0f5877 fix(extra767): Remove false positive (#1198)
* Remove false positive

Exclude distributions that do not support `POST` requests

* fix(extra767): Overall changes

- Quoted and braced variables
- Fix DefaultCacheBehavior appearing twice in an AWS CLI query
- Use regex =~ to match values

* fix(check767): Change textInfo for textPass

* fix(extra767): Include AWS CLI error handling

Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2022-06-15 09:38:56 +02:00
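(Illustrative note: a minimal sketch, assuming the AWS CLI is configured, of how each distribution's default cache behavior can be inspected for POST support; the query shape is an assumption for illustration, not the check_extra767 code.)

    # Show which methods each CloudFront distribution allows on its default cache behavior
    aws cloudfront list-distributions \
      --query 'DistributionList.Items[].{Id:Id,Methods:DefaultCacheBehavior.AllowedMethods.Items}' \
      --output json
    # Distributions whose allowed methods do not include POST can be skipped to avoid the false positive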
Pepe Fagoaga
d32a7986a5 fix(shellcheck): Main variables (#1194) 2022-06-14 10:43:15 +02:00
Pepe Fagoaga
71813425bd fix(pre-commit): Recover shellcheck (#1193) 2022-06-14 07:46:12 +02:00
Pepe Fagoaga
da000b54ca refactor(Prowler): Main logic refactor (#1189)
* fix(aws_profile_loader): New functions

* fix(shellcheck): Temporary remove Shellcheck

* fix(aws_cli_detector): new function

* fix(jq_detector): New function

* fix(os_detector): New function

* fix(output_bucket): Output bucket input check in main

* fix(python_detector): deleted unused python detector

* fix(credentials): credentials check out of whoami

* [break]refactor(main)

* [BREAK] Get list of checks parsing all input options

* [break]refactor(main): execute checks functions

* [break]refactor(main): move functions to libs

* fix(validations): custom check validation and typos

* refactor(validate_options): Include comments

* fix(custom_checks): Minor fixes

* refactor(closing_files): include libraries

* refactor(loader): Include ignored checks

* refactor(main): Fix shellcheck

* refactor(loader): beautify

* refactor(monochrome): without variables

* refactor(modes): MODES array not needed

* refactor(whoami): get error from AWSCLI

* refactor(secrets-detector)

* refactor(secrets-detector)

* fix(html_scoring): html scoring was fixed.

* fix(load_checks_from_file)

* fix(color-code): Print if not mono

* fix(not extra): Fixed if EXCLUDE_CHECK_ID is empty

* fix(IFS): Restore default IFS once modes are parsed

* fix(bucket): validate before whoami

* fix(bucket): validate before whoami

Co-authored-by: n4ch04 <nachor1992@gmail.com>
Co-authored-by: sergargar <sergio@verica.io>
Co-authored-by: Nacho Rivera <59198746+n4ch04@users.noreply.github.com>
2022-06-13 17:34:31 +02:00
Sergio Garcia
74a9b42d9f Update codebuild-prowler-audit-account-cfn.yaml (#1192) 2022-06-13 12:17:31 +02:00
Nacho Rivera
f9322ab3aa fix(outputs): Replace each comma occurrence before sending to csv file (#1188) 2022-06-08 09:19:50 +02:00
Pepe Fagoaga
5becaca2c4 fix(extra7187): Remove commas from the metadata (#1187) 2022-06-08 09:02:38 +02:00
Sergio Garcia
50a670fbc4 fix(codebuild_update): AWS CLI and permissions update. (#1183) 2022-06-07 14:49:22 +02:00
Sergio Garcia
48f405a696 fix(check119_remediation): Update check remediation text. (#1185) 2022-06-07 14:48:13 +02:00
Nacho Rivera
bc56c4242e refactor(outputs): Consolidate Prowler output functions (#1180)
* chore(db providers): db providers first version

* chore(db provider): added db provider setup into Readme

* fix(csv_line): csv_line out of conditional

* fix(README): text instead of varchar in table

* fix(help): help message extended

Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>

* fix(typo): Update README.md

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* fix(table): add if not exists

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* fix(typo): Readme postgreSQL

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* fix(db_connector): details to add a new provider

* fix(typo): Uppercase Prowler

Co-authored-by: Toni de la Fuente <toni@blyx.com>

* fix(prowler): deleted unused variable

* chore(checks): test db connector previous to send data

* chore(input tests): input tests moved to main

* fix(typo): Readme typos

* chore(table): table name from pgpass file

* fix(grep test): Added missing -E flag

* chore(table): check of table name and Readme

* chore(error colors): Added error colors

* chore(inputcheck): checks about mode and output inputs into main

* fix(inputs) custom output file name

* fix(outputs): comment profile

* chore(textXXX): both 3 textfunctions using general

* fix(allowlist): allowlist check included as function

* fix(headers): Add headers to certain output files

* fix(reformulate): change structure and delete comments

* fix(testing): Input test after load includes

* fix(variables): Added named vars

* fix(colors): Deleted unused colors

* fix(outputs): fine tuning

* fix(outputs): allowlist parameters read

* fix(allowlist): allowlist logic reformulated

* fix(REPREGION): REPREGION change by REGION_FROM_CHECK

Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
Co-authored-by: Toni de la Fuente <toni@blyx.com>
2022-06-06 12:56:21 +02:00
Pepe Fagoaga
1b63256b9c fix(assume_role): Use date instead of jq (#1181)
* fix(date): Use  instead of date

* fix(assume_role): Use date instead of jq

JQ parses datetimes using the local timezone and not UTC
2022-06-03 08:31:43 -07:00
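(Illustrative note: a minimal sketch of the timezone issue mentioned above, assuming GNU date; the expiration value is a made-up example, not Prowler's actual assume_role code.)

    # Compute remaining session seconds in UTC with date instead of jq
    EXPIRATION="2022-06-03T12:00:00Z"               # e.g. taken from an assume-role response
    NOW_EPOCH=$(date -u +%s)
    EXP_EPOCH=$(TZ=UTC date -d "$EXPIRATION" +%s)   # BSD/macOS date needs -j -f instead of -d
    echo "Session seconds remaining: $((EXP_EPOCH - NOW_EPOCH))"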
Sergio Garcia
7930b449b3 fix(apigateway_iam): Error handling and permissions for extra745. (#1176)
* fix(apigateway_iam): Error handling and permissions for extra745.

* Update check_extra745

Co-authored-by: sergargar <sergio@verica.io>
2022-06-02 15:16:43 +02:00
Pepe Fagoaga
e5cd42da55 fix(typo): Max session duration error message (#1179) 2022-06-02 15:08:30 +02:00
Sergio Garcia
2a54bbf901 fix(SQS_encryption_type): Add SQS encryption types to extra728. (#1175)
* fix(SQS_encryption_type): Add SQS encryption types to extra728.

* Update check_extra728

* Update check_extra728

Co-authored-by: sergargar <sergio@verica.io>
2022-06-02 15:01:02 +02:00
Nacho Rivera
2e134ed947 feat(db_connector): Create a PostgreSQL connector for Prowler (#1171)
* chore(db providers): db providers first version

* chore(db provider): added db provider setup into Readme

* fix(csv_line): csv_line out of conditional

* fix(README): text instead of varchar in table

* fix(help): help message extended

Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>

* fix(typo): Update README.md

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* fix(table): add if not exists

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* fix(typo): Readme postgreSQL

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* fix(db_connector): details to add a new provider

* fix(typo): Uppercase Prowler

Co-authored-by: Toni de la Fuente <toni@blyx.com>

* fix(prowler): deleted unused variable

* chore(checks): test db connector previous to send data

* chore(input tests): input tests moved to main

* fix(typo): Readme typos

* chore(table): table name from pgpass file

* fix(grep test): Added missing -E flag

* chore(table): check of table name and Readme

* chore(error colors): Added error colors

* fix(tablename): table name in readme

* fix(typo)

* fix(db_provider): Exact match

* fix(error): One line message

* chore(pgpass check): Check added for pgpass file

* fix(pgpass): pgpass file and permissions test

* fix(unused vars): Deleted unused vars

* fix(TOP_PID): Deleted TOP_PID unused var and comment

* chore(db tests): Credentials, database and table tests added

* fix(empty pgpass): Look for empty fields at pgpass file

Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
Co-authored-by: Toni de la Fuente <toni@blyx.com>
2022-06-02 13:15:14 +02:00
Sergio Garcia
ba727391db fix(runtimes_extra762): Detect nodejs versions correctly. (#1177)
Co-authored-by: sergargar <sergio@verica.io>
2022-06-02 13:14:22 +02:00
Sergio Garcia
d4346149fa fix(severity): High severity for check extra7185 (#1178) 2022-06-01 14:04:36 +02:00
Pepe Fagoaga
2637fc5132 feat(checks): New IAM privilege escalation check (#1168) 2022-06-01 13:58:31 +02:00
Sergio Garcia
ac5135470b fix(update_deprecate_runtimes): Deprecated runtimes for lambda were updated (#1170) 2022-05-31 17:03:11 +02:00
rajarshidas
613966aecf feat(check): Amazon WorkSpaces storage volumes are encrypted
If the value listed in the Volume Encryption column is Disabled, the selected AWS WorkSpaces instance volumes (root and user volumes) are not encrypted.
2022-05-31 17:01:20 +02:00
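(Illustrative note: a minimal sketch, assuming the AWS CLI is configured, of how WorkSpaces volume encryption can be inspected; the query shape is an assumption for illustration, not the actual check's code.)

    # Report root and user volume encryption status per WorkSpace
    aws workspaces describe-workspaces \
      --query 'Workspaces[].{Id:WorkspaceId,RootEncrypted:RootVolumeEncryptionEnabled,UserEncrypted:UserVolumeEncryptionEnabled}' \
      --output table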
Pepe Fagoaga
83ddcb9c39 feat(check): PublicAccessBlockConfiguration (#1167) 2022-05-31 16:54:05 +02:00
Lucas L Lopes
957c2433cf feat(checks): New checks for Directory Service (#1164) 2022-05-30 14:24:44 +02:00
Pepe Fagoaga
c10b367070 fix(actions): Bad PRO repository (#1163) 2022-05-25 12:47:22 +02:00
5861 changed files with 21650 additions and 672800 deletions


@@ -1,14 +0,0 @@
{
"repoOwner": "prowler-cloud",
"repoName": "prowler",
"targetPRLabels": [
"backport"
],
"sourcePRLabels": [
"was-backported"
],
"copySourcePRLabels": false,
"copySourcePRReviewers": true,
"prTitle": "{{sourcePullRequest.title}}",
"commitConflicts": true
}

.dockerignore (new file)

@@ -0,0 +1,17 @@
# Ignore git files
.git/
.github/
# Ignore Dockerfile
Dockerfile
# Ignore hidden files
.pre-commit-config.yaml
.dockerignore
.gitignore
.pytest*
.DS_Store
# Ignore output directories
output/
junit-reports/

.env

@@ -1,135 +0,0 @@
#### Important Note ####
# This file is used to store environment variables for the Prowler App.
# For production, it is recommended to use a secure method to store these variables and change the default secret keys.
#### Prowler UI Configuration ####
PROWLER_UI_VERSION="stable"
AUTH_URL=http://localhost:3000
API_BASE_URL=http://prowler-api:8080/api/v1
NEXT_PUBLIC_API_DOCS_URL=http://prowler-api:8080/api/v1/docs
AUTH_TRUST_HOST=true
UI_PORT=3000
# openssl rand -base64 32
AUTH_SECRET="N/c6mnaS5+SWq81+819OrzQZlmx1Vxtp/orjttJSmw8="
#### Prowler API Configuration ####
PROWLER_API_VERSION="stable"
# PostgreSQL settings
# If running Django and celery on host, use 'localhost', else use 'postgres-db'
POSTGRES_HOST=postgres-db
POSTGRES_PORT=5432
POSTGRES_ADMIN_USER=prowler_admin
POSTGRES_ADMIN_PASSWORD=postgres
POSTGRES_USER=prowler
POSTGRES_PASSWORD=postgres
POSTGRES_DB=prowler_db
# Valkey settings
# If running Valkey and celery on host, use localhost, else use 'valkey'
VALKEY_HOST=valkey
VALKEY_PORT=6379
VALKEY_DB=0
# API scan settings
# The path to the directory where scan output should be stored
DJANGO_TMP_OUTPUT_DIRECTORY="/tmp/prowler_api_output"
# The maximum number of findings to process in a single batch
DJANGO_FINDINGS_BATCH_SIZE=1000
# The AWS access key to be used when uploading scan output to an S3 bucket
# If left empty, default AWS credentials resolution behavior will be used
DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID=""
# The AWS secret key to be used when uploading scan output to an S3 bucket
DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY=""
# An optional AWS session token
DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN=""
# The AWS region where your S3 bucket is located (e.g., "us-east-1")
DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION=""
# The name of the S3 bucket where scan output should be stored
DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET=""
# Django settings
DJANGO_ALLOWED_HOSTS=localhost,127.0.0.1,prowler-api
DJANGO_BIND_ADDRESS=0.0.0.0
DJANGO_PORT=8080
DJANGO_DEBUG=False
DJANGO_SETTINGS_MODULE=config.django.production
# Select one of [ndjson|human_readable]
DJANGO_LOGGING_FORMATTER=human_readable
# Select one of [DEBUG|INFO|WARNING|ERROR|CRITICAL]
# Applies to both Django and Celery Workers
DJANGO_LOGGING_LEVEL=INFO
# Defaults to the maximum available based on CPU cores if not set.
DJANGO_WORKERS=4
# Token lifetime is in minutes
DJANGO_ACCESS_TOKEN_LIFETIME=30
# Token lifetime is in minutes
DJANGO_REFRESH_TOKEN_LIFETIME=1440
DJANGO_CACHE_MAX_AGE=3600
DJANGO_STALE_WHILE_REVALIDATE=60
DJANGO_MANAGE_DB_PARTITIONS=True
# openssl genrsa -out private.pem 2048
DJANGO_TOKEN_SIGNING_KEY="-----BEGIN PRIVATE KEY-----
MIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSjAgEAAoIBAQDs4e+kt7SnUJek
6V5r9zMGzXCoU5qnChfPiqu+BgANyawz+MyVZPs6RCRfeo6tlCknPQtOziyXYM2I
7X+qckmuzsjqp8+u+o1mw3VvUuJew5k2SQLPYwsiTzuFNVJEOgRo3hywGiGwS2iv
/5nh2QAl7fq2qLqZEXQa5+/xJlQggS1CYxOJgggvLyra50QZlBvPve/AxKJ/EV/Q
irWTZU5lLNI8sH2iZR05vQeBsxZ0dCnGMT+vGl+cGkqrvzQzKsYbDmabMcfTYhYi
78fpv6A4uharJFHayypYBjE39PwhMyyeycrNXlpm1jpq+03HgmDuDMHydk1tNwuT
nEC7m7iNAgMBAAECggEAA2m48nJcJbn9SVi8bclMwKkWmbJErOnyEGEy2sTK3Of+
NWx9BB0FmqAPNxn0ss8K7cANKOhDD7ZLF9E2MO4/HgfoMKtUzHRbM7MWvtEepldi
nnvcUMEgULD8Dk4HnqiIVjt3BdmGiTv46OpBnRWrkSBV56pUL+7msZmMZTjUZvh2
ZWv0+I3gtDIjo2Zo/FiwDV7CfwRjJarRpYUj/0YyuSA4FuOUYl41WAX1I301FKMH
xo3jiAYi1s7IneJ16OtPpOA34Wg5F6ebm/UO0uNe+iD4kCXKaZmxYQPh5tfB0Qa3
qj1T7GNpFNyvtG7VVdauhkb8iu8X/wl6PCwbg0RCKQKBgQD9HfpnpH0lDlHMRw9K
X7Vby/1fSYy1BQtlXFEIPTN/btJ/asGxLmAVwJ2HAPXWlrfSjVAH7CtVmzN7v8oj
HeIHfeSgoWEu1syvnv2AMaYSo03UjFFlfc/GUxF7DUScRIhcJUPCP8jkAROz9nFv
DByNjUL17Q9r43DmDiRsy0IFqQKBgQDvlJ9Uhl+Sp7gRgKYwa/IG0+I4AduAM+Gz
Dxbm52QrMGMTjaJFLmLHBUZ/ot+pge7tZZGws8YR8ufpyMJbMqPjxhIvRRa/p1Tf
E3TQPW93FMsHUvxAgY3MV5MzXFPhlNAKb+akP/RcXUhetGAuZKLubtDCWa55ZQuL
wj2OS+niRQKBgE7K8zUqNi6/22S8xhy/2GPgB1qPObbsABUofK0U6CAGLo6te+gc
6Jo84IyzFtQbDNQFW2Fr+j1m18rw9AqkdcUhQndiZS9AfG07D+zFB86LeWHt4DS4
ymIRX8Kvaak/iDcu/n3Mf0vCrhB6aetImObTj4GgrwlFvtJOmrYnO8EpAoGAIXXP
Xt25gWD9OyyNiVu6HKwA/zN7NYeJcRmdaDhO7B1A6R0x2Zml4AfjlbXoqOLlvLAf
zd79vcoAC82nH1eOPiSOq51plPDI0LMF8IN0CtyTkn1Lj7LIXA6rF1RAvtOqzppc
SvpHpZK9pcRpXnFdtBE0BMDDtl6fYzCIqlP94UUCgYEAnhXbAQMF7LQifEm34Dx8
BizRMOKcqJGPvbO2+Iyt50O5X6onU2ITzSV1QHtOvAazu+B1aG9pEuBFDQ+ASxEu
L9ruJElkOkb/o45TSF6KCsHd55ReTZ8AqnRjf5R+lyzPqTZCXXb8KTcRvWT4zQa3
VxyT2PnaSqEcexWUy4+UXoQ=
-----END PRIVATE KEY-----"
# openssl rsa -in private.pem -pubout -out public.pem
DJANGO_TOKEN_VERIFYING_KEY="-----BEGIN PUBLIC KEY-----
MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA7OHvpLe0p1CXpOlea/cz
Bs1wqFOapwoXz4qrvgYADcmsM/jMlWT7OkQkX3qOrZQpJz0LTs4sl2DNiO1/qnJJ
rs7I6qfPrvqNZsN1b1LiXsOZNkkCz2MLIk87hTVSRDoEaN4csBohsEtor/+Z4dkA
Je36tqi6mRF0Gufv8SZUIIEtQmMTiYIILy8q2udEGZQbz73vwMSifxFf0Iq1k2VO
ZSzSPLB9omUdOb0HgbMWdHQpxjE/rxpfnBpKq780MyrGGw5mmzHH02IWIu/H6b+g
OLoWqyRR2ssqWAYxN/T8ITMsnsnKzV5aZtY6avtNx4Jg7gzB8nZNbTcLk5xAu5u4
jQIDAQAB
-----END PUBLIC KEY-----"
# openssl rand -base64 32
DJANGO_SECRETS_ENCRYPTION_KEY="oE/ltOhp/n1TdbHjVmzcjDPLcLA41CVI/4Rk+UB5ESc="
DJANGO_BROKER_VISIBILITY_TIMEOUT=86400
DJANGO_SENTRY_DSN=
# Sentry settings
SENTRY_ENVIRONMENT=local
SENTRY_RELEASE=local
#### Prowler release version ####
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.6.0
# Social login credentials
SOCIAL_GOOGLE_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/google"
SOCIAL_GOOGLE_OAUTH_CLIENT_ID=""
SOCIAL_GOOGLE_OAUTH_CLIENT_SECRET=""
SOCIAL_GITHUB_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/github"
SOCIAL_GITHUB_OAUTH_CLIENT_ID=""
SOCIAL_GITHUB_OAUTH_CLIENT_SECRET=""

.github/CODEOWNERS

@@ -1,6 +1 @@
/* @prowler-cloud/sdk
/.github/ @prowler-cloud/sdk
prowler @prowler-cloud/sdk @prowler-cloud/detection-and-remediation
tests @prowler-cloud/sdk @prowler-cloud/detection-and-remediation
api @prowler-cloud/api
ui @prowler-cloud/ui
* @prowler-cloud/prowler-team

.github/ISSUE_TEMPLATE/bug_report.md (new file)

@@ -0,0 +1,50 @@
---
name: Bug report
about: Create a report to help us improve
title: "[Bug]: "
labels: bug, status/needs-triage
assignees: ''
---
<!--
Please use this template to create your bug report. By providing as much info as possible you help us understand, reproduce, and resolve the issue for you more quickly. Therefore, take a couple of extra minutes to make sure you have provided all the info needed.
PROTIP: record your screen and attach it as a gif to showcase the issue.
- How to record and attach gif: https://bit.ly/2Mi8T6K
-->
**What happened?**
A clear and concise description of what the bug is or what is not working as expected
**How to reproduce it**
Steps to reproduce the behavior:
1. What command are you running?
2. Environment you have, like single account, multi-account, organizations, etc.
3. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots or Logs**
If applicable, add screenshots to help explain your problem.
Also, you can add logs (anonymize them first!). Here is a command that may help to share a log:
`bash -x ./prowler -options > debug.log 2>&1`, then attach `debug.log` here
**From where are you running Prowler?**
Please, complete the following information:
- Resource: [e.g. EC2 instance, Fargate task, Docker container manually, EKS, Cloud9, CodeBuild, workstation, etc.]
- OS: [e.g. Amazon Linux 2, Mac, Alpine, Windows, etc. ]
- AWS-CLI Version [`aws --version`]:
- Prowler Version [`./prowler -V`]:
- Shell and version:
- Others:
**Additional context**
Add any other context about the problem here.


@@ -1,96 +0,0 @@
name: 🐞 Bug Report
description: Create a report to help us improve
labels: ["bug", "status/needs-triage"]
body:
- type: textarea
id: reproduce
attributes:
label: Steps to Reproduce
description: Steps to reproduce the behavior
placeholder: |-
1. What command are you running?
2. Cloud provider you are launching
3. Environment you have, like single account, multi-account, organizations, multi or single subscription, etc.
4. See error
validations:
required: true
- type: textarea
id: expected
attributes:
label: Expected behavior
description: A clear and concise description of what you expected to happen.
validations:
required: true
- type: textarea
id: actual
attributes:
label: Actual Result with Screenshots or Logs
description: If applicable, add screenshots to help explain your problem. Also, you can add logs (anonymize them first!). Here is a command that may help to share a log: `prowler <your arguments> --log-level ERROR --log-file $(date +%F)_error.log`, then attach the log file here.
validations:
required: true
- type: dropdown
id: type
attributes:
label: How did you install Prowler?
options:
- Cloning the repository from github.com (git clone)
- From pip package (pip install prowler)
- From brew (brew install prowler)
- Docker (docker pull toniblyx/prowler)
validations:
required: true
- type: textarea
id: environment
attributes:
label: Environment Resource
description: From where are you running Prowler?
placeholder: |-
1. EC2 instance
2. Fargate task
3. Docker container locally
4. EKS
5. Cloud9
6. CodeBuild
7. Workstation
8. Other(please specify)
validations:
required: true
- type: textarea
id: os
attributes:
label: OS used
description: Which OS are you using?
placeholder: |-
1. Amazon Linux 2
2. MacOS
3. Alpine Linux
4. Windows
5. Other(please specify)
validations:
required: true
- type: input
id: prowler-version
attributes:
label: Prowler version
description: Which Prowler version are you using?
placeholder: |-
prowler --version
validations:
required: true
- type: input
id: pip-version
attributes:
label: Pip version
description: Which pip version are you using?
placeholder: |-
pip --version
validations:
required: true
- type: textarea
id: additional
attributes:
description: Additional context
label: Context
validations:
required: false


@@ -1 +1,5 @@
blank_issues_enabled: false
contact_links:
- name: Questions & Help
url: https://github.com/prowler-cloud/prowler/discussions/categories/q-a
about: Please ask and answer questions here.


@@ -1,35 +0,0 @@
name: 💡 Feature Request
description: Suggest an idea for this project
labels: ["feature-request", "status/needs-triage"]
body:
- type: textarea
id: Problem
attributes:
label: New feature motivation
description: Is your feature request related to a problem? Please describe
placeholder: |-
1. A clear and concise description of what the problem is. Ex. I'm always frustrated when
validations:
required: true
- type: textarea
id: Solution
attributes:
label: Solution Proposed
description: A clear and concise description of what you want to happen.
validations:
required: true
- type: textarea
id: Alternatives
attributes:
label: Describe alternatives you've considered
description: A clear and concise description of any alternative solutions or features you've considered.
validations:
required: true
- type: textarea
id: Context
attributes:
label: Additional context
description: Add any other context or screenshots about the feature request here.
validations:
required: false


@@ -0,0 +1,20 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: enhancement, status/needs-triage
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.


@@ -1,3 +0,0 @@
name: "API - CodeQL Config"
paths:
- "api/"


@@ -1,4 +0,0 @@
name: "SDK - CodeQL Config"
paths-ignore:
- "api/"
- "ui/"


@@ -1,3 +0,0 @@
name: "UI - CodeQL Config"
paths:
- "ui/"

.github/dependabot.yml

@@ -1,120 +0,0 @@
# To get started with Dependabot version updates, you'll need to specify which
# package ecosystems to update and where the package manifests are located.
# Please see the documentation for all configuration options:
# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates
version: 2
updates:
# v5
- package-ecosystem: "pip"
directory: "/"
schedule:
interval: "monthly"
open-pull-requests-limit: 25
target-branch: master
labels:
- "dependencies"
- "pip"
# Dependabot Updates are temporarily disabled - 2025/03/19
# - package-ecosystem: "pip"
# directory: "/api"
# schedule:
# interval: "daily"
# open-pull-requests-limit: 10
# target-branch: master
# labels:
# - "dependencies"
# - "pip"
# - "component/api"
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "monthly"
open-pull-requests-limit: 25
target-branch: master
labels:
- "dependencies"
- "github_actions"
# Dependabot Updates are temporarily disabled - 2025/03/19
# - package-ecosystem: "npm"
# directory: "/ui"
# schedule:
# interval: "daily"
# open-pull-requests-limit: 10
# target-branch: master
# labels:
# - "dependencies"
# - "npm"
# - "component/ui"
- package-ecosystem: "docker"
directory: "/"
schedule:
interval: "monthly"
open-pull-requests-limit: 25
target-branch: master
labels:
- "dependencies"
- "docker"
# Dependabot Updates are temporarily disabled - 2025/04/15
# v4.6
# - package-ecosystem: "pip"
# directory: "/"
# schedule:
# interval: "weekly"
# open-pull-requests-limit: 10
# target-branch: v4.6
# labels:
# - "dependencies"
# - "pip"
# - "v4"
# - package-ecosystem: "github-actions"
# directory: "/"
# schedule:
# interval: "weekly"
# open-pull-requests-limit: 10
# target-branch: v4.6
# labels:
# - "dependencies"
# - "github_actions"
# - "v4"
# - package-ecosystem: "docker"
# directory: "/"
# schedule:
# interval: "weekly"
# open-pull-requests-limit: 10
# target-branch: v4.6
# labels:
# - "dependencies"
# - "docker"
# - "v4"
# Dependabot Updates are temporarily disabled - 2025/03/19
# v3
# - package-ecosystem: "pip"
# directory: "/"
# schedule:
# interval: "monthly"
# open-pull-requests-limit: 10
# target-branch: v3
# labels:
# - "dependencies"
# - "pip"
# - "v3"
# - package-ecosystem: "github-actions"
# directory: "/"
# schedule:
# interval: "monthly"
# open-pull-requests-limit: 10
# target-branch: v3
# labels:
# - "dependencies"
# - "github_actions"
# - "v3"

.github/labeler.yml

@@ -1,104 +0,0 @@
documentation:
- changed-files:
- any-glob-to-any-file: "docs/**"
provider/aws:
- changed-files:
- any-glob-to-any-file: "prowler/providers/aws/**"
- any-glob-to-any-file: "tests/providers/aws/**"
provider/azure:
- changed-files:
- any-glob-to-any-file: "prowler/providers/azure/**"
- any-glob-to-any-file: "tests/providers/azure/**"
provider/gcp:
- changed-files:
- any-glob-to-any-file: "prowler/providers/gcp/**"
- any-glob-to-any-file: "tests/providers/gcp/**"
provider/kubernetes:
- changed-files:
- any-glob-to-any-file: "prowler/providers/kubernetes/**"
- any-glob-to-any-file: "tests/providers/kubernetes/**"
provider/github:
- changed-files:
- any-glob-to-any-file: "prowler/providers/github/**"
- any-glob-to-any-file: "tests/providers/github/**"
github_actions:
- changed-files:
- any-glob-to-any-file: ".github/workflows/*"
cli:
- changed-files:
- any-glob-to-any-file: "cli/**"
mutelist:
- changed-files:
- any-glob-to-any-file: "prowler/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/aws/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/azure/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/gcp/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/kubernetes/lib/mutelist/**"
- any-glob-to-any-file: "tests/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/aws/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/azure/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/gcp/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/kubernetes/lib/mutelist/**"
integration/s3:
- changed-files:
- any-glob-to-any-file: "prowler/providers/aws/lib/s3/**"
- any-glob-to-any-file: "tests/providers/aws/lib/s3/**"
integration/slack:
- changed-files:
- any-glob-to-any-file: "prowler/lib/outputs/slack/**"
- any-glob-to-any-file: "tests/lib/outputs/slack/**"
integration/security-hub:
- changed-files:
- any-glob-to-any-file: "prowler/providers/aws/lib/security_hub/**"
- any-glob-to-any-file: "tests/providers/aws/lib/security_hub/**"
- any-glob-to-any-file: "prowler/lib/outputs/asff/**"
- any-glob-to-any-file: "tests/lib/outputs/asff/**"
output/html:
- changed-files:
- any-glob-to-any-file: "prowler/lib/outputs/html/**"
- any-glob-to-any-file: "tests/lib/outputs/html/**"
output/asff:
- changed-files:
- any-glob-to-any-file: "prowler/lib/outputs/asff/**"
- any-glob-to-any-file: "tests/lib/outputs/asff/**"
output/ocsf:
- changed-files:
- any-glob-to-any-file: "prowler/lib/outputs/ocsf/**"
- any-glob-to-any-file: "tests/lib/outputs/ocsf/**"
output/csv:
- changed-files:
- any-glob-to-any-file: "prowler/lib/outputs/csv/**"
- any-glob-to-any-file: "tests/lib/outputs/csv/**"
component/api:
- changed-files:
- any-glob-to-any-file: "api/**"
component/ui:
- changed-files:
- any-glob-to-any-file: "ui/**"
compliance:
- changed-files:
- any-glob-to-any-file: "prowler/compliance/**"
- any-glob-to-any-file: "prowler/lib/outputs/compliance/**"
- any-glob-to-any-file: "tests/lib/outputs/compliance/**"
review-django-migrations:
- changed-files:
- any-glob-to-any-file: "api/src/backend/api/migrations/**"


@@ -1,27 +1,12 @@
### Context
Please include relevant motivation and context for this PR.
If it fixes an issue, please add it with `Fix #XXXX`
### Description
Please include a summary of the change and which issue is fixed. List any dependencies that are required for this change.
### Checklist
- Are there new checks included in this PR? Yes / No
- If so, do we need to update permissions for the provider? Please review this carefully.
- [ ] Review if the code is being covered by tests.
- [ ] Review if code is being documented following this specification https://github.com/google/styleguide/blob/gh-pages/pyguide.md#38-comments-and-docstrings
- [ ] Review if backport is needed.
- [ ] Review if it is needed to change the [Readme.md](https://github.com/prowler-cloud/prowler/blob/master/README.md)
- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/prowler/CHANGELOG.md), if applicable.
#### API
- [ ] Verify if API specs need to be regenerated.
- [ ] Check if version updates are required (e.g., specs, Poetry, etc.).
- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/api/CHANGELOG.md), if applicable.
### License


@@ -1,114 +0,0 @@
name: API - Build and Push containers
on:
push:
branches:
- "master"
paths:
- "api/**"
- ".github/workflows/api-build-lint-push-containers.yml"
# Uncomment the code below to test this action on PRs
# pull_request:
# branches:
# - "master"
# paths:
# - "api/**"
# - ".github/workflows/api-build-lint-push-containers.yml"
release:
types: [published]
env:
# Tags
LATEST_TAG: latest
RELEASE_TAG: ${{ github.event.release.tag_name }}
STABLE_TAG: stable
WORKING_DIRECTORY: ./api
# Container Registries
PROWLERCLOUD_DOCKERHUB_REPOSITORY: prowlercloud
PROWLERCLOUD_DOCKERHUB_IMAGE: prowler-api
jobs:
repository-check:
name: Repository check
runs-on: ubuntu-latest
outputs:
is_repo: ${{ steps.repository_check.outputs.is_repo }}
steps:
- name: Repository check
id: repository_check
working-directory: /tmp
run: |
if [[ ${{ github.repository }} == "prowler-cloud/prowler" ]]
then
echo "is_repo=true" >> "${GITHUB_OUTPUT}"
else
echo "This action only runs for prowler-cloud/prowler"
echo "is_repo=false" >> "${GITHUB_OUTPUT}"
fi
# Build Prowler OSS container
container-build-push:
needs: repository-check
if: needs.repository-check.outputs.is_repo == 'true'
runs-on: ubuntu-latest
defaults:
run:
working-directory: ${{ env.WORKING_DIRECTORY }}
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set short git commit SHA
id: vars
run: |
shortSha=$(git rev-parse --short ${{ github.sha }})
echo "SHORT_SHA=${shortSha}" >> $GITHUB_ENV
- name: Login to DockerHub
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build and push container image (latest)
# Comment the following line for testing
if: github.event_name == 'push'
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
# Set push: false for testing
push: true
tags: |
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.LATEST_TAG }}
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.SHORT_SHA }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
push: true
tags: |
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.RELEASE_TAG }}
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.STABLE_TAG }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Trigger deployment
if: github.event_name == 'push'
uses: peter-evans/repository-dispatch@ff45666b9427631e3450c54a1bcbee4d9ff4d7c0 # v3.0.0
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
repository: ${{ secrets.CLOUD_DISPATCH }}
event-type: prowler-api-deploy
client-payload: '{"sha": "${{ github.sha }}", "short_sha": "${{ env.SHORT_SHA }}"}'


@@ -1,59 +0,0 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: API - CodeQL
on:
push:
branches:
- "master"
- "v5.*"
paths:
- "api/**"
pull_request:
branches:
- "master"
- "v5.*"
paths:
- "api/**"
schedule:
- cron: '00 12 * * *'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write
strategy:
fail-fast: false
matrix:
language: [ 'python' ]
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/api-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
category: "/language:${{matrix.language}}"


@@ -1,189 +0,0 @@
name: API - Pull Request
on:
push:
branches:
- "master"
- "v5.*"
paths:
- ".github/workflows/api-pull-request.yml"
- "api/**"
pull_request:
branches:
- "master"
- "v5.*"
paths:
- "api/**"
env:
POSTGRES_HOST: localhost
POSTGRES_PORT: 5432
POSTGRES_ADMIN_USER: prowler
POSTGRES_ADMIN_PASSWORD: S3cret
POSTGRES_USER: prowler_user
POSTGRES_PASSWORD: prowler
POSTGRES_DB: postgres-db
VALKEY_HOST: localhost
VALKEY_PORT: 6379
VALKEY_DB: 0
API_WORKING_DIR: ./api
IMAGE_NAME: prowler-api
jobs:
test:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.12"]
# Service containers to run with `test`
services:
# Label used to access the service container
postgres:
image: postgres
env:
POSTGRES_HOST: ${{ env.POSTGRES_HOST }}
POSTGRES_PORT: ${{ env.POSTGRES_PORT }}
POSTGRES_USER: ${{ env.POSTGRES_USER }}
POSTGRES_PASSWORD: ${{ env.POSTGRES_PASSWORD }}
POSTGRES_DB: ${{ env.POSTGRES_DB }}
# Set health checks to wait until postgres has started
ports:
- 5432:5432
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
valkey:
image: valkey/valkey:7-alpine3.19
env:
VALKEY_HOST: ${{ env.VALKEY_HOST }}
VALKEY_PORT: ${{ env.VALKEY_PORT }}
VALKEY_DB: ${{ env.VALKEY_DB }}
# Set health checks to wait until valkey has started
ports:
- 6379:6379
options: >-
--health-cmd "valkey-cli ping"
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Test if changes are in non-ignored paths
id: are-non-ignored-files-changed
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: api/**
files_ignore: |
api/.github/**
api/docs/**
api/permissions/**
api/README.md
api/mkdocs.yml
- name: Install poetry
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
python -m pip install --upgrade pip
pipx install poetry==2.1.1
- name: Set up Python ${{ matrix.python-version }}
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ matrix.python-version }}
cache: "poetry"
- name: Install dependencies
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry install --no-root
poetry run pip list
VERSION=$(curl --silent "https://api.github.com/repos/hadolint/hadolint/releases/latest" | \
grep '"tag_name":' | \
sed -E 's/.*"v([^"]+)".*/\1/' \
) && curl -L -o /tmp/hadolint "https://github.com/hadolint/hadolint/releases/download/v${VERSION}/hadolint-Linux-x86_64" \
&& chmod +x /tmp/hadolint
- name: Poetry check
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry check --lock
- name: Lint with ruff
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run ruff check . --exclude contrib
- name: Check Format with ruff
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run ruff format --check . --exclude contrib
- name: Lint with pylint
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pylint --disable=W,C,R,E -j 0 -rn -sn src/
- name: Bandit
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run bandit -q -lll -x '*_test.py,./contrib/' -r .
- name: Safety
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run safety check --ignore 70612,66963
- name: Vulture
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run vulture --exclude "contrib,tests,conftest.py" --min-confidence 100 .
- name: Hadolint
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
/tmp/hadolint Dockerfile --ignore=DL3013
- name: Test with pytest
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pytest --cov=./src/backend --cov-report=xml src/backend
- name: Upload coverage reports to Codecov
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: codecov/codecov-action@ad3126e916f78f00edff4ed0317cf185271ccc2d # v5.4.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
flags: api
test-container-build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build Container
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.API_WORKING_DIR }}
push: false
tags: ${{ env.IMAGE_NAME }}:latest
outputs: type=docker
cache-from: type=gha
cache-to: type=gha,mode=max


@@ -1,47 +0,0 @@
name: Prowler - Automatic Backport
on:
pull_request_target:
branches: ['master']
types: ['labeled', 'closed']
env:
# The prefix of the label that triggers the backport must not contain the branch name
# so, for example, if the branch is 'master', the label should be 'backport-to-<branch>'
BACKPORT_LABEL_PREFIX: backport-to-
BACKPORT_LABEL_IGNORE: was-backported
jobs:
backport:
name: Backport PR
if: github.event.pull_request.merged == true && !(contains(github.event.pull_request.labels.*.name, 'backport')) && !(contains(github.event.pull_request.labels.*.name, 'was-backported'))
runs-on: ubuntu-latest
permissions:
id-token: write
pull-requests: write
contents: write
steps:
- name: Check labels
id: preview_label_check
uses: agilepathway/label-checker@c3d16ad512e7cea5961df85ff2486bb774caf3c5 # v1.6.65
with:
allow_failure: true
prefix_mode: true
any_of: ${{ env.BACKPORT_LABEL_PREFIX }}
none_of: ${{ env.BACKPORT_LABEL_IGNORE }}
repo_token: ${{ secrets.GITHUB_TOKEN }}
- name: Backport Action
if: steps.preview_label_check.outputs.label_check == 'success'
uses: sorenlouv/backport-github-action@ad888e978060bc1b2798690dd9d03c4036560947 # v9.5.1
with:
github_token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
auto_backport_label_prefix: ${{ env.BACKPORT_LABEL_PREFIX }}
- name: Info log
if: ${{ success() && steps.preview_label_check.outputs.label_check == 'success' }}
run: cat ~/.backport/backport.info.log
- name: Debug log
if: ${{ failure() && steps.preview_label_check.outputs.label_check == 'success' }}
run: cat ~/.backport/backport.debug.log


@@ -1,24 +0,0 @@
name: Prowler - Pull Request Documentation Link
on:
pull_request:
branches:
- 'master'
- 'v3'
paths:
- 'docs/**'
env:
PR_NUMBER: ${{ github.event.pull_request.number }}
jobs:
documentation-link:
name: Documentation Link
runs-on: ubuntu-latest
steps:
- name: Leave PR comment with the Prowler Documentation URI
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
issue-number: ${{ env.PR_NUMBER }}
body: |
You can check the documentation for this PR here -> [Prowler Documentation](https://prowler-prowler-docs--${{ env.PR_NUMBER }}.com.readthedocs.build/projects/prowler-open-source/en/${{ env.PR_NUMBER }}/)


@@ -0,0 +1,209 @@
name: build-lint-push-containers
on:
push:
branches:
- 'master'
paths-ignore:
- '.github/**'
- 'README.md'
release:
types: [published, edited]
env:
AWS_REGION_STG: eu-west-1
AWS_REGION_PLATFORM: eu-west-1
AWS_REGION_PRO: us-east-1
IMAGE_NAME: prowler
LATEST_TAG: latest
STABLE_TAG: stable
TEMPORARY_TAG: temporary
DOCKERFILE_PATH: ./Dockerfile
jobs:
# Lint Dockerfile using Hadolint
# dockerfile-linter:
# runs-on: ubuntu-latest
# steps:
# -
# name: Checkout
# uses: actions/checkout@v3
# -
# name: Install Hadolint
# run: |
# VERSION=$(curl --silent "https://api.github.com/repos/hadolint/hadolint/releases/latest" | \
# grep '"tag_name":' | \
# sed -E 's/.*"v([^"]+)".*/\1/' \
# ) && curl -L -o /tmp/hadolint https://github.com/hadolint/hadolint/releases/download/v${VERSION}/hadolint-Linux-x86_64 \
# && chmod +x /tmp/hadolint
# -
# name: Run Hadolint
# run: |
# /tmp/hadolint util/Dockerfile
# Build Prowler OSS container
container-build:
# needs: dockerfile-linter
runs-on: ubuntu-latest
steps:
-
name: Checkout
uses: actions/checkout@v3
-
name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
-
name: Build
uses: docker/build-push-action@v2
with:
# Without pushing to registries
push: false
tags: ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }}
file: ${{ env.DOCKERFILE_PATH }}
outputs: type=docker,dest=/tmp/${{ env.IMAGE_NAME }}.tar
-
name: Share image between jobs
uses: actions/upload-artifact@v2
with:
name: ${{ env.IMAGE_NAME }}.tar
path: /tmp/${{ env.IMAGE_NAME }}.tar
# Lint Prowler OSS container using Dockle
# container-linter:
# needs: container-build
# runs-on: ubuntu-latest
# steps:
# -
# name: Get container image from shared
# uses: actions/download-artifact@v2
# with:
# name: ${{ env.IMAGE_NAME }}.tar
# path: /tmp
# -
# name: Load Docker image
# run: |
# docker load --input /tmp/${{ env.IMAGE_NAME }}.tar
# docker image ls -a
# -
# name: Install Dockle
# run: |
# VERSION=$(curl --silent "https://api.github.com/repos/goodwithtech/dockle/releases/latest" | \
# grep '"tag_name":' | \
# sed -E 's/.*"v([^"]+)".*/\1/' \
# ) && curl -L -o dockle.deb https://github.com/goodwithtech/dockle/releases/download/v${VERSION}/dockle_${VERSION}_Linux-64bit.deb \
# && sudo dpkg -i dockle.deb && rm dockle.deb
# -
# name: Run Dockle
# run: dockle ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }}
# Push Prowler OSS container to registries
container-push:
# needs: container-linter
needs: container-build
runs-on: ubuntu-latest
permissions:
id-token: write
contents: read # This is required for actions/checkout
steps:
-
name: Get container image from shared
uses: actions/download-artifact@v2
with:
name: ${{ env.IMAGE_NAME }}.tar
path: /tmp
-
name: Load Docker image
run: |
docker load --input /tmp/${{ env.IMAGE_NAME }}.tar
docker image ls -a
-
name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
-
name: Login to Public ECR
uses: docker/login-action@v2
with:
registry: public.ecr.aws
username: ${{ secrets.PUBLIC_ECR_AWS_ACCESS_KEY_ID }}
password: ${{ secrets.PUBLIC_ECR_AWS_SECRET_ACCESS_KEY }}
env:
AWS_REGION: ${{ env.AWS_REGION_PRO }}
-
name: Configure AWS Credentials -- STG
if: github.event_name == 'push'
uses: aws-actions/configure-aws-credentials@v1
with:
aws-region: ${{ env.AWS_REGION_STG }}
role-to-assume: ${{ secrets.STG_IAM_ROLE_ARN }}
role-session-name: build-lint-containers-stg
-
name: Login to ECR -- STG
if: github.event_name == 'push'
uses: docker/login-action@v2
with:
registry: ${{ secrets.STG_ECR }}
-
name: Configure AWS Credentials -- PLATFORM
if: github.event_name == 'release'
uses: aws-actions/configure-aws-credentials@v1
with:
aws-region: ${{ env.AWS_REGION_PLATFORM }}
role-to-assume: ${{ secrets.STG_IAM_ROLE_ARN }}
role-session-name: build-lint-containers-pro
-
name: Login to ECR -- PLATFORM
if: github.event_name == 'release'
uses: docker/login-action@v2
with:
registry: ${{ secrets.PLATFORM_ECR }}
-
# Push to master branch - tag image as "latest"
name: Tag (latest)
if: github.event_name == 'push'
run: |
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PLATFORM_ECR }}/${{ secrets.PLATFORM_ECR_REPOSITORY }}:${{ env.LATEST_TAG }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
-
# Push to master branch - push "latest" tag
name: Push (latest)
if: github.event_name == 'push'
run: |
docker push ${{ secrets.PLATFORM_ECR }}/${{ secrets.PLATFORM_ECR_REPOSITORY }}:${{ env.LATEST_TAG }}
docker push ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
docker push ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
-
# Tag the new release (stable and release tag)
name: Tag (release)
if: github.event_name == 'release'
run: |
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PLATFORM_ECR }}/${{ secrets.PLATFORM_ECR_REPOSITORY }}:${{ github.event.release.tag_name }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PLATFORM_ECR }}/${{ secrets.PLATFORM_ECR_REPOSITORY }}:${{ env.STABLE_TAG }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
-
# Push the new release (stable and release tag)
name: Push (release)
if: github.event_name == 'release'
run: |
docker push ${{ secrets.PLATFORM_ECR }}/${{ secrets.PLATFORM_ECR_REPOSITORY }}:${{ github.event.release.tag_name }}
docker push ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
docker push ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
docker push ${{ secrets.PLATFORM_ECR }}/${{ secrets.PLATFORM_ECR_REPOSITORY }}:${{ env.STABLE_TAG }}
docker push ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
docker push ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
-
name: Delete artifacts
if: always()
uses: geekyeggo/delete-artifact@v1
with:
name: ${{ env.IMAGE_NAME }}.tar
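For reference, the artifact hand-off above is the same image-tarball round trip you can reproduce locally with `docker save`/`docker load`; a minimal sketch, assuming a locally built image named `prowler:temporary` (the name is illustrative only):
```console
# Export the image to a tarball (equivalent to the buildx `type=docker,dest=...` output above)
docker save -o /tmp/prowler.tar prowler:temporary
# Re-import it elsewhere, exactly as the push job does after downloading the artifact
docker load --input /tmp/prowler.tar
docker image ls -a
```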


@@ -1,23 +0,0 @@
name: Prowler - Conventional Commit
on:
pull_request:
types:
- "opened"
- "edited"
- "synchronize"
branches:
- "master"
- "v3"
- "v4.*"
- "v5.*"
jobs:
conventional-commit-check:
runs-on: ubuntu-latest
steps:
- name: conventional-commit-check
id: conventional-commit-check
uses: agenthunt/conventional-commit-checker-action@9e552d650d0e205553ec7792d447929fc78e012b # v2.0.0
with:
pr-title-regex: '^([^\s(]+)(?:\(([^)]+)\))?: (.+)'


@@ -1,4 +1,4 @@
name: Prowler - Find secrets
name: find-secrets
on: pull_request
@@ -7,13 +7,12 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v3
with:
fetch-depth: 0
- name: TruffleHog OSS
uses: trufflesecurity/trufflehog@690e5c7aff8347c3885096f3962a0633d9129607 # v3.88.23
uses: trufflesecurity/trufflehog@v3.13.0
with:
path: ./
base: ${{ github.event.repository.default_branch }}
head: HEAD
extra_args: --only-verified


@@ -1,17 +0,0 @@
name: Prowler - PR Labeler
on:
pull_request_target:
branches:
- "master"
- "v3"
- "v4.*"
jobs:
labeler:
permissions:
contents: read
pull-requests: write
runs-on: ubuntu-latest
steps:
- uses: actions/labeler@8558fd74291d67161a8a78ce36a881fa63b766a9 # v5.0.0


@@ -1,34 +0,0 @@
name: Prowler - Merged Pull Request
on:
pull_request_target:
branches: ['master']
types: ['closed']
jobs:
trigger-cloud-pull-request:
name: Trigger Cloud Pull Request
if: github.event.pull_request.merged == true && github.repository == 'prowler-cloud/prowler'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set short git commit SHA
id: vars
run: |
shortSha=$(git rev-parse --short ${{ github.sha }})
echo "SHORT_SHA=${shortSha}" >> $GITHUB_ENV
- name: Trigger pull request
uses: peter-evans/repository-dispatch@ff45666b9427631e3450c54a1bcbee4d9ff4d7c0 # v3.0.0
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
repository: ${{ secrets.CLOUD_DISPATCH }}
event-type: prowler-pull-request-merged
client-payload: '{
"PROWLER_COMMIT_SHA": "${{ github.sha }}",
"PROWLER_COMMIT_SHORT_SHA": "${{ env.SHORT_SHA }}",
"PROWLER_PR_TITLE": "${{ github.event.pull_request.title }}",
"PROWLER_PR_LABELS": ${{ toJson(github.event.pull_request.labels.*.name) }},
"PROWLER_PR_BODY": ${{ toJson(github.event.pull_request.body) }}
}'

.github/workflows/pull-request.yml

@@ -0,0 +1,41 @@
name: Lint & Test
on:
push:
branches:
- 'prowler-3.0-dev'
pull_request:
branches:
- 'prowler-3.0-dev'
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.9"]
steps:
- uses: actions/checkout@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install pipenv
pipenv install
- name: Bandit
run: |
pipenv run bandit -q -lll -x '*_test.py,./contrib/' -r .
- name: Safety
run: |
pipenv run safety check
- name: Vulture
run: |
pipenv run vulture --exclude "contrib" --min-confidence 100 .
- name: Test with pytest
run: |
pipenv run pytest -n auto


@@ -0,0 +1,50 @@
# This is a basic workflow to help you get started with Actions
name: Refresh regions of AWS services
on:
schedule:
- cron: "0 9 * * *" #runs at 09:00 UTC everyday
env:
GITHUB_BRANCH: "prowler-3.0-dev"
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
# This workflow contains a single job called "build"
build:
# The type of runner that the job will run on
runs-on: ubuntu-latest
# Steps represent a sequence of tasks that will be executed as part of the job
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v3
with:
ref: ${{ env.GITHUB_BRANCH }}
- name: setup python
uses: actions/setup-python@v2
with:
python-version: 3.9 # install the required Python version
# Runs a single command using the runners shell
- name: Run a one-line script
run: python3 util/update_aws_services_regions.py
# Create pull request
- name: Create Pull Request
uses: peter-evans/create-pull-request@v4
with:
token: ${{ secrets.GITHUB_TOKEN }}
commit-message: "feat(regions_update): Update regions for AWS services."
branch: "aws-services-regions-updated"
labels: "status/waiting-for-revision, severity/low"
title: "feat(regions_update): Changes in regions for AWS services."
body: |
### Description
This PR updates the regions for AWS services.
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.


@@ -1,186 +0,0 @@
name: SDK - Build and Push containers
on:
push:
branches:
# For `v3-latest`
- "v3"
# For `v4-latest`
- "v4.6"
# For `latest`
- "master"
paths-ignore:
- ".github/**"
- "README.md"
- "docs/**"
- "ui/**"
- "api/**"
release:
types: [published]
env:
# AWS Configuration
AWS_REGION_STG: eu-west-1
AWS_REGION_PLATFORM: eu-west-1
AWS_REGION: us-east-1
# Container's configuration
IMAGE_NAME: prowler
DOCKERFILE_PATH: ./Dockerfile
# Tags
LATEST_TAG: latest
STABLE_TAG: stable
# The RELEASE_TAG is set during runtime in releases
RELEASE_TAG: ""
# The PROWLER_VERSION and PROWLER_VERSION_MAJOR are set during runtime in releases
PROWLER_VERSION: ""
PROWLER_VERSION_MAJOR: ""
# TEMPORARY_TAG: temporary
# Python configuration
PYTHON_VERSION: 3.12
# Container Registries
PROWLERCLOUD_DOCKERHUB_REPOSITORY: prowlercloud
PROWLERCLOUD_DOCKERHUB_IMAGE: prowler
jobs:
# Build Prowler OSS container
container-build-push:
# needs: dockerfile-linter
runs-on: ubuntu-latest
outputs:
prowler_version_major: ${{ steps.get-prowler-version.outputs.PROWLER_VERSION_MAJOR }}
prowler_version: ${{ steps.get-prowler-version.outputs.PROWLER_VERSION }}
env:
POETRY_VIRTUALENVS_CREATE: "false"
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Setup Python
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ env.PYTHON_VERSION }}
- name: Install Poetry
run: |
pipx install poetry==2.*
pipx inject poetry poetry-bumpversion
- name: Get Prowler version
id: get-prowler-version
run: |
PROWLER_VERSION="$(poetry version -s 2>/dev/null)"
echo "PROWLER_VERSION=${PROWLER_VERSION}" >> "${GITHUB_ENV}"
echo "PROWLER_VERSION=${PROWLER_VERSION}" >> "${GITHUB_OUTPUT}"
# Store prowler version major just for the release
PROWLER_VERSION_MAJOR="${PROWLER_VERSION%%.*}"
echo "PROWLER_VERSION_MAJOR=${PROWLER_VERSION_MAJOR}" >> "${GITHUB_ENV}"
echo "PROWLER_VERSION_MAJOR=${PROWLER_VERSION_MAJOR}" >> "${GITHUB_OUTPUT}"
case ${PROWLER_VERSION_MAJOR} in
3)
echo "LATEST_TAG=v3-latest" >> "${GITHUB_ENV}"
echo "STABLE_TAG=v3-stable" >> "${GITHUB_ENV}"
;;
4)
echo "LATEST_TAG=v4-latest" >> "${GITHUB_ENV}"
echo "STABLE_TAG=v4-stable" >> "${GITHUB_ENV}"
;;
5)
echo "LATEST_TAG=latest" >> "${GITHUB_ENV}"
echo "STABLE_TAG=stable" >> "${GITHUB_ENV}"
;;
*)
# Fallback if any other version is present
echo "Releasing another Prowler major version, aborting..."
exit 1
;;
esac
- name: Login to DockerHub
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to Public ECR
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
registry: public.ecr.aws
username: ${{ secrets.PUBLIC_ECR_AWS_ACCESS_KEY_ID }}
password: ${{ secrets.PUBLIC_ECR_AWS_SECRET_ACCESS_KEY }}
env:
AWS_REGION: ${{ env.AWS_REGION }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build and push container image (latest)
if: github.event_name == 'push'
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
push: true
tags: |
${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.LATEST_TAG }}
file: ${{ env.DOCKERFILE_PATH }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
# Use local context to get changes
# https://github.com/docker/build-push-action#path-context
context: .
push: true
tags: |
${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.PROWLER_VERSION }}
${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.PROWLER_VERSION }}
${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.PROWLER_VERSION }}
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.STABLE_TAG }}
file: ${{ env.DOCKERFILE_PATH }}
cache-from: type=gha
cache-to: type=gha,mode=max
dispatch-action:
needs: container-build-push
runs-on: ubuntu-latest
steps:
- name: Get latest commit info (latest)
if: github.event_name == 'push'
run: |
LATEST_COMMIT_HASH=$(echo ${{ github.event.after }} | cut -b -7)
echo "LATEST_COMMIT_HASH=${LATEST_COMMIT_HASH}" >> $GITHUB_ENV
- name: Dispatch event (latest)
if: github.event_name == 'push' && needs.container-build-push.outputs.prowler_version_major == '3'
run: |
curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches \
-H "Accept: application/vnd.github+json" \
-H "Authorization: Bearer ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}" \
-H "X-GitHub-Api-Version: 2022-11-28" \
--data '{"event_type":"dispatch","client_payload":{"version":"v3-latest", "tag": "${{ env.LATEST_COMMIT_HASH }}"}}'
- name: Dispatch event (release)
if: github.event_name == 'release' && needs.container-build-push.outputs.prowler_version_major == '3'
run: |
curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches \
-H "Accept: application/vnd.github+json" \
-H "Authorization: Bearer ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}" \
-H "X-GitHub-Api-Version: 2022-11-28" \
--data '{"event_type":"dispatch","client_payload":{"version":"release", "tag":"${{ needs.container-build-push.outputs.prowler_version }}"}}'


@@ -1,67 +0,0 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: SDK - CodeQL
on:
push:
branches:
- "master"
- "v3"
- "v4.*"
- "v5.*"
paths-ignore:
- 'ui/**'
- 'api/**'
- '.github/**'
pull_request:
branches:
- "master"
- "v3"
- "v4.*"
- "v5.*"
paths-ignore:
- 'ui/**'
- 'api/**'
- '.github/**'
schedule:
- cron: '00 12 * * *'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write
strategy:
fail-fast: false
matrix:
language: [ 'python' ]
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/sdk-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
category: "/language:${{matrix.language}}"


@@ -1,121 +0,0 @@
name: SDK - Pull Request
on:
push:
branches:
- "master"
- "v3"
- "v4.*"
- "v5.*"
pull_request:
branches:
- "master"
- "v3"
- "v4.*"
- "v5.*"
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.9", "3.10", "3.11", "3.12"]
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Test if changes are in not ignored paths
id: are-non-ignored-files-changed
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: ./**
files_ignore: |
.github/**
docs/**
permissions/**
api/**
ui/**
prowler/CHANGELOG.md
README.md
mkdocs.yml
.backportrc.json
.env
docker-compose*
examples/**
.gitignore
- name: Install poetry
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
python -m pip install --upgrade pip
pipx install poetry==2.1.1
- name: Set up Python ${{ matrix.python-version }}
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ matrix.python-version }}
cache: "poetry"
- name: Install dependencies
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry install --no-root
poetry run pip list
VERSION=$(curl --silent "https://api.github.com/repos/hadolint/hadolint/releases/latest" | \
grep '"tag_name":' | \
sed -E 's/.*"v([^"]+)".*/\1/' \
) && curl -L -o /tmp/hadolint "https://github.com/hadolint/hadolint/releases/download/v${VERSION}/hadolint-Linux-x86_64" \
&& chmod +x /tmp/hadolint
- name: Poetry check
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry check --lock
- name: Lint with flake8
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude contrib,ui,api
- name: Checking format with black
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run black --exclude api ui --check .
- name: Lint with pylint
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pylint --disable=W,C,R,E -j 0 -rn -sn prowler/
- name: Bandit
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run bandit -q -lll -x '*_test.py,./contrib/,./api/,./ui' -r .
- name: Safety
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run safety check --ignore 70612 -r pyproject.toml
- name: Vulture
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run vulture --exclude "contrib,api,ui" --min-confidence 100 .
- name: Hadolint
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
/tmp/hadolint Dockerfile --ignore=DL3013
- name: Test with pytest
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler --cov-report=xml tests
- name: Upload coverage reports to Codecov
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: codecov/codecov-action@ad3126e916f78f00edff4ed0317cf185271ccc2d # v5.4.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
flags: prowler


@@ -1,98 +0,0 @@
name: SDK - PyPI release
on:
release:
types: [published]
env:
RELEASE_TAG: ${{ github.event.release.tag_name }}
PYTHON_VERSION: 3.11
# CACHE: "poetry"
jobs:
repository-check:
name: Repository check
runs-on: ubuntu-latest
outputs:
is_repo: ${{ steps.repository_check.outputs.is_repo }}
steps:
- name: Repository check
id: repository_check
working-directory: /tmp
run: |
if [[ ${{ github.repository }} == "prowler-cloud/prowler" ]]
then
echo "is_repo=true" >> "${GITHUB_OUTPUT}"
else
echo "This action only runs for prowler-cloud/prowler"
echo "is_repo=false" >> "${GITHUB_OUTPUT}"
fi
release-prowler-job:
runs-on: ubuntu-latest
needs: repository-check
if: needs.repository-check.outputs.is_repo == 'true'
env:
POETRY_VIRTUALENVS_CREATE: "false"
name: Release Prowler to PyPI
steps:
- name: Repository check
working-directory: /tmp
run: |
if [[ "${{ github.repository }}" != "prowler-cloud/prowler" ]]; then
echo "This action only runs for prowler-cloud/prowler"
exit 1
fi
- name: Get Prowler version
run: |
PROWLER_VERSION="${{ env.RELEASE_TAG }}"
case ${PROWLER_VERSION%%.*} in
3)
echo "Releasing Prowler v3 with tag ${PROWLER_VERSION}"
;;
4)
echo "Releasing Prowler v4 with tag ${PROWLER_VERSION}"
;;
5)
echo "Releasing Prowler v5 with tag ${PROWLER_VERSION}"
;;
*)
echo "Releasing another Prowler major version, aborting..."
exit 1
;;
esac
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install dependencies
run: |
pipx install poetry==2.1.1
- name: Setup Python
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ env.PYTHON_VERSION }}
# cache: ${{ env.CACHE }}
- name: Build Prowler package
run: |
poetry build
- name: Publish Prowler package to PyPI
run: |
poetry config pypi-token.pypi ${{ secrets.PYPI_API_TOKEN }}
poetry publish
- name: Replicate PyPI package
run: |
rm -rf ./dist && rm -rf ./build && rm -rf prowler.egg-info
pip install toml
python util/replicate_pypi_package.py
poetry build
- name: Publish prowler-cloud package to PyPI
run: |
poetry config pypi-token.pypi ${{ secrets.PYPI_API_TOKEN }}
poetry publish


@@ -1,68 +0,0 @@
# This is a basic workflow to help you get started with Actions
name: SDK - Refresh AWS services' regions
on:
schedule:
- cron: "0 9 * * 1" # runs at 09:00 UTC every Monday
env:
GITHUB_BRANCH: "master"
AWS_REGION_DEV: us-east-1
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
# This workflow contains a single job called "build"
build:
# The type of runner that the job will run on
runs-on: ubuntu-latest
permissions:
id-token: write
pull-requests: write
contents: write
# Steps represent a sequence of tasks that will be executed as part of the job
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
ref: ${{ env.GITHUB_BRANCH }}
- name: setup python
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: 3.9 # install the required Python version
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install boto3
- name: Configure AWS Credentials -- DEV
uses: aws-actions/configure-aws-credentials@ececac1a45f3b08a01d2dd070d28d111c5fe6722 # v4.1.0
with:
aws-region: ${{ env.AWS_REGION_DEV }}
role-to-assume: ${{ secrets.DEV_IAM_ROLE_ARN }}
role-session-name: refresh-AWS-regions-dev
# Runs a single command using the runners shell
- name: Run a one-line script
run: python3 util/update_aws_services_regions.py
# Create pull request
- name: Create Pull Request
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
commit-message: "feat(regions_update): Update regions for AWS services"
branch: "aws-services-regions-updated-${{ github.sha }}"
labels: "status/waiting-for-revision, severity/low, provider/aws"
title: "chore(regions_update): Changes in regions for AWS services"
body: |
### Description
This PR updates the regions for AWS services.
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.


@@ -1,118 +0,0 @@
name: UI - Build and Push containers
on:
push:
branches:
- "master"
paths:
- "ui/**"
- ".github/workflows/ui-build-lint-push-containers.yml"
# Uncomment the below code to test this action on PRs
# pull_request:
# branches:
# - "master"
# paths:
# - "ui/**"
# - ".github/workflows/ui-build-lint-push-containers.yml"
release:
types: [published]
env:
# Tags
LATEST_TAG: latest
RELEASE_TAG: ${{ github.event.release.tag_name }}
STABLE_TAG: stable
WORKING_DIRECTORY: ./ui
# Container Registries
PROWLERCLOUD_DOCKERHUB_REPOSITORY: prowlercloud
PROWLERCLOUD_DOCKERHUB_IMAGE: prowler-ui
jobs:
repository-check:
name: Repository check
runs-on: ubuntu-latest
outputs:
is_repo: ${{ steps.repository_check.outputs.is_repo }}
steps:
- name: Repository check
id: repository_check
working-directory: /tmp
run: |
if [[ ${{ github.repository }} == "prowler-cloud/prowler" ]]
then
echo "is_repo=true" >> "${GITHUB_OUTPUT}"
else
echo "This action only runs for prowler-cloud/prowler"
echo "is_repo=false" >> "${GITHUB_OUTPUT}"
fi
# Build Prowler OSS container
container-build-push:
needs: repository-check
if: needs.repository-check.outputs.is_repo == 'true'
runs-on: ubuntu-latest
defaults:
run:
working-directory: ${{ env.WORKING_DIRECTORY }}
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set short git commit SHA
id: vars
run: |
shortSha=$(git rev-parse --short ${{ github.sha }})
echo "SHORT_SHA=${shortSha}" >> $GITHUB_ENV
- name: Login to DockerHub
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build and push container image (latest)
# Comment the following line for testing
if: github.event_name == 'push'
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=${{ env.SHORT_SHA }}
# Set push: false for testing
push: true
tags: |
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.LATEST_TAG }}
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.SHORT_SHA }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${{ env.RELEASE_TAG }}
push: true
tags: |
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.RELEASE_TAG }}
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.STABLE_TAG }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Trigger deployment
if: github.event_name == 'push'
uses: peter-evans/repository-dispatch@ff45666b9427631e3450c54a1bcbee4d9ff4d7c0 # v3.0.0
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
repository: ${{ secrets.CLOUD_DISPATCH }}
event-type: prowler-ui-deploy
client-payload: '{"sha": "${{ github.sha }}", "short_sha": "${{ env.SHORT_SHA }}"}'


@@ -1,59 +0,0 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: UI - CodeQL
on:
push:
branches:
- "master"
- "v5.*"
paths:
- "ui/**"
pull_request:
branches:
- "master"
- "v5.*"
paths:
- "ui/**"
schedule:
- cron: "00 12 * * *"
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write
strategy:
fail-fast: false
matrix:
language: ["javascript"]
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/ui-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
category: "/language:${{matrix.language}}"


@@ -1,62 +0,0 @@
name: UI - Pull Request
on:
push:
branches:
- "master"
- "v5.*"
paths:
- ".github/workflows/ui-pull-request.yml"
- "ui/**"
pull_request:
branches:
- master
- "v5.*"
paths:
- 'ui/**'
env:
UI_WORKING_DIR: ./ui
IMAGE_NAME: prowler-ui
jobs:
test-and-coverage:
runs-on: ubuntu-latest
strategy:
matrix:
os: [ubuntu-latest]
node-version: [20.x]
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- name: Setup Node.js ${{ matrix.node-version }}
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
node-version: ${{ matrix.node-version }}
- name: Install dependencies
working-directory: ./ui
run: npm install
- name: Run Healthcheck
working-directory: ./ui
run: npm run healthcheck
- name: Build the application
working-directory: ./ui
run: npm run build
test-container-build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build Container
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.UI_WORKING_DIR }}
# Always build using `prod` target
target: prod
push: false
tags: ${{ env.IMAGE_NAME }}:latest
outputs: type=docker
build-args: |
NEXT_PUBLIC_STRIPE_PUBLISHABLE_KEY=pk_test_51LwpXXXX

.gitignore

@@ -5,15 +5,6 @@
[._]ss[a-gi-z]
[._]sw[a-p]
# Python code
__pycache__
venv/
build/
/dist/
*.egg-info/
*/__pycache__/*.pyc
.idea/
# Session
Session.vim
Sessionx.vim
@@ -31,7 +22,7 @@ tags
*.DS_Store
# Prowler output
/output
output/
# Prowler found secrets
secrets-*/
@@ -42,23 +33,8 @@ junit-reports/
# VSCode files
.vscode/
# Terraform
.terraform*
*.tfstate
*.tfstate.*
terraform-kickstarter/.terraform.lock.hcl
# .env
ui/.env*
api/.env*
.env.local
terraform-kickstarter/.terraform/providers/registry.terraform.io/hashicorp/aws/3.56.0/darwin_amd64/terraform-provider-aws_v3.56.0_x5
# Coverage
.coverage*
.coverage
coverage*
# Node
node_modules
# Persistent data
_data/
terraform-kickstarter/terraform.tfstate


@@ -1,126 +1,29 @@
repos:
## GENERAL
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.6.0
rev: v3.3.0
hooks:
- id: check-merge-conflict
- id: check-yaml
args: ["--unsafe"]
args: ['--unsafe']
- id: check-json
- id: end-of-file-fixer
- id: trailing-whitespace
exclude: 'README.md'
- id: no-commit-to-branch
- id: pretty-format-json
args: ["--autofix", --no-sort-keys, --no-ensure-ascii]
args: ['--autofix']
## TOML
- repo: https://github.com/macisamuele/language-formatters-pre-commit-hooks
rev: v2.13.0
hooks:
- id: pretty-format-toml
args: [--autofix]
files: pyproject.toml
## BASH
- repo: https://github.com/koalaman/shellcheck-precommit
rev: v0.10.0
rev: v0.8.0
hooks:
- id: shellcheck
exclude: contrib
## PYTHON
- repo: https://github.com/myint/autoflake
rev: v2.3.1
hooks:
- id: autoflake
args:
[
"--in-place",
"--remove-all-unused-imports",
"--remove-unused-variable",
]
- repo: https://github.com/timothycrosley/isort
rev: 5.13.2
hooks:
- id: isort
args: ["--profile", "black"]
- repo: https://github.com/psf/black
rev: 24.4.2
hooks:
- id: black
- repo: https://github.com/pycqa/flake8
rev: 7.0.0
hooks:
- id: flake8
exclude: contrib
args: ["--ignore=E266,W503,E203,E501,W605"]
- repo: https://github.com/python-poetry/poetry
rev: 2.1.1
hooks:
- id: poetry-check
name: API - poetry-check
args: ["--directory=./api"]
pass_filenames: false
- id: poetry-lock
name: API - poetry-lock
args: ["--directory=./api"]
pass_filenames: false
- id: poetry-check
name: SDK - poetry-check
args: ["--directory=./"]
pass_filenames: false
- id: poetry-lock
name: SDK - poetry-lock
args: ["--directory=./"]
pass_filenames: false
- id: shellcheck
- repo: https://github.com/hadolint/hadolint
rev: v2.13.0-beta
rev: v2.10.0
hooks:
- id: hadolint
args: ["--ignore=DL3013"]
- repo: local
hooks:
- id: pylint
name: pylint
entry: bash -c 'pylint --disable=W,C,R,E -j 0 -rn -sn prowler/'
name: Lint Dockerfiles
description: Runs hadolint to lint Dockerfiles
language: system
files: '.*\.py'
- id: trufflehog
name: TruffleHog
description: Detect secrets in your data.
entry: bash -c 'trufflehog --no-update git file://. --only-verified --fail'
# For running trufflehog in docker, use the following entry instead:
# entry: bash -c 'docker run -v "$(pwd):/workdir" -i --rm trufflesecurity/trufflehog:latest git file:///workdir --only-verified --fail'
language: system
stages: ["pre-commit", "pre-push"]
- id: bandit
name: bandit
description: "Bandit is a tool for finding common security issues in Python code"
entry: bash -c 'bandit -q -lll -x '*_test.py,./contrib/,./.venv/' -r .'
language: system
files: '.*\.py'
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'safety check --ignore 70612,66963'
language: system
- id: vulture
name: vulture
description: "Vulture finds unused code in Python programs."
entry: bash -c 'vulture --exclude "contrib,.venv,api/src/backend/api/tests/,api/src/backend/conftest.py,api/src/backend/tasks/tests/" --min-confidence 100 .'
language: system
files: '.*\.py'
types: ["dockerfile"]
entry: hadolint


@@ -1,25 +0,0 @@
# .readthedocs.yaml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
build:
os: "ubuntu-22.04"
tools:
python: "3.11"
jobs:
post_create_environment:
# Install poetry
# https://python-poetry.org/docs/#installing-manually
- python -m pip install poetry
post_install:
# Install dependencies with 'docs' dependency group
# https://python-poetry.org/docs/managing-dependencies/#dependency-groups
# VIRTUAL_ENV needs to be set manually for now.
# See https://github.com/readthedocs/readthedocs.org/pull/11152/
- VIRTUAL_ENV=${READTHEDOCS_VIRTUALENV_PATH} python -m poetry install --only=docs
mkdocs:
configuration: mkdocs.yml


@@ -55,7 +55,7 @@ further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at [support.prowler.com](https://customer.support.prowler.com/servicedesk/customer/portals). All
reported by contacting the project team at community@prowler.cloud. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.


@@ -1,13 +0,0 @@
# Do you want to learn how to...
- Contribute with your code or fixes to Prowler
- Create a new check for a provider
- Create a new security compliance framework
- Add a custom output format
- Add a new integration
- Contribute with documentation
Want some swag as appreciation for your contribution?
# Prowler Developer Guide
https://docs.prowler.com/projects/prowler-open-source/en/latest/developer-guide/introduction/


@@ -1,42 +1,64 @@
FROM python:3.12.10-alpine3.20
# Build command
# docker build --platform=linux/amd64 --no-cache -t prowler:latest -f ./Dockerfile .
# hadolint ignore=DL3007
FROM public.ecr.aws/amazonlinux/amazonlinux:latest
LABEL maintainer="https://github.com/prowler-cloud/prowler"
# Update system dependencies and install essential tools
#hadolint ignore=DL3018
RUN apk --no-cache upgrade && apk --no-cache add curl git gcc python3-dev musl-dev linux-headers
ARG USERNAME=prowler
ARG USERID=34000
# Prepare image as root
USER 0
# System dependencies
# hadolint ignore=DL3006,DL3013,DL3033
RUN yum upgrade -y && \
yum install -y python3 bash curl jq coreutils py3-pip which unzip shadow-utils && \
yum clean all && \
rm -rf /var/cache/yum
RUN amazon-linux-extras install -y epel postgresql14 && \
yum clean all && \
rm -rf /var/cache/yum
# Create non-root user
RUN mkdir -p /home/prowler && \
echo 'prowler:x:1000:1000:prowler:/home/prowler:' > /etc/passwd && \
echo 'prowler:x:1000:' > /etc/group && \
chown -R prowler:prowler /home/prowler
USER prowler
RUN useradd -l -s /bin/bash -U -u ${USERID} ${USERNAME}
# Copy necessary files
WORKDIR /home/prowler
COPY prowler/ /home/prowler/prowler/
COPY dashboard/ /home/prowler/dashboard/
COPY pyproject.toml /home/prowler
COPY README.md /home/prowler/
USER ${USERNAME}
# Install Python dependencies
ENV HOME='/home/prowler'
ENV PATH="${HOME}/.local/bin:${PATH}"
#hadolint ignore=DL3013
RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir poetry
# Python dependencies
# hadolint ignore=DL3006,DL3013,DL3042
RUN pip3 install --upgrade pip && \
pip3 install --no-cache-dir boto3 detect-secrets==1.0.3 && \
pip3 cache purge
# Set Python PATH
ENV PATH="/home/${USERNAME}/.local/bin:${PATH}"
# By default poetry does not compile Python source files to bytecode during installation.
# This speeds up the installation process, but the first execution may take a little more
# time because Python then compiles source files to bytecode automatically. If you want to
# compile source files to bytecode during installation, you can use the --compile option
RUN poetry install --compile && \
rm -rf ~/.cache/pip
USER 0
# Remove deprecated dash dependencies
RUN pip uninstall dash-html-components -y && \
pip uninstall dash-core-components -y
# Install AWS CLI
RUN curl https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip -o awscliv2.zip && \
unzip -q awscliv2.zip && \
aws/install && \
rm -rf aws awscliv2.zip
USER prowler
ENTRYPOINT ["poetry", "run", "prowler"]
# Keep Python2 for yum
RUN sed -i '1 s/python/python2.7/' /usr/bin/yum
# Set Python3
RUN rm /usr/bin/python && \
ln -s /usr/bin/python3 /usr/bin/python
# Set working directory
WORKDIR /prowler
# Copy all files
COPY . ./
# Set files ownership
RUN chown -R prowler .
USER ${USERNAME}
ENTRYPOINT ["./prowler"]


@@ -186,7 +186,7 @@
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright @ 2024 Toni de la Fuente
Copyright 2018 Netflix, Inc.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
@@ -198,4 +198,4 @@
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
limitations under the License.


@@ -0,0 +1,5 @@
```
./prowler -l # to see all available checks and their groups.
./prowler -L # to see all available groups only.
./prowler -l -g groupname # to see checks in a particular group
```


@@ -1,47 +0,0 @@
.DEFAULT_GOAL:=help
##@ Testing
test: ## Test with pytest
rm -rf .coverage && \
pytest -n auto -vvv -s --cov=./prowler --cov-report=xml tests
coverage: ## Show Test Coverage
coverage run --skip-covered -m pytest -v && \
coverage report -m && \
rm -rf .coverage && \
coverage report -m
coverage-html: ## Show Test Coverage
rm -rf ./htmlcov && \
coverage html && \
open htmlcov/index.html
##@ Linting
format: ## Format Code
@echo "Running black..."
black .
lint: ## Lint Code
@echo "Running flake8..."
flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude contrib
@echo "Running black... "
black --check .
@echo "Running pylint..."
pylint --disable=W,C,R,E -j 0 prowler util
##@ PyPI
pypi-clean: ## Delete the distribution files
rm -rf ./dist && rm -rf ./build && rm -rf prowler.egg-info
pypi-build: ## Build package
$(MAKE) pypi-clean && \
poetry build
pypi-upload: ## Upload package
python3 -m twine upload --repository pypi dist/*
##@ Help
help: ## Show this help.
@echo "Prowler Makefile"
@awk 'BEGIN {FS = ":.*##"; printf "\nUsage:\n make \033[36m<target>\033[0m\n"} /^[a-zA-Z_-]+:.*?##/ { printf " \033[36m%-15s\033[0m %s\n", $$1, $$2 } /^##@/ { printf "\n\033[1m%s\033[0m\n", substr($$0, 5) } ' $(MAKEFILE_LIST)

Pipfile

@@ -0,0 +1,13 @@
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true
[dev-packages]
[packages]
boto3 = ">=1.9.188"
detect-secrets = "==1.0.3"
[requires]
python_version = "3.7"

README.md
File diff suppressed because it is too large


@@ -1,23 +0,0 @@
# Security Policy
## Software Security
As an **AWS Partner**, we have passed the [AWS Foundational Technical Review (FTR)](https://aws.amazon.com/partners/foundational-technical-review/), and we use the following tools and automation to make sure our code is secure and our dependencies are up to date:
- `bandit` for code security review.
- `safety` and `dependabot` for dependencies.
- `hadolint` and `dockle` for our containers security.
- `snyk` in Docker Hub.
- `clair` in Amazon ECR.
- `vulture`, `flake8`, `black` and `pylint` for formatting and best practices.
## Reporting a Vulnerability
If you would like to report a vulnerability or have a security concern regarding Prowler Open Source or the ProwlerPro service, please submit the information by contacting https://support.prowler.com.
The information you share with ProwlerPro as part of this process is kept confidential within ProwlerPro. We will only share this information with a third party if the vulnerability you report is found to affect a third-party product, in which case we will share this information with the third-party product's author or manufacturer. Otherwise, we will only share this information as permitted by you.
We will review the submitted report, and assign it a tracking number. We will then respond to you, acknowledging receipt of the report, and outline the next steps in the process.
You will receive a non-automated response to your initial contact within 24 hours, confirming receipt of your reported vulnerability.
We will coordinate public notification of any validated vulnerability with you. Where possible, we prefer that our respective public disclosures be posted simultaneously.

allowlist_example.txt

@@ -0,0 +1,29 @@
# Each line is a (checkid:item) tuple
# Example: Will not consider myignoredbucket failures as full failures. (Still printed as a warning)
check26:myignoredbucket
# Note that by default, this searches for the string appearing *anywhere* in the resource name.
# For example:
# extra718:ci-logs # Will block bucket "ci-logs" AND ALSO bucket "ci-logs-replica"
# extra718:logs # Will block EVERY BUCKET containing the string "logs"
# line starting with # are ignored as comments
# add a line per resource as here:
#<checkid1>:<resource to ignore 1>
#<checkid1>:<resource to ignore 2>
# checkid2
#<checkid2>:<resource to ignore 1>
# REGEXES
# This allowlist works with regexes (ERE, the same style of regex as grep -E and bash's =~ use)
# therefore:
# extra718:[[:alnum:]]+-logs # will ignore all buckets containing the terms ci-logs, qa-logs, etc.
# EXAMPLE: CONTROL TOWER
# When using Control Tower, guardrails prevent access to certain protected resources. The allowlist
# below ensures that warnings instead of errors are reported for the affected resources.
#extra734:aws-controltower-logs-[[:digit:]]+-[[:alpha:]\-]+
#extra734:aws-controltower-s3-access-logs-[[:digit:]]+-[[:alpha:]\-]+
#extra764:aws-controltower-logs-[[:digit:]]+-[[:alpha:]\-]+
#extra764:aws-controltower-s3-access-logs-[[:digit:]]+-[[:alpha:]\-]+
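As a quick illustration of the ERE style referenced in the comments above, the same kind of pattern can be tested with bash's `=~` (the bucket name below is just an example):
```console
# "ci-logs-replica" contains a substring matching the pattern, so it would be allowlisted
[[ "ci-logs-replica" =~ [[:alnum:]]+-logs ]] && echo "would be allowlisted"
```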


@@ -1,58 +0,0 @@
# Django settings
DJANGO_ALLOWED_HOSTS=localhost,127.0.0.1
DJANGO_BIND_ADDRESS=0.0.0.0
DJANGO_PORT=8000
DJANGO_DEBUG=False
# Select one of [production|devel]
DJANGO_SETTINGS_MODULE=config.django.[production|devel]
# Select one of [ndjson|human_readable]
DJANGO_LOGGING_FORMATTER=[ndjson|human_readable]
# Select one of [DEBUG|INFO|WARNING|ERROR|CRITICAL]
# Applies to both Django and Celery Workers
DJANGO_LOGGING_LEVEL=INFO
DJANGO_WORKERS=4 # Defaults to the maximum available based on CPU cores if not set.
DJANGO_TOKEN_SIGNING_KEY=""
DJANGO_TOKEN_VERIFYING_KEY=""
# Token lifetime is in minutes
DJANGO_ACCESS_TOKEN_LIFETIME=30
DJANGO_REFRESH_TOKEN_LIFETIME=1440
DJANGO_CACHE_MAX_AGE=3600
DJANGO_STALE_WHILE_REVALIDATE=60
DJANGO_SECRETS_ENCRYPTION_KEY=""
# Decide whether to allow Django manage database table partitions
DJANGO_MANAGE_DB_PARTITIONS=[True|False]
DJANGO_CELERY_DEADLOCK_ATTEMPTS=5
DJANGO_BROKER_VISIBILITY_TIMEOUT=86400
DJANGO_SENTRY_DSN=
# PostgreSQL settings
# If running django and celery on host, use 'localhost', else use 'postgres-db'
POSTGRES_HOST=[localhost|postgres-db]
POSTGRES_PORT=5432
POSTGRES_ADMIN_USER=prowler
POSTGRES_ADMIN_PASSWORD=S3cret
POSTGRES_USER=prowler_user
POSTGRES_PASSWORD=S3cret
POSTGRES_DB=prowler_db
# Valkey settings
# If running django and celery on host, use localhost, else use 'valkey'
VALKEY_HOST=[localhost|valkey]
VALKEY_PORT=6379
VALKEY_DB=0
# Sentry settings
SENTRY_ENVIRONMENT=local
SENTRY_RELEASE=local
# Social login credentials
DJANGO_GOOGLE_OAUTH_CLIENT_ID=""
DJANGO_GOOGLE_OAUTH_CLIENT_SECRET=""
DJANGO_GOOGLE_OAUTH_CALLBACK_URL=""
DJANGO_GITHUB_OAUTH_CLIENT_ID=""
DJANGO_GITHUB_OAUTH_CLIENT_SECRET=""
DJANGO_GITHUB_OAUTH_CALLBACK_URL=""
# Deletion Task Batch Size
DJANGO_DELETION_BATCH_SIZE=5000

api/.gitignore

@@ -1,168 +0,0 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.pyc
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
/_data/
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
.pybuilder/
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock
# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/latest/usage/project/#working-with-version-control
.pdm.toml
.pdm-python
.pdm-build/
# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
*.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# pytype static type analyzer
.pytype/
# Cython debug symbols
cython_debug/
# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
.idea/
# VSCode
.vscode/


@@ -1,91 +0,0 @@
repos:
## GENERAL
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.6.0
hooks:
- id: check-merge-conflict
- id: check-yaml
args: ["--unsafe"]
- id: check-json
- id: end-of-file-fixer
- id: trailing-whitespace
- id: no-commit-to-branch
- id: pretty-format-json
args: ["--autofix", "--no-sort-keys", "--no-ensure-ascii"]
exclude: 'src/backend/api/fixtures/dev/.*\.json$'
## TOML
- repo: https://github.com/macisamuele/language-formatters-pre-commit-hooks
rev: v2.13.0
hooks:
- id: pretty-format-toml
args: [--autofix]
files: pyproject.toml
## BASH
- repo: https://github.com/koalaman/shellcheck-precommit
rev: v0.10.0
hooks:
- id: shellcheck
exclude: contrib
## PYTHON
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.5.0
hooks:
# Run the linter.
- id: ruff
args: [ --fix ]
# Run the formatter.
- id: ruff-format
- repo: https://github.com/python-poetry/poetry
rev: 1.8.0
hooks:
- id: poetry-check
args: ["--directory=src"]
- id: poetry-lock
args: ["--no-update", "--directory=src"]
- repo: https://github.com/hadolint/hadolint
rev: v2.13.0-beta
hooks:
- id: hadolint
args: ["--ignore=DL3013", "Dockerfile"]
- repo: local
hooks:
- id: pylint
name: pylint
entry: bash -c 'poetry run pylint --disable=W,C,R,E -j 0 -rn -sn src/'
language: system
files: '.*\.py'
- id: trufflehog
name: TruffleHog
description: Detect secrets in your data.
entry: bash -c 'trufflehog --no-update git file://. --only-verified --fail'
# For running trufflehog in docker, use the following entry instead:
# entry: bash -c 'docker run -v "$(pwd):/workdir" -i --rm trufflesecurity/trufflehog:latest git file:///workdir --only-verified --fail'
language: system
stages: ["commit", "push"]
- id: bandit
name: bandit
description: "Bandit is a tool for finding common security issues in Python code"
entry: bash -c 'poetry run bandit -q -lll -x '*_test.py,./contrib/,./.venv/' -r .'
language: system
files: '.*\.py'
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'poetry run safety check --ignore 70612,66963'
language: system
- id: vulture
name: vulture
description: "Vulture finds unused code in Python programs."
entry: bash -c 'poetry run vulture --exclude "contrib,.venv,tests,conftest.py" --min-confidence 100 .'
language: system
files: '.*\.py'


@@ -1,78 +0,0 @@
# Prowler API Changelog
All notable changes to the **Prowler API** are documented in this file.
## [v1.7.0] (UNRELEASED)
### Added
- Added M365 as a new provider [(#7563)](https://github.com/prowler-cloud/prowler/pull/7563).
---
## [v1.6.0] (Prowler v5.5.0)
### Added
- Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167).
- HTTP Security Headers [(#7289)](https://github.com/prowler-cloud/prowler/pull/7289).
- New endpoint to get the compliance overviews metadata [(#7333)](https://github.com/prowler-cloud/prowler/pull/7333).
- Support for muted findings [(#7378)](https://github.com/prowler-cloud/prowler/pull/7378).
- Added missing fields to API findings and resources [(#7318)](https://github.com/prowler-cloud/prowler/pull/7318).
---
## [v1.5.4] (Prowler v5.4.4)
### Fixed
- Fixed a bug with periodic tasks when trying to delete a provider [(#7466)](https://github.com/prowler-cloud/prowler/pull/7466).
---
## [v1.5.3] (Prowler v5.4.3)
### Fixed
- Added handling for duplicated scheduled scans [(#7401)](https://github.com/prowler-cloud/prowler/pull/7401).
- Added an environment variable to configure the deletion task batch size [(#7423)](https://github.com/prowler-cloud/prowler/pull/7423).
---
## [v1.5.2] (Prowler v5.4.2)
### Changed
- Refactored deletion logic and implemented retry mechanism for deletion tasks [(#7349)](https://github.com/prowler-cloud/prowler/pull/7349).
---
## [v1.5.1] (Prowler v5.4.1)
### Fixed
- Added a handled response in case local files are missing [(#7183)](https://github.com/prowler-cloud/prowler/pull/7183).
- Fixed a race condition when deleting export files after the S3 upload [(#7172)](https://github.com/prowler-cloud/prowler/pull/7172).
- Handled exception when a provider has no secret in test connection [(#7283)](https://github.com/prowler-cloud/prowler/pull/7283).
---
## [v1.5.0] (Prowler v5.4.0)
### Added
- Social login integration with Google and GitHub [(#6906)](https://github.com/prowler-cloud/prowler/pull/6906)
- Add API scan report system, now all scans launched from the API will generate a compressed file with the report in OCSF, CSV and HTML formats [(#6878)](https://github.com/prowler-cloud/prowler/pull/6878).
- Configurable Sentry integration [(#6874)](https://github.com/prowler-cloud/prowler/pull/6874)
### Changed
- Optimized `GET /findings` endpoint to improve response time and size [(#7019)](https://github.com/prowler-cloud/prowler/pull/7019).
---
## [v1.4.0] (Prowler v5.3.0)
### Changed
- Daily scheduled scan instances are now created beforehand with `SCHEDULED` state [(#6700)](https://github.com/prowler-cloud/prowler/pull/6700).
- Findings endpoints now require at least one date filter [(#6800)](https://github.com/prowler-cloud/prowler/pull/6800).
- Findings metadata endpoint received a performance improvement [(#6863)](https://github.com/prowler-cloud/prowler/pull/6863).
- Increased the allowed length of the provider UID for Kubernetes providers [(#6869)](https://github.com/prowler-cloud/prowler/pull/6869).
---


@@ -1,69 +0,0 @@
FROM python:3.12-slim-bookworm AS build
LABEL maintainer="https://github.com/prowler-cloud/api"
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends \
gcc python3-dev bash \
libcurl4-openssl-dev ca-certificates less ncurses-base \
krb5-multidev libgcc-s1 gettext libssl3 \
libstdc++6 tzdata liburcu8 zlib1g libicu72 curl openssh-client \
&& rm -rf /var/lib/apt/lists/*
# Install PowerShell
RUN ARCH=$(uname -m) && \
if [ "$ARCH" = "x86_64" ]; then \
curl -L https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-x64.tar.gz -o /tmp/powershell.tar.gz ; \
elif [ "$ARCH" = "aarch64" ]; then \
curl -L https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-arm64.tar.gz -o /tmp/powershell.tar.gz ; \
else \
echo "Unsupported architecture: $ARCH" && exit 1 ; \
fi && \
mkdir -p /opt/microsoft/powershell/7 && \
tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7 && \
chmod +x /opt/microsoft/powershell/7/pwsh && \
ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh && \
rm /tmp/powershell.tar.gz
# Add prowler user
RUN addgroup --gid 1000 prowler && \
adduser --uid 1000 --gid 1000 --disabled-password --gecos "" prowler
USER prowler
WORKDIR /home/prowler
COPY pyproject.toml ./
RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir poetry
COPY src/backend/ ./backend/
ENV PATH="/home/prowler/.local/bin:$PATH"
# Add `--no-root` to avoid installing the current project as a package
RUN poetry install --no-root && \
rm -rf ~/.cache/pip
COPY docker-entrypoint.sh ./docker-entrypoint.sh
WORKDIR /home/prowler/backend
# Development image
FROM build AS dev
USER root
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends \
curl vim \
&& rm -rf /var/lib/apt/lists/*
USER prowler
ENTRYPOINT ["../docker-entrypoint.sh", "dev"]
# Production image
FROM build
ENTRYPOINT ["../docker-entrypoint.sh", "prod"]


@@ -1,334 +0,0 @@
# Description
This repository contains the JSON API and Task Runner components for Prowler, which together provide a complete backend that interacts with the Prowler SDK and is used by the Prowler UI.
# Components
The Prowler API is composed of the following components:
- The JSON API, which is an API built with Django Rest Framework.
- The Celery worker, which is responsible for executing the background tasks that are defined in the JSON API.
- The PostgreSQL database, which is used to store the data.
- The Valkey database, which is an in-memory datastore used as a message broker for the Celery workers.
## Note about Valkey
[Valkey](https://valkey.io/) is an open source (BSD) high performance key/value datastore.
Valkey exposes a Redis 7.2 compliant API. Any service that exposes the Redis API can be used with Prowler API.
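If you want to point the Prowler API at a different Redis-compatible service instead of the bundled Valkey, only the connection settings change; a sketch using the environment variables this project already defines (values are placeholders):
```console
# Point the API and Celery workers at any Redis 7.2-compatible endpoint
export VALKEY_HOST=localhost
export VALKEY_PORT=6379
export VALKEY_DB=0
```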
# Modify environment variables
Under the root path of the project, you can find a file called `.env.example`. This file shows all the environment variables that the project uses. You *must* create a new file called `.env` and set the values for the variables.
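A minimal way to start, assuming you are at the repository root, is to copy the example file and then edit the values:
```console
cp .env.example .env
```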
## Local deployment
Keep in mind that if you export the `.env` file for a local deployment, you must do it from within the Poetry environment, not before. Otherwise, the variables will not be loaded properly.
To do this, you can run:
```console
poetry shell
set -a
source .env
```
# 🚀 Production deployment
## Docker deployment
This method requires `docker` and `docker compose`.
### Clone the repository
```console
# HTTPS
git clone https://github.com/prowler-cloud/api.git
# SSH
git clone git@github.com:prowler-cloud/api.git
```
### Build the base image
```console
docker compose --profile prod build
```
### Run the production service
This command will start the Django production server and the Celery worker, as well as the Valkey and PostgreSQL databases.
```console
docker compose --profile prod up -d
```
You can access the server at `http://localhost:8080`.
> **NOTE:** the port is different here: when using Docker, it is `8080` to prevent conflicts.
### View the Production Server Logs
To view the logs for any component (e.g., Django, Celery worker), you can use the following command with a wildcard. This command will follow logs for any container that matches the specified pattern:
```console
docker logs -f $(docker ps --format "{{.Names}}" | grep 'api-')
```
## Local deployment
To use this method, you'll need to set up a Python virtual environment (version ">=3.11,<3.13") and keep dependencies updated. Additionally, ensure that `poetry` and `docker compose` are installed.
### Clone the repository
```console
# HTTPS
git clone https://github.com/prowler-cloud/api.git
# SSH
git clone git@github.com:prowler-cloud/api.git
```
### Install all dependencies with Poetry
```console
poetry install
poetry shell
```
## Start the PostgreSQL Database and Valkey
The PostgreSQL database (version 16.3) and Valkey (version 7) are required for the development environment. To make development easier, we have provided a `docker-compose` file that will start these components for you.
**Note:** Make sure to use the specified versions, as there are features in our setup that may not be compatible with older versions of PostgreSQL and Valkey.
```console
docker compose up postgres valkey -d
```
## Deploy Django and the Celery worker
### Run migrations
For migrations, you need to force the `admin` database router. Assuming you have the correct environment variables and Python virtual environment, run:
```console
cd src/backend
python manage.py migrate --database admin
```
### Run the Celery worker
```console
cd src/backend
python -m celery -A config.celery worker -l info -E
```
### Run the Django server with Gunicorn
```console
cd src/backend
gunicorn -c config/guniconf.py config.wsgi:application
```
> By default, the Gunicorn server will try to use as many workers as your machine can handle. You can manually change that in the `src/backend/config/guniconf.py` file.
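If you just want to pin the worker count without editing that file, Gunicorn also accepts it on the command line, and command-line flags generally take precedence over the configuration file. A sketch assuming four workers:
```console
cd src/backend
gunicorn -c config/guniconf.py -w 4 config.wsgi:application
```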
# 🧪 Development guide
## Local deployment
To use this method, you'll need to set up a Python virtual environment (version ">=3.11,<3.13") and keep dependencies updated. Additionally, ensure that `poetry` and `docker compose` are installed.
### Clone the repository
```console
# HTTPS
git clone https://github.com/prowler-cloud/api.git
# SSH
git clone git@github.com:prowler-cloud/api.git
```
### Start the PostgreSQL Database and Valkey
The PostgreSQL database (version 16.3) and Valkey (version 7) are required for the development environment. To make development easier, we have provided a `docker-compose` file that will start these components for you.
**Note:** Make sure to use the specified versions, as there are features in our setup that may not be compatible with older versions of PostgreSQL and Valkey.
```console
docker compose up postgres valkey -d
```
### Install the Python dependencies
> You must have Poetry installed
```console
poetry install
poetry shell
```
### Apply migrations
For migrations, you need to force the `admin` database router. Assuming you have the correct environment variables and Python virtual environment, run:
```console
cd src/backend
python manage.py migrate --database admin
```
### Run the Django development server
```console
cd src/backend
python manage.py runserver
```
You can access the server at `http://localhost:8000`.
Any code changes will be automatically reloaded by the server.
### Run the Celery worker
```console
python -m celery -A config.celery worker -l info -E
```
The Celery worker does not detect and reload changes in the code, so you need to restart it manually when you make changes.
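If you want automatic restarts during development, one optional approach (a sketch only; `watchdog` is not part of this project's dependencies) is to wrap the worker with `watchmedo` from the `watchdog` package:
```console
pip install 'watchdog[watchmedo]'
cd src/backend
watchmedo auto-restart --directory=./ --pattern='*.py' --recursive -- python -m celery -A config.celery worker -l info -E
```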
## Docker deployment
This method requires `docker` and `docker compose`.
### Clone the repository
```console
# HTTPS
git clone https://github.com/prowler-cloud/api.git
# SSH
git clone git@github.com:prowler-cloud/api.git
```
### Build the base image
```console
docker compose --profile dev build
```
### Run the development service
This command will start the Django development server and the Celery worker, as well as the Valkey and PostgreSQL databases.
```console
docker compose --profile dev up -d
```
You can access the server at `http://localhost:8080`.
Any code changes will be automatically reloaded by the server.
> **NOTE:** the port is different here: when developing with Docker, it is `8080` to prevent conflicts.
### View the development server logs
To view the logs for any component (e.g., Django, Celery worker), you can use the following command with a wildcard. This command will follow logs for any container that matches the specified pattern:
```console
docker logs -f $(docker ps --format "{{.Names}}" | grep 'api-')
```
## Applying migrations
For migrations, you need to force the `admin` database router. Assuming you have the correct environment variables and Python virtual environment, run:
```console
poetry shell
cd src/backend
python manage.py migrate --database admin
```
## Apply fixtures
Fixtures are used to populate the database with initial development data.
```console
poetry shell
cd src/backend
python manage.py loaddata api/fixtures/0_dev_users.json --database admin
```
> The default credentials are `dev@prowler.com:thisisapassword123` or `dev2@prowler.com:thisisapassword123`
## Run tests
Note that the tests will fail if you use the same `.env` file as the development environment.
For best results, run in a new shell with no environment variables set.
```console
poetry shell
cd src/backend
pytest
```
# Custom commands
Django provides a way to create custom commands that can be run from the command line.
> These commands can be found in `prowler/api/src/backend/api/management/commands`
To run a custom command, you need to be in the `prowler/api/src/backend` directory and run:
```console
poetry shell
python manage.py <command_name>
```
## Generate dummy data
```console
python manage.py findings --tenant <TENANT_ID> --findings <NUM_FINDINGS> --resources <NUM_RESOURCES> --batch <TRANSACTION_BATCH_SIZE> --alias <ALIAS>
```
This command creates, for a given tenant, a provider, a scan, and a set of related findings and resources.
> Scan progress and state are updated in real time.
> - 0-33%: Create resources.
> - 33-66%: Create findings.
> - 66%: Create resource-finding mapping.
>
> The last step is required to access the findings details, since the UI needs that to print all the information.
### Example
```console
~/backend $ poetry run python manage.py findings --tenant fffb1893-3fc7-4623-a5d9-fae47da1c528 --findings 25000 --resources 1000 --batch 5000 --alias test-script
Starting data population
Tenant: fffb1893-3fc7-4623-a5d9-fae47da1c528
Alias: test-script
Resources: 1000
Findings: 25000
Batch size: 5000
Creating resources...
100%|███████████████████████| 1/1 [00:00<00:00, 7.72it/s]
Resources created successfully.
Creating findings...
100%|███████████████████████| 5/5 [00:05<00:00, 1.09s/it]
Findings created successfully.
Creating resource-finding mappings...
100%|███████████████████████| 5/5 [00:02<00:00, 1.81it/s]
Resource-finding mappings created successfully.
Successfully populated test data.
```


@@ -1,125 +0,0 @@
services:
api:
build:
dockerfile: Dockerfile
image: prowler-api
env_file:
- path: ./.env
required: false
ports:
- "${DJANGO_PORT:-8000}:${DJANGO_PORT:-8000}"
profiles:
- prod
depends_on:
postgres:
condition: service_healthy
valkey:
condition: service_healthy
entrypoint:
- "../docker-entrypoint.sh"
- "prod"
api-dev:
build:
dockerfile: Dockerfile
target: dev
image: prowler-api-dev
environment:
- DJANGO_SETTINGS_MODULE=config.django.devel
- DJANGO_LOGGING_FORMATTER=human_readable
env_file:
- path: ./.env
required: false
ports:
- "${DJANGO_PORT:-8080}:${DJANGO_PORT:-8080}"
volumes:
- "./src/backend:/home/prowler/backend"
- "./pyproject.toml:/home/prowler/pyproject.toml"
profiles:
- dev
depends_on:
postgres:
condition: service_healthy
valkey:
condition: service_healthy
entrypoint:
- "../docker-entrypoint.sh"
- "dev"
postgres:
image: postgres:16.3-alpine
ports:
- "${POSTGRES_PORT:-5432}:${POSTGRES_PORT:-5432}"
hostname: "postgres-db"
volumes:
- ./_data/postgres:/var/lib/postgresql/data
environment:
- POSTGRES_USER=${POSTGRES_ADMIN_USER:-prowler}
- POSTGRES_PASSWORD=${POSTGRES_ADMIN_PASSWORD:-S3cret}
- POSTGRES_DB=${POSTGRES_DB:-prowler_db}
env_file:
- path: ./.env
required: false
healthcheck:
test: ["CMD-SHELL", "sh -c 'pg_isready -U ${POSTGRES_ADMIN_USER:-prowler} -d ${POSTGRES_DB:-prowler_db}'"]
interval: 5s
timeout: 5s
retries: 5
valkey:
image: valkey/valkey:7-alpine3.19
ports:
- "${VALKEY_PORT:-6379}:6379"
hostname: "valkey"
volumes:
- ./_data/valkey:/data
env_file:
- path: ./.env
required: false
healthcheck:
test: ["CMD-SHELL", "sh -c 'valkey-cli ping'"]
interval: 10s
timeout: 5s
retries: 3
worker:
build:
dockerfile: Dockerfile
image: prowler-worker
environment:
- DJANGO_SETTINGS_MODULE=${DJANGO_SETTINGS_MODULE:-config.django.production}
env_file:
- path: ./.env
required: false
profiles:
- dev
- prod
depends_on:
valkey:
condition: service_healthy
postgres:
condition: service_healthy
entrypoint:
- "../docker-entrypoint.sh"
- "worker"
worker-beat:
build:
dockerfile: Dockerfile
image: prowler-worker
environment:
- DJANGO_SETTINGS_MODULE=${DJANGO_SETTINGS_MODULE:-config.django.production}
env_file:
- path: ./.env
required: false
profiles:
- dev
- prod
depends_on:
valkey:
condition: service_healthy
postgres:
condition: service_healthy
entrypoint:
- "../docker-entrypoint.sh"
- "beat"


@@ -1,71 +0,0 @@
#!/bin/sh
apply_migrations() {
echo "Applying database migrations..."
poetry run python manage.py migrate --database admin
}
apply_fixtures() {
echo "Applying Django fixtures..."
for fixture in api/fixtures/dev/*.json; do
if [ -f "$fixture" ]; then
echo "Loading $fixture"
poetry run python manage.py loaddata "$fixture" --database admin
fi
done
}
start_dev_server() {
echo "Starting the development server..."
poetry run python manage.py runserver 0.0.0.0:"${DJANGO_PORT:-8080}"
}
start_prod_server() {
echo "Starting the Gunicorn server..."
poetry run gunicorn -c config/guniconf.py config.wsgi:application
}
start_worker() {
echo "Starting the worker..."
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion -E --max-tasks-per-child 1
}
start_worker_beat() {
echo "Starting the worker-beat..."
sleep 15
poetry run python -m celery -A config.celery beat -l "${DJANGO_LOGGING_LEVEL:-info}" --scheduler django_celery_beat.schedulers:DatabaseScheduler
}
manage_db_partitions() {
if [ "${DJANGO_MANAGE_DB_PARTITIONS}" = "True" ]; then
echo "Managing DB partitions..."
# For now we skip the deletion of partitions until we define the data retention policy
# --yes auto approves the operation without the need of an interactive terminal
poetry run python manage.py pgpartition --using admin --skip-delete --yes
fi
}
case "$1" in
dev)
apply_migrations
apply_fixtures
manage_db_partitions
start_dev_server
;;
prod)
apply_migrations
manage_db_partitions
start_prod_server
;;
worker)
start_worker
;;
beat)
start_worker_beat
;;
*)
echo "Usage: $0 {dev|prod|worker|beat}"
exit 1
;;
esac


@@ -1,65 +0,0 @@
# Partitions
## Overview
Partitions are used to split the data in a table into smaller chunks, allowing for more efficient querying and storage.
The Prowler API uses partitions to store findings. The partitions are created based on the UUIDv7 `id` field.
You can use the Prowler API without ever creating additional partitions. This documentation is only relevant if you want to manage partitions to gain additional query performance.
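As a quick illustration of why UUIDv7 values can be range-partitioned by month: the first 48 bits of a UUIDv7 encode a Unix timestamp in milliseconds, so ids generated later always sort into later ranges. A minimal sketch using the `uuid6` package that this project already depends on:
```python
import time

from uuid6 import uuid7  # provided by the `uuid6` dependency

u = uuid7()
# The first 12 hex characters (48 bits) of a UUIDv7 are the Unix timestamp in milliseconds,
# which is what makes month-sized range partitions on the `id` column possible.
ts_ms = int(u.hex[:12], 16)
print(u, time.strftime("%Y-%m", time.gmtime(ts_ms / 1000)))
```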
### Required Postgres Configuration
There are 3 configuration options that need to be set in the `postgresql.conf` file to get the most performance out of the partitioning:
- `enable_partition_pruning = on` (default is on)
- `enable_partitionwise_join = on` (default is off)
- `enable_partitionwise_aggregate = on` (default is off)
For more information on these options, see the [Postgres documentation](https://www.postgresql.org/docs/current/runtime-config-query.html).
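To confirm the values on a running instance, you can query them with `psql` (the connection parameters below are this repository's `docker-compose` defaults and may differ in your setup):
```shell
psql -h localhost -U prowler -d prowler_db -c "SHOW enable_partition_pruning;"
psql -h localhost -U prowler -d prowler_db -c "SHOW enable_partitionwise_join;"
psql -h localhost -U prowler -d prowler_db -c "SHOW enable_partitionwise_aggregate;"
```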
## Partitioning Strategy
The partitioning strategy is defined in the `api.partitions` module. The strategy is responsible for creating and deleting partitions based on the provided configuration.
## Managing Partitions
The application will run without any extra work on your part. If you want to add or delete partitions, you can use the following commands:
To manage the partitions, run `python manage.py pgpartition --using admin`
This command will generate a list of partitions to create and delete based on the provided configuration.
By default, the command will prompt you to accept the changes before applying them.
```shell
Finding:
+ 2024_nov
name: 2024_nov
from_values: 0192e505-9000-72c8-a47c-cce719d8fb93
to_values: 01937f84-5418-7eb8-b2a6-e3be749e839d
size_unit: months
size_value: 1
+ 2024_dec
name: 2024_dec
from_values: 01937f84-5800-7b55-879c-9cdb46f023f6
to_values: 01941f29-7818-7f9f-b4be-20b05bb2f574
size_unit: months
size_value: 1
0 partitions will be deleted
2 partitions will be created
```
If you choose to apply the partitions, tables will be generated with the following format: `<table_name>_<year>_<month>`.
For more info on the partitioning manager, see https://github.com/SectorLabs/django-postgres-extra
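When running non-interactively (for example, the way the bundled `docker-entrypoint.sh` invokes it), the confirmation prompt can be skipped with the same flags that script uses:
```shell
python manage.py pgpartition --using admin --skip-delete --yes
```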
### Changing the Partitioning Parameters
There are 4 environment variables that can be used to change the partitioning parameters (an illustrative `.env` fragment follows this list):
- `DJANGO_MANAGE_DB_PARTITIONS`: Allow Django to manage database partitions. By default it is set to `False`.
- `FINDINGS_TABLE_PARTITION_MONTHS`: Set the number of months covered by each partition. Setting it to 1 will create partitions with a size of 1 natural month.
- `FINDINGS_TABLE_PARTITION_COUNT`: Set the number of partitions to create.
- `FINDINGS_TABLE_PARTITION_MAX_AGE_MONTHS`: Set the number of months to keep partitions before deleting them. Setting this to `None` will keep partitions indefinitely.
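An illustrative `.env` fragment combining these variables (example values only, not recommendations):
```shell
DJANGO_MANAGE_DB_PARTITIONS=True
FINDINGS_TABLE_PARTITION_MONTHS=1
FINDINGS_TABLE_PARTITION_COUNT=6
FINDINGS_TABLE_PARTITION_MAX_AGE_MONTHS=12
```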

api/poetry.lock (generated, 5486 lines): diff suppressed because it is too large.


@@ -1,61 +0,0 @@
[build-system]
build-backend = "poetry.core.masonry.api"
requires = ["poetry-core"]
[project]
authors = [{name = "Prowler Engineering", email = "engineering@prowler.com"}]
dependencies = [
"celery[pytest] (>=5.4.0,<6.0.0)",
"dj-rest-auth[with_social,jwt] (==7.0.1)",
"django==5.1.7",
"django-allauth==65.4.1",
"django-celery-beat (>=2.7.0,<3.0.0)",
"django-celery-results (>=2.5.1,<3.0.0)",
"django-cors-headers==4.4.0",
"django-environ==0.11.2",
"django-filter==24.3",
"django-guid==3.5.0",
"django-postgres-extra (>=2.0.8,<3.0.0)",
"djangorestframework==3.15.2",
"djangorestframework-jsonapi==7.0.2",
"djangorestframework-simplejwt (>=5.3.1,<6.0.0)",
"drf-nested-routers (>=0.94.1,<1.0.0)",
"drf-spectacular==0.27.2",
"drf-spectacular-jsonapi==0.5.1",
"gunicorn==23.0.0",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@master",
"psycopg2-binary==2.9.9",
"pytest-celery[redis] (>=1.0.1,<2.0.0)",
"sentry-sdk[django] (>=2.20.0,<3.0.0)",
"uuid6==2024.7.10"
]
description = "Prowler's API (Django/DRF)"
license = "Apache-2.0"
name = "prowler-api"
package-mode = false
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
version = "1.6.0"
[project.scripts]
celery = "src.backend.config.settings.celery"
[tool.poetry.group.dev.dependencies]
bandit = "1.7.9"
coverage = "7.5.4"
django-silk = "5.3.2"
docker = "7.1.0"
freezegun = "1.5.1"
marshmallow = ">=3.15.0,<4.0.0"
mypy = "1.10.1"
pylint = "3.2.5"
pytest = "8.2.2"
pytest-cov = "5.0.0"
pytest-django = "4.8.0"
pytest-env = "1.1.3"
pytest-randomly = "3.15.0"
pytest-xdist = "3.6.1"
ruff = "0.5.0"
safety = "3.2.9"
tqdm = "4.67.1"
vulture = "2.14"


@@ -1,61 +0,0 @@
from allauth.socialaccount.adapter import DefaultSocialAccountAdapter
from django.db import transaction
from api.db_router import MainRouter
from api.db_utils import rls_transaction
from api.models import Membership, Role, Tenant, User, UserRoleRelationship
class ProwlerSocialAccountAdapter(DefaultSocialAccountAdapter):
@staticmethod
def get_user_by_email(email: str):
try:
return User.objects.get(email=email)
except User.DoesNotExist:
return None
def pre_social_login(self, request, sociallogin):
# Link existing accounts with the same email address
email = sociallogin.account.extra_data.get("email")
if email:
existing_user = self.get_user_by_email(email)
if existing_user:
sociallogin.connect(request, existing_user)
def save_user(self, request, sociallogin, form=None):
"""
Called after the user data is fully populated from the provider
and is about to be saved to the DB for the first time.
"""
with transaction.atomic(using=MainRouter.admin_db):
user = super().save_user(request, sociallogin, form)
user.save(using=MainRouter.admin_db)
social_account_name = sociallogin.account.extra_data.get("name")
if social_account_name:
user.name = social_account_name
user.save(using=MainRouter.admin_db)
tenant = Tenant.objects.using(MainRouter.admin_db).create(
name=f"{user.email.split('@')[0]} default tenant"
)
with rls_transaction(str(tenant.id)):
Membership.objects.using(MainRouter.admin_db).create(
user=user, tenant=tenant, role=Membership.RoleChoices.OWNER
)
role = Role.objects.using(MainRouter.admin_db).create(
name="admin",
tenant_id=tenant.id,
manage_users=True,
manage_account=True,
manage_billing=True,
manage_providers=True,
manage_integrations=True,
manage_scans=True,
unlimited_visibility=True,
)
UserRoleRelationship.objects.using(MainRouter.admin_db).create(
user=user,
role=role,
tenant_id=tenant.id,
)
return user


@@ -1,3 +0,0 @@
# from django.contrib import admin
# Register your models here.


@@ -1,12 +0,0 @@
from django.apps import AppConfig
class ApiConfig(AppConfig):
default_auto_field = "django.db.models.BigAutoField"
name = "api"
def ready(self):
from api import signals # noqa: F401
from api.compliance import load_prowler_compliance
load_prowler_compliance()


@@ -1,152 +0,0 @@
from django.core.exceptions import ObjectDoesNotExist
from django.db import transaction
from rest_framework import permissions
from rest_framework.exceptions import NotAuthenticated
from rest_framework.filters import SearchFilter
from rest_framework_json_api import filters
from rest_framework_json_api.views import ModelViewSet
from rest_framework_simplejwt.authentication import JWTAuthentication
from api.db_router import MainRouter
from api.db_utils import POSTGRES_USER_VAR, rls_transaction
from api.filters import CustomDjangoFilterBackend
from api.models import Role, Tenant
from api.rbac.permissions import HasPermissions
class BaseViewSet(ModelViewSet):
authentication_classes = [JWTAuthentication]
required_permissions = []
permission_classes = [permissions.IsAuthenticated, HasPermissions]
filter_backends = [
filters.QueryParameterValidationFilter,
filters.OrderingFilter,
CustomDjangoFilterBackend,
SearchFilter,
]
filterset_fields = []
search_fields = []
ordering_fields = "__all__"
ordering = ["id"]
def initial(self, request, *args, **kwargs):
"""
Sets required_permissions before permissions are checked.
"""
self.set_required_permissions()
super().initial(request, *args, **kwargs)
def set_required_permissions(self):
"""This is an abstract method that must be implemented by subclasses."""
NotImplemented
def get_queryset(self):
raise NotImplementedError
class BaseRLSViewSet(BaseViewSet):
def dispatch(self, request, *args, **kwargs):
with transaction.atomic():
return super().dispatch(request, *args, **kwargs)
def initial(self, request, *args, **kwargs):
# Ideally, this logic would be in the `.setup()` method but DRF view sets don't call it
# https://docs.djangoproject.com/en/5.1/ref/class-based-views/base/#django.views.generic.base.View.setup
if request.auth is None:
raise NotAuthenticated
tenant_id = request.auth.get("tenant_id")
if tenant_id is None:
raise NotAuthenticated("Tenant ID is not present in token")
with rls_transaction(tenant_id):
self.request.tenant_id = tenant_id
return super().initial(request, *args, **kwargs)
def get_serializer_context(self):
context = super().get_serializer_context()
context["tenant_id"] = self.request.tenant_id
return context
class BaseTenantViewset(BaseViewSet):
def dispatch(self, request, *args, **kwargs):
with transaction.atomic():
tenant = super().dispatch(request, *args, **kwargs)
try:
# If the request is a POST, create the admin role
if request.method == "POST":
isinstance(tenant, dict) and self._create_admin_role(tenant.data["id"])
except Exception as e:
self._handle_creation_error(e, tenant)
raise
return tenant
def _create_admin_role(self, tenant_id):
Role.objects.using(MainRouter.admin_db).create(
name="admin",
tenant_id=tenant_id,
manage_users=True,
manage_account=True,
manage_billing=True,
manage_providers=True,
manage_integrations=True,
manage_scans=True,
unlimited_visibility=True,
)
def _handle_creation_error(self, error, tenant):
if tenant.data.get("id"):
try:
Tenant.objects.using(MainRouter.admin_db).filter(
id=tenant.data["id"]
).delete()
except ObjectDoesNotExist:
pass # Tenant might not exist, handle gracefully
def initial(self, request, *args, **kwargs):
if (
request.resolver_match.url_name != "tenant-detail"
and request.method != "DELETE"
):
user_id = str(request.user.id)
with rls_transaction(value=user_id, parameter=POSTGRES_USER_VAR):
return super().initial(request, *args, **kwargs)
# TODO: DRY this when we have time
if request.auth is None:
raise NotAuthenticated
tenant_id = request.auth.get("tenant_id")
if tenant_id is None:
raise NotAuthenticated("Tenant ID is not present in token")
with rls_transaction(tenant_id):
self.request.tenant_id = tenant_id
return super().initial(request, *args, **kwargs)
class BaseUserViewset(BaseViewSet):
def dispatch(self, request, *args, **kwargs):
with transaction.atomic():
return super().dispatch(request, *args, **kwargs)
def initial(self, request, *args, **kwargs):
# TODO refactor after improving RLS on users
if request.stream is not None and request.stream.method == "POST":
return super().initial(request, *args, **kwargs)
if request.auth is None:
raise NotAuthenticated
tenant_id = request.auth.get("tenant_id")
if tenant_id is None:
raise NotAuthenticated("Tenant ID is not present in token")
with rls_transaction(tenant_id):
self.request.tenant_id = tenant_id
return super().initial(request, *args, **kwargs)


@@ -1,209 +0,0 @@
from types import MappingProxyType
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.check.models import CheckMetadata
from api.models import Provider
PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE = {}
PROWLER_CHECKS = {}
def get_prowler_provider_checks(provider_type: Provider.ProviderChoices):
"""
Retrieve all check IDs for the specified provider type.
This function fetches the check metadata for the given cloud provider
and returns an iterable of check IDs.
Args:
provider_type (Provider.ProviderChoices): The provider type
(e.g., 'aws', 'azure') for which to retrieve check IDs.
Returns:
Iterable[str]: An iterable of check IDs associated with the specified provider type.
"""
return CheckMetadata.get_bulk(provider_type).keys()
def get_prowler_provider_compliance(provider_type: Provider.ProviderChoices) -> dict:
"""
Retrieve the Prowler compliance data for a specified provider type.
This function fetches the compliance frameworks and their associated
requirements for the given cloud provider.
Args:
provider_type (Provider.ProviderChoices): The provider type
(e.g., 'aws', 'azure') for which to retrieve compliance data.
Returns:
dict: A dictionary mapping compliance framework names to their respective
Compliance objects for the specified provider.
"""
return Compliance.get_bulk(provider_type)
def load_prowler_compliance():
"""
Load and initialize the Prowler compliance data and checks for all provider types.
This function retrieves compliance data for all supported provider types,
generates a compliance overview template, and populates the global variables
`PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE` and `PROWLER_CHECKS` with read-only mappings
of the compliance templates and checks, respectively.
"""
global PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE
global PROWLER_CHECKS
prowler_compliance = {
provider_type: get_prowler_provider_compliance(provider_type)
for provider_type in Provider.ProviderChoices.values
}
template = generate_compliance_overview_template(prowler_compliance)
PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE = MappingProxyType(template)
PROWLER_CHECKS = MappingProxyType(load_prowler_checks(prowler_compliance))
def load_prowler_checks(prowler_compliance):
"""
Generate a mapping of checks to the compliance frameworks that include them.
This function processes the provided compliance data and creates a dictionary
mapping each provider type to a dictionary where each check ID maps to a set
of compliance names that include that check.
Args:
prowler_compliance (dict): The compliance data for all provider types,
as returned by `get_prowler_provider_compliance`.
Returns:
dict: A nested dictionary where the first-level keys are provider types,
and the values are dictionaries mapping check IDs to sets of compliance names.
"""
checks = {}
for provider_type in Provider.ProviderChoices.values:
checks[provider_type] = {
check_id: set() for check_id in get_prowler_provider_checks(provider_type)
}
for compliance_name, compliance_data in prowler_compliance[
provider_type
].items():
for requirement in compliance_data.Requirements:
for check in requirement.Checks:
try:
checks[provider_type][check].add(compliance_name)
except KeyError:
continue
return checks
def generate_scan_compliance(
compliance_overview, provider_type: str, check_id: str, status: str
):
"""
Update the compliance overview with the status of a specific check.
This function updates the compliance overview by setting the status of the given check
within all compliance frameworks and requirements that include it. It then updates the
requirement status to 'FAIL' if any of its checks have failed, and adjusts the counts
of passed and failed requirements in the compliance overview.
Args:
compliance_overview (dict): The compliance overview data structure to update.
provider_type (str): The provider type (e.g., 'aws', 'azure') associated with the check.
check_id (str): The identifier of the check whose status is being updated.
status (str): The status of the check (e.g., 'PASS', 'FAIL', 'MUTED').
Returns:
None: This function modifies the compliance_overview in place.
"""
for compliance_id in PROWLER_CHECKS[provider_type][check_id]:
for requirement in compliance_overview[compliance_id]["requirements"].values():
if check_id in requirement["checks"]:
requirement["checks"][check_id] = status
requirement["checks_status"][status.lower()] += 1
if requirement["status"] != "FAIL" and any(
value == "FAIL" for value in requirement["checks"].values()
):
requirement["status"] = "FAIL"
compliance_overview[compliance_id]["requirements_status"]["passed"] -= 1
compliance_overview[compliance_id]["requirements_status"]["failed"] += 1
def generate_compliance_overview_template(prowler_compliance: dict):
"""
Generate a compliance overview template for all provider types.
This function creates a nested dictionary structure representing the compliance
overview template for each provider type, compliance framework, and requirement.
It initializes the status of all checks and requirements, and calculates initial
counts for requirements status.
Args:
prowler_compliance (dict): The compliance data for all provider types,
as returned by `get_prowler_provider_compliance`.
Returns:
dict: A nested dictionary representing the compliance overview template,
structured by provider type and compliance framework.
"""
template = {}
for provider_type in Provider.ProviderChoices.values:
provider_compliance = template.setdefault(provider_type, {})
compliance_data_dict = prowler_compliance[provider_type]
for compliance_name, compliance_data in compliance_data_dict.items():
compliance_requirements = {}
requirements_status = {"passed": 0, "failed": 0, "manual": 0}
total_requirements = 0
for requirement in compliance_data.Requirements:
total_requirements += 1
total_checks = len(requirement.Checks)
checks_dict = {check: None for check in requirement.Checks}
# Build requirement dictionary
requirement_dict = {
"name": requirement.Name or requirement.Id,
"description": requirement.Description,
"attributes": [
dict(attribute) for attribute in requirement.Attributes
],
"checks": checks_dict,
"checks_status": {
"pass": 0,
"fail": 0,
"manual": 0,
"total": total_checks,
},
"status": "PASS",
}
# Update requirements status
if total_checks == 0:
requirements_status["manual"] += 1
# Add requirement to compliance requirements
compliance_requirements[requirement.Id] = requirement_dict
# Calculate pending requirements
pending_requirements = total_requirements - requirements_status["manual"]
requirements_status["passed"] = pending_requirements
# Build compliance dictionary
compliance_dict = {
"framework": compliance_data.Framework,
"version": compliance_data.Version,
"provider": provider_type,
"description": compliance_data.Description,
"requirements": compliance_requirements,
"requirements_status": requirements_status,
"total_requirements": total_requirements,
}
# Add compliance to provider compliance
provider_compliance[compliance_name] = compliance_dict
return template


@@ -1,29 +0,0 @@
ALLOWED_APPS = ("django", "socialaccount", "account", "authtoken", "silk")
class MainRouter:
default_db = "default"
admin_db = "admin"
def db_for_read(self, model, **hints): # noqa: F841
model_table_name = model._meta.db_table
if model_table_name.startswith("django_") or any(
model_table_name.startswith(f"{app}_") for app in ALLOWED_APPS
):
return self.admin_db
return None
def db_for_write(self, model, **hints): # noqa: F841
model_table_name = model._meta.db_table
if any(model_table_name.startswith(f"{app}_") for app in ALLOWED_APPS):
return self.admin_db
return None
def allow_migrate(self, db, app_label, model_name=None, **hints): # noqa: F841
return db == self.admin_db
def allow_relation(self, obj1, obj2, **hints): # noqa: F841
# Allow relations if both objects are in either "default" or "admin" db connectors
if {obj1._state.db, obj2._state.db} <= {self.default_db, self.admin_db}:
return True
return None


@@ -1,347 +0,0 @@
import secrets
import uuid
from contextlib import contextmanager
from datetime import datetime, timedelta, timezone
from django.conf import settings
from django.contrib.auth.models import BaseUserManager
from django.db import connection, models, transaction
from django_celery_beat.models import PeriodicTask
from psycopg2 import connect as psycopg2_connect
from psycopg2.extensions import AsIs, new_type, register_adapter, register_type
from rest_framework_json_api.serializers import ValidationError
DB_USER = settings.DATABASES["default"]["USER"] if not settings.TESTING else "test"
DB_PASSWORD = (
settings.DATABASES["default"]["PASSWORD"] if not settings.TESTING else "test"
)
DB_PROWLER_USER = (
settings.DATABASES["prowler_user"]["USER"] if not settings.TESTING else "test"
)
DB_PROWLER_PASSWORD = (
settings.DATABASES["prowler_user"]["PASSWORD"] if not settings.TESTING else "test"
)
TASK_RUNNER_DB_TABLE = "django_celery_results_taskresult"
POSTGRES_TENANT_VAR = "api.tenant_id"
POSTGRES_USER_VAR = "api.user_id"
SET_CONFIG_QUERY = "SELECT set_config(%s, %s::text, TRUE);"
@contextmanager
def psycopg_connection(database_alias: str):
psycopg2_connection = None
try:
admin_db = settings.DATABASES[database_alias]
psycopg2_connection = psycopg2_connect(
dbname=admin_db["NAME"],
user=admin_db["USER"],
password=admin_db["PASSWORD"],
host=admin_db["HOST"],
port=admin_db["PORT"],
)
yield psycopg2_connection
finally:
if psycopg2_connection is not None:
psycopg2_connection.close()
@contextmanager
def rls_transaction(value: str, parameter: str = POSTGRES_TENANT_VAR):
"""
Creates a new database transaction setting the given configuration value for Postgres RLS. It validates
that the value is a valid UUID.
Args:
value (str): Database configuration parameter value.
parameter (str): Database configuration parameter name, by default is 'api.tenant_id'.
"""
with transaction.atomic():
with connection.cursor() as cursor:
try:
# just in case the value is an UUID object
uuid.UUID(str(value))
except ValueError:
raise ValidationError("Must be a valid UUID")
cursor.execute(SET_CONFIG_QUERY, [parameter, value])
yield cursor
class CustomUserManager(BaseUserManager):
def create_user(self, email, password=None, **extra_fields):
if not email:
raise ValueError("The email field must be set")
email = self.normalize_email(email)
user = self.model(email=email, **extra_fields)
user.set_password(password)
user.save(using=self._db)
return user
def get_by_natural_key(self, email):
return self.get(email__iexact=email)
def enum_to_choices(enum_class):
"""
This function converts a Python Enum to a list of tuples, where the first element is the value and the second element is the name.
It's for use with Django's `choices` attribute, which expects a list of tuples.
"""
return [(item.value, item.name.replace("_", " ").title()) for item in enum_class]
def one_week_from_now():
"""
Return a datetime object with a date one week from now.
"""
return datetime.now(timezone.utc) + timedelta(days=7)
def generate_random_token(length: int = 14, symbols: str | None = None) -> str:
"""
Generate a random token with the specified length.
"""
_symbols = "23456789ABCDEFGHJKMNPQRSTVWXYZ"
return "".join(secrets.choice(symbols or _symbols) for _ in range(length))
def batch_delete(tenant_id, queryset, batch_size=settings.DJANGO_DELETION_BATCH_SIZE):
"""
Deletes objects in batches and returns the total number of deletions and a summary.
Args:
tenant_id (str): Tenant ID the queryset belongs to.
queryset (QuerySet): The queryset of objects to delete.
batch_size (int): The number of objects to delete in each batch.
Returns:
tuple: (total_deleted, deletion_summary)
"""
total_deleted = 0
deletion_summary = {}
while True:
with rls_transaction(tenant_id, POSTGRES_TENANT_VAR):
# Get a batch of IDs to delete
batch_ids = set(
queryset.values_list("id", flat=True).order_by("id")[:batch_size]
)
if not batch_ids:
# No more objects to delete
break
deleted_count, deleted_info = queryset.filter(id__in=batch_ids).delete()
total_deleted += deleted_count
for model_label, count in deleted_info.items():
deletion_summary[model_label] = deletion_summary.get(model_label, 0) + count
return total_deleted, deletion_summary
def delete_related_daily_task(provider_id: str):
"""
Deletes the periodic task associated with a specific provider.
Args:
provider_id (str): The unique identifier for the provider
whose related periodic task should be deleted.
"""
task_name = f"scan-perform-scheduled-{provider_id}"
PeriodicTask.objects.filter(name=task_name).delete()
# Postgres Enums
class PostgresEnumMigration:
def __init__(self, enum_name: str, enum_values: tuple):
self.enum_name = enum_name
self.enum_values = enum_values
def create_enum_type(self, apps, schema_editor): # noqa: F841
string_enum_values = ", ".join([f"'{value}'" for value in self.enum_values])
with schema_editor.connection.cursor() as cursor:
cursor.execute(
f"CREATE TYPE {self.enum_name} AS ENUM ({string_enum_values});"
)
def drop_enum_type(self, apps, schema_editor): # noqa: F841
with schema_editor.connection.cursor() as cursor:
cursor.execute(f"DROP TYPE {self.enum_name};")
class PostgresEnumField(models.Field):
def __init__(self, enum_type_name, *args, **kwargs):
self.enum_type_name = enum_type_name
super().__init__(*args, **kwargs)
def db_type(self, connection):
return self.enum_type_name
def from_db_value(self, value, expression, connection): # noqa: F841
return value
def to_python(self, value):
if isinstance(value, EnumType):
return value.value
return value
def get_prep_value(self, value):
if isinstance(value, EnumType):
return value.value
return value
class EnumType:
def __init__(self, value):
self.value = value
def __str__(self):
return self.value
def enum_adapter(enum_obj):
return AsIs(f"'{enum_obj.value}'::{enum_obj.__class__.enum_type_name}")
def get_enum_oid(connection, enum_type_name: str):
with connection.cursor() as cursor:
cursor.execute("SELECT oid FROM pg_type WHERE typname = %s;", (enum_type_name,))
result = cursor.fetchone()
if result is None:
raise ValueError(f"Enum type '{enum_type_name}' not found")
return result[0]
def register_enum(apps, schema_editor, enum_class): # noqa: F841
with psycopg_connection(schema_editor.connection.alias) as connection:
enum_oid = get_enum_oid(connection, enum_class.enum_type_name)
enum_instance = new_type(
(enum_oid,),
enum_class.enum_type_name,
lambda value, cur: value, # noqa: F841
)
register_type(enum_instance, connection)
register_adapter(enum_class, enum_adapter)
# Postgres enum definition for member role
class MemberRoleEnum(EnumType):
enum_type_name = "member_role"
class MemberRoleEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("member_role", *args, **kwargs)
# Postgres enum definition for Provider.provider
class ProviderEnum(EnumType):
enum_type_name = "provider"
class ProviderEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("provider", *args, **kwargs)
# Postgres enum definition for Scan.type
class ScanTriggerEnum(EnumType):
enum_type_name = "scan_trigger"
class ScanTriggerEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("scan_trigger", *args, **kwargs)
# Postgres enum definition for state
class StateEnum(EnumType):
enum_type_name = "state"
class StateEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("state", *args, **kwargs)
# Postgres enum definition for Finding.Delta
class FindingDeltaEnum(EnumType):
enum_type_name = "finding_delta"
class FindingDeltaEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("finding_delta", *args, **kwargs)
# Postgres enum definition for Severity
class SeverityEnum(EnumType):
enum_type_name = "severity"
class SeverityEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("severity", *args, **kwargs)
# Postgres enum definition for Status
class StatusEnum(EnumType):
enum_type_name = "status"
class StatusEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("status", *args, **kwargs)
# Postgres enum definition for Provider secrets type
class ProviderSecretTypeEnum(EnumType):
enum_type_name = "provider_secret_type"
class ProviderSecretTypeEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("provider_secret_type", *args, **kwargs)
# Postgres enum definition for Provider secrets type
class InvitationStateEnum(EnumType):
enum_type_name = "invitation_state"
class InvitationStateEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("invitation_state", *args, **kwargs)
# Postgres enum definition for Integration type
class IntegrationTypeEnum(EnumType):
enum_type_name = "integration_type"
class IntegrationTypeEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("integration_type", *args, **kwargs)


@@ -1,68 +0,0 @@
import uuid
from functools import wraps
from django.db import connection, transaction
from rest_framework_json_api.serializers import ValidationError
from api.db_utils import POSTGRES_TENANT_VAR, SET_CONFIG_QUERY
def set_tenant(func=None, *, keep_tenant=False):
"""
Decorator to set the tenant context for a Celery task based on the provided tenant_id.
This decorator extracts the `tenant_id` from the task's keyword arguments,
and uses it to set the tenant context for the current database session.
The `tenant_id` is then removed from the kwargs before the task function
is executed. If `tenant_id` is not provided, a KeyError is raised.
Args:
func (function): The Celery task function to be decorated.
Raises:
KeyError: If `tenant_id` is not found in the task's keyword arguments.
Returns:
function: The wrapped function with tenant context set.
Example:
# This decorator MUST be defined last in the decorator chain
@shared_task
@set_tenant
def some_task(arg1, **kwargs):
# Task logic here
pass
# When calling the task
some_task.delay(arg1, tenant_id="8db7ca86-03cc-4d42-99f6-5e480baf6ab5")
# The tenant context will be set before the task logic executes.
"""
def decorator(func):
@wraps(func)
@transaction.atomic
def wrapper(*args, **kwargs):
try:
if not keep_tenant:
tenant_id = kwargs.pop("tenant_id")
else:
tenant_id = kwargs["tenant_id"]
except KeyError:
raise KeyError("This task requires the tenant_id")
try:
uuid.UUID(tenant_id)
except ValueError:
raise ValidationError("Tenant ID must be a valid UUID")
with connection.cursor() as cursor:
cursor.execute(SET_CONFIG_QUERY, [POSTGRES_TENANT_VAR, tenant_id])
return func(*args, **kwargs)
return wrapper
if func is None:
return decorator
else:
return decorator(func)


@@ -1,45 +0,0 @@
from django.core.exceptions import ValidationError as django_validation_error
from rest_framework import status
from rest_framework.exceptions import APIException
from rest_framework_json_api.exceptions import exception_handler
from rest_framework_json_api.serializers import ValidationError
from rest_framework_simplejwt.exceptions import TokenError, InvalidToken
class ModelValidationError(ValidationError):
def __init__(
self,
detail: str | None = None,
code: str | None = None,
pointer: str | None = None,
status_code: int = 400,
):
super().__init__(
detail=[
{
"detail": detail,
"status": str(status_code),
"source": {"pointer": pointer},
"code": code,
}
]
)
class InvitationTokenExpiredException(APIException):
status_code = status.HTTP_410_GONE
default_detail = "The invitation token has expired and is no longer valid."
default_code = "token_expired"
def custom_exception_handler(exc, context):
if isinstance(exc, django_validation_error):
if hasattr(exc, "error_dict"):
exc = ValidationError(exc.message_dict)
else:
exc = ValidationError(detail=exc.messages[0], code=exc.code)
elif isinstance(exc, (TokenError, InvalidToken)):
exc.detail["messages"] = [
message_item["message"] for message_item in exc.detail["messages"]
]
return exception_handler(exc, context)


@@ -1,662 +0,0 @@
from datetime import date, datetime, timedelta, timezone
from django.conf import settings
from django.db.models import Q
from django_filters.rest_framework import (
BaseInFilter,
BooleanFilter,
CharFilter,
ChoiceFilter,
DateFilter,
FilterSet,
UUIDFilter,
)
from rest_framework_json_api.django_filters.backends import DjangoFilterBackend
from rest_framework_json_api.serializers import ValidationError
from api.db_utils import (
FindingDeltaEnumField,
InvitationStateEnumField,
ProviderEnumField,
SeverityEnumField,
StatusEnumField,
)
from api.models import (
ComplianceOverview,
Finding,
Integration,
Invitation,
Membership,
PermissionChoices,
Provider,
ProviderGroup,
ProviderSecret,
Resource,
ResourceTag,
Role,
Scan,
ScanSummary,
SeverityChoices,
StateChoices,
StatusChoices,
Task,
User,
)
from api.rls import Tenant
from api.uuid_utils import (
datetime_to_uuid7,
transform_into_uuid7,
uuid7_end,
uuid7_range,
uuid7_start,
)
from api.v1.serializers import TaskBase
class CustomDjangoFilterBackend(DjangoFilterBackend):
def to_html(self, _request, _queryset, _view):
"""Override this method to use the Browsable API in dev environments.
This disables the HTML render for the default filter.
"""
return None
def get_filterset_class(self, view, queryset=None):
# Check if the view has 'get_filterset_class' method
if hasattr(view, "get_filterset_class"):
return view.get_filterset_class()
# Fallback to the default implementation
return super().get_filterset_class(view, queryset)
class UUIDInFilter(BaseInFilter, UUIDFilter):
pass
class CharInFilter(BaseInFilter, CharFilter):
pass
class ChoiceInFilter(BaseInFilter, ChoiceFilter):
pass
class TenantFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
class Meta:
model = Tenant
fields = {
"name": ["exact", "icontains"],
"inserted_at": ["date", "gte", "lte"],
"updated_at": ["gte", "lte"],
}
class MembershipFilter(FilterSet):
date_joined = DateFilter(field_name="date_joined", lookup_expr="date")
role = ChoiceFilter(choices=Membership.RoleChoices.choices)
class Meta:
model = Membership
fields = {
"tenant": ["exact"],
"role": ["exact"],
"date_joined": ["date", "gte", "lte"],
}
class ProviderFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
connected = BooleanFilter()
provider = ChoiceFilter(choices=Provider.ProviderChoices.choices)
class Meta:
model = Provider
fields = {
"provider": ["exact", "in"],
"id": ["exact", "in"],
"uid": ["exact", "icontains", "in"],
"alias": ["exact", "icontains", "in"],
"inserted_at": ["gte", "lte"],
"updated_at": ["gte", "lte"],
}
filter_overrides = {
ProviderEnumField: {
"filter_class": CharFilter,
},
}
class ProviderRelationshipFilterSet(FilterSet):
provider_type = ChoiceFilter(
choices=Provider.ProviderChoices.choices, field_name="provider__provider"
)
provider_type__in = ChoiceInFilter(
choices=Provider.ProviderChoices.choices, field_name="provider__provider"
)
provider_uid = CharFilter(field_name="provider__uid", lookup_expr="exact")
provider_uid__in = CharInFilter(field_name="provider__uid", lookup_expr="in")
provider_uid__icontains = CharFilter(
field_name="provider__uid", lookup_expr="icontains"
)
provider_alias = CharFilter(field_name="provider__alias", lookup_expr="exact")
provider_alias__in = CharInFilter(field_name="provider__alias", lookup_expr="in")
provider_alias__icontains = CharFilter(
field_name="provider__alias", lookup_expr="icontains"
)
class ProviderGroupFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
class Meta:
model = ProviderGroup
fields = {
"id": ["exact", "in"],
"name": ["exact", "in"],
"inserted_at": ["gte", "lte"],
"updated_at": ["gte", "lte"],
}
class ScanFilter(ProviderRelationshipFilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
completed_at = DateFilter(field_name="completed_at", lookup_expr="date")
started_at = DateFilter(field_name="started_at", lookup_expr="date")
next_scan_at = DateFilter(field_name="next_scan_at", lookup_expr="date")
trigger = ChoiceFilter(choices=Scan.TriggerChoices.choices)
state = ChoiceFilter(choices=StateChoices.choices)
state__in = ChoiceInFilter(
field_name="state", choices=StateChoices.choices, lookup_expr="in"
)
class Meta:
model = Scan
fields = {
"provider": ["exact", "in"],
"name": ["exact", "icontains"],
"started_at": ["gte", "lte"],
"next_scan_at": ["gte", "lte"],
"trigger": ["exact"],
}
class TaskFilter(FilterSet):
name = CharFilter(field_name="task_runner_task__task_name", lookup_expr="exact")
name__icontains = CharFilter(
field_name="task_runner_task__task_name", lookup_expr="icontains"
)
state = ChoiceFilter(
choices=StateChoices.choices, method="filter_state", lookup_expr="exact"
)
task_state_inverse_mapping_values = {
v: k for k, v in TaskBase.state_mapping.items()
}
def filter_state(self, queryset, name, value):
if value not in StateChoices:
raise ValidationError(
f"Invalid provider value: '{value}'. Valid values are: "
f"{', '.join(StateChoices)}"
)
return queryset.filter(
task_runner_task__status=self.task_state_inverse_mapping_values[value]
)
class Meta:
model = Task
fields = []
class ResourceTagFilter(FilterSet):
class Meta:
model = ResourceTag
fields = {
"key": ["exact", "icontains"],
"value": ["exact", "icontains"],
}
search = ["text_search"]
class ResourceFilter(ProviderRelationshipFilterSet):
tag_key = CharFilter(method="filter_tag_key")
tag_value = CharFilter(method="filter_tag_value")
tag = CharFilter(method="filter_tag")
tags = CharFilter(method="filter_tag")
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
class Meta:
model = Resource
fields = {
"provider": ["exact", "in"],
"uid": ["exact", "icontains"],
"name": ["exact", "icontains"],
"region": ["exact", "icontains", "in"],
"service": ["exact", "icontains", "in"],
"type": ["exact", "icontains", "in"],
"inserted_at": ["gte", "lte"],
"updated_at": ["gte", "lte"],
}
def filter_tag_key(self, queryset, name, value):
return queryset.filter(Q(tags__key=value) | Q(tags__key__icontains=value))
def filter_tag_value(self, queryset, name, value):
return queryset.filter(Q(tags__value=value) | Q(tags__value__icontains=value))
def filter_tag(self, queryset, name, value):
# We won't know what the user wants to filter on just based on the value,
# and we don't want to build special filtering logic for every possible
# provider tag spec, so we'll just do a full text search
return queryset.filter(tags__text_search=value)
class FindingFilter(FilterSet):
# We filter providers from the scan in findings
provider = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_type__in = ChoiceInFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_uid = CharFilter(field_name="scan__provider__uid", lookup_expr="exact")
provider_uid__in = CharInFilter(field_name="scan__provider__uid", lookup_expr="in")
provider_uid__icontains = CharFilter(
field_name="scan__provider__uid", lookup_expr="icontains"
)
provider_alias = CharFilter(field_name="scan__provider__alias", lookup_expr="exact")
provider_alias__in = CharInFilter(
field_name="scan__provider__alias", lookup_expr="in"
)
provider_alias__icontains = CharFilter(
field_name="scan__provider__alias", lookup_expr="icontains"
)
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
uid = CharFilter(field_name="uid")
delta = ChoiceFilter(choices=Finding.DeltaChoices.choices)
status = ChoiceFilter(choices=StatusChoices.choices)
severity = ChoiceFilter(choices=SeverityChoices)
impact = ChoiceFilter(choices=SeverityChoices)
muted = BooleanFilter(
help_text="If this filter is not provided, muted and non-muted findings will be returned."
)
resources = UUIDInFilter(field_name="resource__id", lookup_expr="in")
region = CharFilter(field_name="resources__region")
region__in = CharInFilter(field_name="resources__region", lookup_expr="in")
region__icontains = CharFilter(
field_name="resources__region", lookup_expr="icontains"
)
service = CharFilter(field_name="resources__service")
service__in = CharInFilter(field_name="resources__service", lookup_expr="in")
service__icontains = CharFilter(
field_name="resources__service", lookup_expr="icontains"
)
resource_uid = CharFilter(field_name="resources__uid")
resource_uid__in = CharInFilter(field_name="resources__uid", lookup_expr="in")
resource_uid__icontains = CharFilter(
field_name="resources__uid", lookup_expr="icontains"
)
resource_name = CharFilter(field_name="resources__name")
resource_name__in = CharInFilter(field_name="resources__name", lookup_expr="in")
resource_name__icontains = CharFilter(
field_name="resources__name", lookup_expr="icontains"
)
resource_type = CharFilter(field_name="resources__type")
resource_type__in = CharInFilter(field_name="resources__type", lookup_expr="in")
resource_type__icontains = CharFilter(
field_name="resources__type", lookup_expr="icontains"
)
# Temporarily disabled until we implement tag filtering in the UI
# resource_tag_key = CharFilter(field_name="resources__tags__key")
# resource_tag_key__in = CharInFilter(
# field_name="resources__tags__key", lookup_expr="in"
# )
# resource_tag_key__icontains = CharFilter(
# field_name="resources__tags__key", lookup_expr="icontains"
# )
# resource_tag_value = CharFilter(field_name="resources__tags__value")
# resource_tag_value__in = CharInFilter(
# field_name="resources__tags__value", lookup_expr="in"
# )
# resource_tag_value__icontains = CharFilter(
# field_name="resources__tags__value", lookup_expr="icontains"
# )
# resource_tags = CharInFilter(
# method="filter_resource_tag",
# lookup_expr="in",
# help_text="Filter by resource tags `key:value` pairs.\nMultiple values may be "
# "separated by commas.",
# )
scan = UUIDFilter(method="filter_scan_id")
scan__in = UUIDInFilter(method="filter_scan_id_in")
inserted_at = DateFilter(method="filter_inserted_at", lookup_expr="date")
inserted_at__date = DateFilter(method="filter_inserted_at", lookup_expr="date")
inserted_at__gte = DateFilter(
method="filter_inserted_at_gte",
help_text=f"Maximum date range is {settings.FINDINGS_MAX_DAYS_IN_RANGE} days.",
)
inserted_at__lte = DateFilter(
method="filter_inserted_at_lte",
help_text=f"Maximum date range is {settings.FINDINGS_MAX_DAYS_IN_RANGE} days.",
)
class Meta:
model = Finding
fields = {
"id": ["exact", "in"],
"uid": ["exact", "in"],
"scan": ["exact", "in"],
"delta": ["exact", "in"],
"status": ["exact", "in"],
"severity": ["exact", "in"],
"impact": ["exact", "in"],
"check_id": ["exact", "in", "icontains"],
"inserted_at": ["date", "gte", "lte"],
"updated_at": ["gte", "lte"],
}
filter_overrides = {
FindingDeltaEnumField: {
"filter_class": CharFilter,
},
StatusEnumField: {
"filter_class": CharFilter,
},
SeverityEnumField: {
"filter_class": CharFilter,
},
}
def filter_queryset(self, queryset):
if not (self.data.get("scan") or self.data.get("scan__in")) and not (
self.data.get("inserted_at")
or self.data.get("inserted_at__date")
or self.data.get("inserted_at__gte")
or self.data.get("inserted_at__lte")
):
raise ValidationError(
[
{
"detail": "At least one date filter is required: filter[inserted_at], filter[inserted_at.gte], "
"or filter[inserted_at.lte].",
"status": 400,
"source": {"pointer": "/data/attributes/inserted_at"},
"code": "required",
}
]
)
gte_date = (
datetime.strptime(self.data.get("inserted_at__gte"), "%Y-%m-%d").date()
if self.data.get("inserted_at__gte")
else datetime.now(timezone.utc).date()
)
lte_date = (
datetime.strptime(self.data.get("inserted_at__lte"), "%Y-%m-%d").date()
if self.data.get("inserted_at__lte")
else datetime.now(timezone.utc).date()
)
if abs(lte_date - gte_date) > timedelta(
days=settings.FINDINGS_MAX_DAYS_IN_RANGE
):
raise ValidationError(
[
{
"detail": f"The date range cannot exceed {settings.FINDINGS_MAX_DAYS_IN_RANGE} days.",
"status": 400,
"source": {"pointer": "/data/attributes/inserted_at"},
"code": "invalid",
}
]
)
return super().filter_queryset(queryset)
# Convert filter values to UUIDv7 values for use with partitioning
def filter_scan_id(self, queryset, name, value):
try:
value_uuid = transform_into_uuid7(value)
start = uuid7_start(value_uuid)
end = uuid7_end(value_uuid, settings.FINDINGS_TABLE_PARTITION_MONTHS)
except ValidationError as validation_error:
detail = str(validation_error.detail[0])
raise ValidationError(
[
{
"detail": detail,
"status": 400,
"source": {"pointer": "/data/relationships/scan"},
"code": "invalid",
}
]
)
return (
queryset.filter(id__gte=start).filter(id__lt=end).filter(scan_id=value_uuid)
)
def filter_scan_id_in(self, queryset, name, value):
try:
uuid_list = [
transform_into_uuid7(value_uuid)
for value_uuid in value
if value_uuid is not None
]
start, end = uuid7_range(uuid_list)
except ValidationError as validation_error:
detail = str(validation_error.detail[0])
raise ValidationError(
[
{
"detail": detail,
"status": 400,
"source": {"pointer": "/data/relationships/scan"},
"code": "invalid",
}
]
)
if start == end:
return queryset.filter(id__gte=start).filter(scan_id__in=uuid_list)
else:
return (
queryset.filter(id__gte=start)
.filter(id__lt=end)
.filter(scan_id__in=uuid_list)
)
def filter_inserted_at(self, queryset, name, value):
datetime_value = self.maybe_date_to_datetime(value)
start = uuid7_start(datetime_to_uuid7(datetime_value))
end = uuid7_start(datetime_to_uuid7(datetime_value + timedelta(days=1)))
return queryset.filter(id__gte=start, id__lt=end)
def filter_inserted_at_gte(self, queryset, name, value):
datetime_value = self.maybe_date_to_datetime(value)
start = uuid7_start(datetime_to_uuid7(datetime_value))
return queryset.filter(id__gte=start)
def filter_inserted_at_lte(self, queryset, name, value):
datetime_value = self.maybe_date_to_datetime(value)
end = uuid7_start(datetime_to_uuid7(datetime_value + timedelta(days=1)))
return queryset.filter(id__lt=end)
def filter_resource_tag(self, queryset, name, value):
overall_query = Q()
for key_value_pair in value:
tag_key, tag_value = key_value_pair.split(":", 1)
overall_query |= Q(
resources__tags__key__icontains=tag_key,
resources__tags__value__icontains=tag_value,
)
return queryset.filter(overall_query).distinct()
@staticmethod
def maybe_date_to_datetime(value):
dt = value
if isinstance(value, date):
dt = datetime.combine(value, datetime.min.time(), tzinfo=timezone.utc)
return dt
class ProviderSecretFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
provider = UUIDFilter(field_name="provider__id", lookup_expr="exact")
class Meta:
model = ProviderSecret
fields = {
"name": ["exact", "icontains"],
}
class InvitationFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
expires_at = DateFilter(field_name="expires_at", lookup_expr="date")
state = ChoiceFilter(choices=Invitation.State.choices)
state__in = ChoiceInFilter(choices=Invitation.State.choices, lookup_expr="in")
class Meta:
model = Invitation
fields = {
"email": ["exact", "icontains"],
"inserted_at": ["date", "gte", "lte"],
"updated_at": ["date", "gte", "lte"],
"expires_at": ["date", "gte", "lte"],
"inviter": ["exact"],
}
filter_overrides = {
InvitationStateEnumField: {
"filter_class": CharFilter,
}
}
class UserFilter(FilterSet):
date_joined = DateFilter(field_name="date_joined", lookup_expr="date")
class Meta:
model = User
fields = {
"name": ["exact", "icontains"],
"email": ["exact", "icontains"],
"company_name": ["exact", "icontains"],
"date_joined": ["date", "gte", "lte"],
"is_active": ["exact"],
}
class RoleFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
permission_state = ChoiceFilter(
choices=PermissionChoices.choices, method="filter_permission_state"
)
def filter_permission_state(self, queryset, name, value):
return Role.filter_by_permission_state(queryset, value)
class Meta:
model = Role
fields = {
"id": ["exact", "in"],
"name": ["exact", "in"],
"inserted_at": ["gte", "lte"],
"updated_at": ["gte", "lte"],
}
class ComplianceOverviewFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
provider_type = ChoiceFilter(choices=Provider.ProviderChoices.choices)
provider_type__in = ChoiceInFilter(choices=Provider.ProviderChoices.choices)
scan_id = UUIDFilter(field_name="scan__id")
class Meta:
model = ComplianceOverview
fields = {
"inserted_at": ["date", "gte", "lte"],
"compliance_id": ["exact", "icontains"],
"framework": ["exact", "iexact", "icontains"],
"version": ["exact", "icontains"],
"region": ["exact", "icontains", "in"],
}
class ScanSummaryFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
provider_id = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider_type = ChoiceFilter(
field_name="scan__provider__provider", choices=Provider.ProviderChoices.choices
)
provider_type__in = ChoiceInFilter(
field_name="scan__provider__provider", choices=Provider.ProviderChoices.choices
)
region = CharFilter(field_name="region")
class Meta:
model = ScanSummary
fields = {
"inserted_at": ["date", "gte", "lte"],
"region": ["exact", "icontains", "in"],
}
class ServiceOverviewFilter(ScanSummaryFilter):
def is_valid(self):
# Check if at least one of the inserted_at filters is present
inserted_at_filters = [
self.data.get("inserted_at"),
self.data.get("inserted_at__gte"),
self.data.get("inserted_at__lte"),
]
if not any(inserted_at_filters):
raise ValidationError(
{
"inserted_at": [
"At least one of filter[inserted_at], filter[inserted_at__gte], or "
"filter[inserted_at__lte] is required."
]
}
)
return super().is_valid()
class IntegrationFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
integration_type = ChoiceFilter(choices=Integration.IntegrationChoices.choices)
integration_type__in = ChoiceInFilter(
choices=Integration.IntegrationChoices.choices,
field_name="integration_type",
lookup_expr="in",
)
class Meta:
model = Integration
fields = {
"inserted_at": ["date", "gte", "lte"],
}
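
The FindingFilter above never queries inserted_at directly: as its "Convert filter values to UUIDv7" comment notes, the findings table is partitioned on its UUIDv7 id, so date and scan filters are rewritten as id ranges that let PostgreSQL prune partitions. A minimal, self-contained sketch of the idea follows, assuming only the UUIDv7 layout (48-bit millisecond timestamp in the top bits); it does not reuse the project's uuid7_start/datetime_to_uuid7 helpers, whose implementations are not shown in this diff.

import uuid
from datetime import datetime, timedelta, timezone

def uuid7_lower_bound(dt: datetime) -> uuid.UUID:
    # A UUIDv7 embeds the Unix timestamp in milliseconds in its top 48 bits, so a
    # value built from that timestamp with the random bits zeroed sorts before every
    # UUID generated at or after `dt`.
    ms = int(dt.timestamp() * 1000)
    value = ms << 80          # timestamp -> bits 127..80
    value |= 0x7 << 76        # version 7
    value |= 0x2 << 62        # RFC 4122 variant
    return uuid.UUID(int=value)

# Restrict a queryset to findings inserted on 2024-10-18 without touching inserted_at,
# mirroring filter_inserted_at above:
day = datetime(2024, 10, 18, tzinfo=timezone.utc)
start = uuid7_lower_bound(day)
end = uuid7_lower_bound(day + timedelta(days=1))
# queryset.filter(id__gte=start, id__lt=end)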

View File

@@ -1,28 +0,0 @@
[
{
"model": "api.user",
"pk": "8b38e2eb-6689-4f1e-a4ba-95b275130200",
"fields": {
"password": "pbkdf2_sha256$720000$vA62S78kog2c2ytycVQdke$Fp35GVLLMyy5fUq3krSL9I02A+ocQ+RVa4S22LIAO5s=",
"last_login": null,
"name": "Devie Prowlerson",
"email": "dev@prowler.com",
"company_name": "Prowler Developers",
"is_active": true,
"date_joined": "2024-09-17T09:04:20.850Z"
}
},
{
"model": "api.user",
"pk": "b6493a3a-c997-489b-8b99-278bf74de9f6",
"fields": {
"password": "pbkdf2_sha256$720000$vA62S78kog2c2ytycVQdke$Fp35GVLLMyy5fUq3krSL9I02A+ocQ+RVa4S22LIAO5s=",
"last_login": null,
"name": "Devietoo Prowlerson",
"email": "dev2@prowler.com",
"company_name": "Prowler Developers",
"is_active": true,
"date_joined": "2024-09-18T09:04:20.850Z"
}
}
]

View File

@@ -1,50 +0,0 @@
[
{
"model": "api.tenant",
"pk": "12646005-9067-4d2a-a098-8bb378604362",
"fields": {
"inserted_at": "2024-03-21T23:00:00Z",
"updated_at": "2024-03-21T23:00:00Z",
"name": "Tenant1"
}
},
{
"model": "api.tenant",
"pk": "0412980b-06e3-436a-ab98-3c9b1d0333d3",
"fields": {
"inserted_at": "2024-03-21T23:00:00Z",
"updated_at": "2024-03-21T23:00:00Z",
"name": "Tenant2"
}
},
{
"model": "api.membership",
"pk": "2b0db93a-7e0b-4edf-a851-ea448676b7eb",
"fields": {
"user": "8b38e2eb-6689-4f1e-a4ba-95b275130200",
"tenant": "0412980b-06e3-436a-ab98-3c9b1d0333d3",
"role": "owner",
"date_joined": "2024-09-19T11:03:59.712Z"
}
},
{
"model": "api.membership",
"pk": "797d7cee-abc9-4598-98bb-4bf4bfb97f27",
"fields": {
"user": "8b38e2eb-6689-4f1e-a4ba-95b275130200",
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"role": "owner",
"date_joined": "2024-09-19T11:02:59.712Z"
}
},
{
"model": "api.membership",
"pk": "dea37563-7009-4dcf-9f18-25efb41462a7",
"fields": {
"user": "b6493a3a-c997-489b-8b99-278bf74de9f6",
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"role": "member",
"date_joined": "2024-09-19T11:03:59.712Z"
}
}
]

View File

@@ -1,193 +0,0 @@
[
{
"model": "api.provider",
"pk": "37b065f8-26b0-4218-a665-0b23d07b27d9",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-08-01T17:20:27.050Z",
"updated_at": "2024-08-01T17:20:27.050Z",
"provider": "gcp",
"uid": "a12322-test321",
"alias": "gcp_testing_2",
"connected": null,
"connection_last_checked_at": null,
"metadata": {}
}
},
{
"model": "api.provider",
"pk": "8851db6b-42e5-4533-aa9e-30a32d67e875",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-08-01T17:19:42.453Z",
"updated_at": "2024-08-01T17:19:42.453Z",
"provider": "gcp",
"uid": "a12345-test123",
"alias": "gcp_testing_1",
"connected": null,
"connection_last_checked_at": null,
"metadata": {}
}
},
{
"model": "api.provider",
"pk": "b85601a8-4b45-4194-8135-03fb980ef428",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-08-01T17:19:09.556Z",
"updated_at": "2024-08-01T17:19:09.556Z",
"provider": "aws",
"uid": "123456789020",
"alias": "aws_testing_2",
"connected": null,
"connection_last_checked_at": null,
"metadata": {}
}
},
{
"model": "api.provider",
"pk": "baa7b895-8bac-4f47-b010-4226d132856e",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-08-01T17:20:16.962Z",
"updated_at": "2024-08-01T17:20:16.962Z",
"provider": "gcp",
"uid": "a12322-test123",
"alias": "gcp_testing_3",
"connected": null,
"connection_last_checked_at": null,
"metadata": {}
}
},
{
"model": "api.provider",
"pk": "d7c7ea89-d9af-423b-a364-1290dcad5a01",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-08-01T17:18:58.132Z",
"updated_at": "2024-08-01T17:18:58.132Z",
"provider": "aws",
"uid": "123456789015",
"alias": "aws_testing_1",
"connected": null,
"connection_last_checked_at": null,
"metadata": {}
}
},
{
"model": "api.provider",
"pk": "1b59e032-3eb6-4694-93a5-df84cd9b3ce2",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-08-06T16:03:26.176Z",
"updated_at": "2024-08-06T16:03:26.176Z",
"provider": "azure",
"uid": "8851db6b-42e5-4533-aa9e-30a32d67e875",
"alias": "azure_testing",
"connected": null,
"connection_last_checked_at": null,
"metadata": {},
"scanner_args": {}
}
},
{
"model": "api.provider",
"pk": "26e55a24-cb2c-4cef-ac87-6f91fddb2c97",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-08-06T16:03:07.037Z",
"updated_at": "2024-08-06T16:03:07.037Z",
"provider": "kubernetes",
"uid": "kubernetes-test-12345",
"alias": "k8s_testing",
"connected": null,
"connection_last_checked_at": null,
"metadata": {},
"scanner_args": {}
}
},
{
"model": "api.provider",
"pk": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:45:26.352Z",
"updated_at": "2024-10-18T11:16:23.533Z",
"provider": "aws",
"uid": "106908755759",
"alias": "real testing aws provider",
"connected": true,
"connection_last_checked_at": "2024-10-18T11:16:23.503Z",
"metadata": {},
"scanner_args": {}
}
},
{
"model": "api.provider",
"pk": "7791914f-d646-4fe2-b2ed-73f2c6499a36",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:45:26.352Z",
"updated_at": "2024-10-18T11:16:23.533Z",
"provider": "kubernetes",
"uid": "gke_lucky-coast-419309_us-central1_autopilot-cluster-2",
"alias": "k8s_testing_2",
"connected": true,
"connection_last_checked_at": "2024-10-18T11:16:23.503Z",
"metadata": {},
"scanner_args": {}
}
},
{
"model": "api.providersecret",
"pk": "11491b47-75ae-4f71-ad8d-3e630a72182e",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-11T08:03:05.026Z",
"updated_at": "2024-10-11T08:04:47.033Z",
"name": "GCP static secrets",
"secret_type": "static",
"_secret": "Z0FBQUFBQm5DTndmZW9KakRZUHM2UHhQN2V3RzN0QmM1cERham8yMHp5cnVTT0lzdGFyS1FuVmJXUlpYSGsyU0cxR3RMMEdQYXlYMUVsaWtqLU1OZWlaVUp6OFREYlotZTVBY3BuTlZYbm9YcUJydzAxV2p5dkpLamI1Y2tUYzA0MmJUNWxsNTBRM0E1SDRCa0pPQWVlb05YU3dfeUhkLTRmOEh3dGczOGh1ZGhQcVdZdVAtYmtoSWlwNXM4VGFoVmF3dno2X1hrbk5GZjZTWjVuWEdEZUFXeHJSQjEzbTlVakhNdzYyWTdiVEpvUEc2MTNpRzUtczhEank1eGI0b3MyMlAyaGN6dlByZmtUWHByaDNUYWFqYS1tYnNBUkRKTzBacFNSRjFuVmd5bUtFUEJhd1ZVS1ZDd2xSUV9PaEtLTnc0XzVkY2lhM01WTjQwaWdJSk9wNUJSXzQ4RUNQLXFPNy1VdzdPYkZyWkVkU3RyQjVLTS1MVHN0R3k4THNKZ2NBNExaZnl3Q1EwN2dwNGRsUXptMjB0LXUzTUpzTDE2Q1hmS0ZSN2g1ZjBPeV8taFoxNUwxc2FEcktXX0dCM1IzeUZTTHNiTmNxVXBvNWViZTJScUVWV2VYTFQ4UHlid21PY1A0UjdNMGtERkZCd0lLMlJENDMzMVZUM09DQ0twd1N3VHlZd09XLUctOWhYcFJIR1p5aUlZeEUzejc2dWRYdGNsd0xOODNqRUFEczhSTWNtWU0tdFZ1ZTExaHNHUVYtd0Zxdld1LTdKVUNINzlZTGdHODhKeVVpQmRZMHRUNTJRRWhwS1F1Y3I2X2Iwc0c1NHlXSVRLZWxreEt0dVRnOTZFMkptU2VMS1dWXzdVOVRzMUNUWXM2aFlxVDJXdGo3d2cxSVZGWlI2ZWhIZzZBcEl4bEJ6UnVHc0RYWVNHcjFZUHI5ZUYyWG9rSlo0QUVSUkFCX3h2UmtJUTFzVXJUZ25vTmk2VzdoTTNta05ucmNfTi0yR1ZxN1E2MnZJOVVKOGxmMXMzdHMxVndmSVhQbUItUHgtMVpVcHJwMU5JVHJLb0Y1aHV5OEEwS0kzQkEtcFJkdkRnWGxmZnprNFhndWg1TmQyd09yTFdTRmZ3d2ZvZFUtWXp4a2VYb3JjckFIcE13MDUzX0RHSnlzM0N2ZE5IRzJzMXFMc0k4MDRyTHdLZFlWOG9SaFF0LU43Ynd6VFlEcVNvdFZ0emJEVk10aEp4dDZFTFNFNzk0UUo2WTlVLWRGYm1fanZHaFZreHBIMmtzVjhyS0xPTk9fWHhiVTJHQXZwVlVuY3JtSjFUYUdHQzhEaHFNZXhwUHBmY0kxaUVrOHo4a0FYOTdpZVJDbFRvdFlQeWo3eFZHX1ZMZ1Myc3prU3o2c3o2eXNja1U4N0Y1T0d1REVjZFRGNTByUkgyemVCSjlQYkY2bmJ4YTZodHB0cUNzd2xZcENycUdsczBIaEZPbG1jVUlqNlM2cEE3aGpVaWswTzBDLVFGUHM5UHhvM09saWNtaDhaNVlsc3FZdktKeWlheDF5OGhTODE2N3JWamdTZG5Fa3JSQ2ZUSEVfRjZOZXdreXRZLTBZRFhleVFFeC1YUzc0cWhYeEhobGxvdnZ3Rm15WFlBWXp0dm1DeTA5eExLeEFRRXVRSXBXdTNEaWdZZ3JDenItdDhoZlFiTzI0SGZ1c01FR1FNaFVweVBKR1YxWGRUMW1Mc2JVdW9raWR6UHk2ZTBnS05pV3oyZVBjREdkY3k4ZHZPUWE5S281MkJRSHF3NnpTclZ5bl90bk1wUEh6Tkp5dXlDcE5paWRqcVhxRFVObWIzRldWOGJ2aC1CRHZpbFZrb0hjNGpCMm5POGRiS2lETUpMLUVfQlhCdTZPLW9USW1LTFlTSF9zRUJYZ1NKeFFEQjNOR215ZXJDbkFndmcxWl9rWlk9",
"provider": "8851db6b-42e5-4533-aa9e-30a32d67e875"
}
},
{
"model": "api.providersecret",
"pk": "40191ad5-d8c2-40a9-826d-241397626b68",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-10T11:11:44.515Z",
"updated_at": "2024-10-11T07:59:56.102Z",
"name": "AWS static secrets",
"secret_type": "static",
"_secret": "Z0FBQUFBQm5DTnI4Y1RyV19UWEJzc3kzQUExcU5tdlQzbFVLeDdZMWd1MzkwWkl2UF9oZGhiVEJHVWpSMXV4MjYyN3g2OVpvNVpkQUQ3S0VGaGdQLTFhQWE3MkpWZUt2cnVhODc4d3FpY3FVZkpwdHJzNUJPeFRwZ3N4bGpPZTlkNWRNdFlwTHU3aTNWR3JjSzJwLWRITHdfQWpXb1F0c1l3bVFxbnFrTEpPTGgxcnF1VUprSzZ5dGRQU2VGYmZhTTlwbVpsNFBNWlFhVW9RbjJyYnZ5N0oweE5kV0ZEaUdpUUpNVExOa3oyQ2dNREVSenJ0TEFZc0RrRWpXNUhyMmtybGNLWDVOR0FabEl4QVR1bkZyb2hBLWc1MFNIekVyeXI0SmVreHBjRnJ1YUlVdXpVbW9JZkk0aEgxYlM1VGhSRlhtcS14YzdTYUhXR2xodElmWjZuNUVwaHozX1RVTG1QWHdPZWd4clNHYnAyOTBsWEl5UU83RGxZb0RKWjdadjlsTmJtSHQ0Yl9uaDJoODB0QV9sWmFYbFAxcjA1bmhNVlNqc2xEeHlvcUJFbVZvY250ZENnMnZLT1psb1JDclB3WVR6NGdZb2pzb3U4Ny04QlB0UTZub0dMOXZEUTZEcVJhZldCWEZZSDdLTy02UVZqck5zVTZwS3pObGlOejNJeHUzbFRabFM2V2xaekZVRjZtX3VzZlplendnOWQzT01WMFd3ejNadHVlTFlqRGR2dk5Da29zOFYwOUdOaEc4OHhHRnJFMmJFMk12VDNPNlBBTGlsXy13cUM1QkVYb0o1Z2U4ZXJnWXpZdm1sWjA5bzQzb2NFWC1xbmIycGZRbGtCaGNaOWlkX094UUNNampwbkZoREctNWI4QnZRaE8zM3BEQ1BwNzA1a3BzOGczZXdIM2s1NHFGN1ZTbmJhZkc4RVdfM0ZIZU5udTBYajd1RGxpWXZpRWdSMmhHa2RKOEIzbmM0X2F1OGxrN2p6LW9UVldDOFVpREoxZ1UzcTBZX19OQ0xJb0syWlhNSlQ4MzQwdzRtVG94Y01GS3FMLV95UVlxOTFORk8zdjE5VGxVaXdhbGlzeHdoYWNzazZWai1GUGtUM2gzR0ZWTTY4SThWeVFnZldIaklOTTJqTTg1VkhEYW5wNmdEVllXMmJCV2tpVmVYeUV2c0E1T00xbHJRNzgzVG9wb0Q1cV81UEhqYUFsQ2p1a0VpRDVINl9SVkpyZVRNVnVXQUxwY3NWZnJrNmRVREpiLWNHYUpXWmxkQlhNbWhuR1NmQ1BaVDlidUxCWHJMaHhZbk1FclVBaEVZeWg1ZlFoenZzRHlKbV8wa3lmMGZrd3NmTDZjQkE0UXNSUFhpTWtUUHBrX29BVzc4QzEtWEJIQW1GMGFuZVlXQWZIOXJEamloeGFCeHpYMHNjMFVfNXpQdlJfSkk2bzFROU5NU0c1SHREWW1nbkFNZFZ0UjdPRGdjaF96RGplY1hjdFFzLVR6MTVXYlRjbHIxQ2JRejRpVko5NWhBU0ZHR3ZvczU5elljRGpHRTdIc0FsSm5fUHEwT1gtTS1lN3M3X3ZZRnlkYUZoZXRQeEJsZlhLdFdTUzU1NUl4a29aOWZIdTlPM0Fnak1xYWVkYTNiMmZXUHlXS2lwUVBZLXQyaUxuRmtQNFFieE9SVmdZVW9WTHlzbnBPZlNIdGVHOE1LNVNESjN3cGtVSHVpT1NJWHE1ZzNmUTVTOC0xX3NGSmJqU19IbjZfQWtMRG1YNUQtRy13TUJIZFlyOXJkQzFQbkdZVXVzM2czbS1HWHFBT1pXdVd3N09tcG82SVhnY1ZtUWxqTEg2UzJCUmllb2pweVN2aGwwS1FVRUhjNEN2amRMc3MwVU4zN3dVMWM5Slg4SERtenFaQk1yMWx0LWtxVWtLZVVtbU4yejVEM2h6TEt0RGdfWE09",
"provider": "b85601a8-4b45-4194-8135-03fb980ef428"
}
},
{
"model": "api.providersecret",
"pk": "ed89d1ea-366a-4d12-a602-f2ab77019742",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-10T11:11:44.515Z",
"updated_at": "2024-10-11T07:59:56.102Z",
"name": "Azure static secrets",
"secret_type": "static",
"_secret": "Z0FBQUFBQm5DTnI4Y1RyV19UWEJzc3kzQUExcU5tdlQzbFVLeDdZMWd1MzkwWkl2UF9oZGhiVEJHVWpSMXV4MjYyN3g2OVpvNVpkQUQ3S0VGaGdQLTFhQWE3MkpWZUt2cnVhODc4d3FpY3FVZkpwdHJzNUJPeFRwZ3N4bGpPZTlkNWRNdFlwTHU3aTNWR3JjSzJwLWRITHdfQWpXb1F0c1l3bVFxbnFrTEpPTGgxcnF1VUprSzZ5dGRQU2VGYmZhTTlwbVpsNFBNWlFhVW9RbjJyYnZ5N0oweE5kV0ZEaUdpUUpNVExOa3oyQ2dNREVSenJ0TEFZc0RrRWpXNUhyMmtybGNLWDVOR0FabEl4QVR1bkZyb2hBLWc1MFNIekVyeXI0SmVreHBjRnJ1YUlVdXpVbW9JZkk0aEgxYlM1VGhSRlhtcS14YzdTYUhXR2xodElmWjZuNUVwaHozX1RVTG1QWHdPZWd4clNHYnAyOTBsWEl5UU83RGxZb0RKWjdadjlsTmJtSHQ0Yl9uaDJoODB0QV9sWmFYbFAxcjA1bmhNVlNqc2xEeHlvcUJFbVZvY250ZENnMnZLT1psb1JDclB3WVR6NGdZb2pzb3U4Ny04QlB0UTZub0dMOXZEUTZEcVJhZldCWEZZSDdLTy02UVZqck5zVTZwS3pObGlOejNJeHUzbFRabFM2V2xaekZVRjZtX3VzZlplendnOWQzT01WMFd3ejNadHVlTFlqRGR2dk5Da29zOFYwOUdOaEc4OHhHRnJFMmJFMk12VDNPNlBBTGlsXy13cUM1QkVYb0o1Z2U4ZXJnWXpZdm1sWjA5bzQzb2NFWC1xbmIycGZRbGtCaGNaOWlkX094UUNNampwbkZoREctNWI4QnZRaE8zM3BEQ1BwNzA1a3BzOGczZXdIM2s1NHFGN1ZTbmJhZkc4RVdfM0ZIZU5udTBYajd1RGxpWXZpRWdSMmhHa2RKOEIzbmM0X2F1OGxrN2p6LW9UVldDOFVpREoxZ1UzcTBZX19OQ0xJb0syWlhNSlQ4MzQwdzRtVG94Y01GS3FMLV95UVlxOTFORk8zdjE5VGxVaXdhbGlzeHdoYWNzazZWai1GUGtUM2gzR0ZWTTY4SThWeVFnZldIaklOTTJqTTg1VkhEYW5wNmdEVllXMmJCV2tpVmVYeUV2c0E1T00xbHJRNzgzVG9wb0Q1cV81UEhqYUFsQ2p1a0VpRDVINl9SVkpyZVRNVnVXQUxwY3NWZnJrNmRVREpiLWNHYUpXWmxkQlhNbWhuR1NmQ1BaVDlidUxCWHJMaHhZbk1FclVBaEVZeWg1ZlFoenZzRHlKbV8wa3lmMGZrd3NmTDZjQkE0UXNSUFhpTWtUUHBrX29BVzc4QzEtWEJIQW1GMGFuZVlXQWZIOXJEamloeGFCeHpYMHNjMFVfNXpQdlJfSkk2bzFROU5NU0c1SHREWW1nbkFNZFZ0UjdPRGdjaF96RGplY1hjdFFzLVR6MTVXYlRjbHIxQ2JRejRpVko5NWhBU0ZHR3ZvczU5elljRGpHRTdIc0FsSm5fUHEwT1gtTS1lN3M3X3ZZRnlkYUZoZXRQeEJsZlhLdFdTUzU1NUl4a29aOWZIdTlPM0Fnak1xYWVkYTNiMmZXUHlXS2lwUVBZLXQyaUxuRmtQNFFieE9SVmdZVW9WTHlzbnBPZlNIdGVHOE1LNVNESjN3cGtVSHVpT1NJWHE1ZzNmUTVTOC0xX3NGSmJqU19IbjZfQWtMRG1YNUQtRy13TUJIZFlyOXJkQzFQbkdZVXVzM2czbS1HWHFBT1pXdVd3N09tcG82SVhnY1ZtUWxqTEg2UzJCUmllb2pweVN2aGwwS1FVRUhjNEN2amRMc3MwVU4zN3dVMWM5Slg4SERtenFaQk1yMWx0LWtxVWtLZVVtbU4yejVEM2h6TEt0RGdfWE09",
"provider": "1b59e032-3eb6-4694-93a5-df84cd9b3ce2"
}
},
{
"model": "api.providersecret",
"pk": "ae48ecde-75cd-4814-92ab-18f48719e5d9",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:45:26.412Z",
"updated_at": "2024-10-18T10:45:26.412Z",
"name": "Valid AWS Credentials",
"secret_type": "static",
"_secret": "Z0FBQUFBQm5FanhHa3dXS0I3M2NmWm56SktiaGNqdDZUN0xQU1QwUi15QkhLZldFUmRENk1BXzlscG9JSUxVSTF5ekxuMkdEanlJNjhPUS1VSV9wVTBvU2l4ZnNGOVJhYW93RC1LTEhmc2pyOTJvUWwyWnpFY19WN1pRQk5IdDYwYnBDQnF1eU9nUzdwTGU3QU5qMGFyX1E4SXdpSk9paGVLcVpOVUhwb3duaXgxZ0ZxME5Pcm40QzBGWEZKY2lmRVlCMGFuVFVzemxuVjVNalZVQ2JsY2ZqNWt3Z01IYUZ0dk92YkdtSUZ5SlBvQWZoVU5DWlRFWmExNnJGVEY4Q1Bnd2VJUW9TSWdRcG9rSDNfREQwRld3Q1RYVnVYWVJLWWIxZmpsWGpwd0xQM0dtLTlYUjdHOVhhNklLWXFGTHpFQUVyVmNhYW9CU0tocGVyX3VjMkVEcVdjdFBfaVpsLTBzaUxrWTlta3dpelNtTG9xYVhBUHUzNUE4RnI1WXdJdHcxcFVfaG1XRHhDVFBKamxJb1FaQ2lsQ3FzRmxZbEJVemVkT1E2aHZfbDJqWDJPT3ViOWJGYzQ3eTNWNlFQSHBWRDFiV2tneDM4SmVqMU9Bd01TaXhPY2dmWG5RdENURkM2b2s5V3luVUZQcnFKNldnWEdYaWE2MnVNQkEwMHd6cUY5cVJkcGw4bHBtNzhPeHhkREdwSXNEc1JqQkxUR1FYRTV0UFNwbVlVSWF5LWgtbVhJZXlPZ0Q4cG9HX2E0Qld0LTF1TTFEVy1XNGdnQTRpLWpQQmFJUEdaOFJGNDVoUVJnQ25YVU5DTENMaTY4YmxtYWJFRERXTjAydVN2YnBDb3RkUE0zSDRlN1A3TXc4d2h1Wmd0LWUzZEcwMUstNUw2YnFyS2Z0NEVYMXllQW5GLVBpeU55SkNhczFIeFhrWXZpVXdwSFVrTDdiQjQtWHZJdERXVThzSnJsT2FNZzJDaUt6Y2NXYUZhUlo3VkY0R1BrSHNHNHprTmxjYmp1TXVKakRha0VtNmRFZWRmZHJWdnRCOVNjVGFVWjVQM3RwWWl4SkNmOU1pb2xqMFdOblhNY3Y3aERpOHFlWjJRc2dtRDkzZm1Qc29wdk5OQmJPbGk5ZUpGM1I2YzRJN2gxR3FEMllXR1pma1k0emVqSjZyMUliMGZsc3NfSlVDbGt4QzJTc3hHOU9FRHlZb09zVnlvcDR6WC1uclRSenI0Yy13WlFWNzJWRkwydjhmSjFZdnZ5X3NmZVF6UWRNMXo5STVyV3B0d09UUlFtOURITGhXSDVIUl9zYURJc05KWUNxekVyYkxJclNFNV9leEk4R2xsMGJod3lYeFIwaXR2dllwLTZyNWlXdDRpRkxVYkxWZFdvYUhKck5aeElBZUtKejNKS2tYVW1rTnVrRjJBQmdlZmV6ckozNjNwRmxLS1FaZzRVTTBZYzFFYi1idjBpZkQ3bWVvbEdRZXJrWFNleWZmSmFNdG1wQlp0YmxjWDV5T0tEbHRsYnNHbjRPRjl5MkttOUhRWlJtd1pmTnY4Z1lPRlZoTzFGVDdTZ0RDY1ByV0RndTd5LUNhcHNXUnNIeXdLMEw3WS1tektRTWFLQy1zakpMLWFiM3FOakE1UWU4LXlOX2VPbmd4MTZCRk9OY3Z4UGVDSWxhRlg4eHI4X1VUTDZZM0pjV0JDVi1UUjlTUl85cm1LWlZ0T1dzU0lpdWUwbXgtZ0l6eHNSNExRTV9MczJ6UkRkVElnRV9Rc0RoTDFnVHRZSEFPb2paX200TzZiRzVmRE5hOW5CTjh5Qi1WaEtueEpqRzJDY1luVWZtX1pseUpQSE5lQ0RrZ05EbWo5cU9MZ0ZkcXlqUll4UUkyejRfY2p4RXdEeC1PS1JIQVNUcmNIdkRJbzRiUktMWEQxUFM3aGNzeVFWUDdtcm5xNHlOYUU9",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555"
}
}
]

View File

@@ -1,256 +0,0 @@
[
{
"model": "api.scan",
"pk": "0191e280-9d2f-71c8-9b18-487a23ba185e",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"provider": "37b065f8-26b0-4218-a665-0b23d07b27d9",
"trigger": "manual",
"name": "test scan 1",
"state": "completed",
"unique_resource_count": 1,
"duration": 5,
"scanner_args": {
"checks_to_execute": ["accessanalyzer_enabled"]
},
"inserted_at": "2024-09-01T17:25:27.050Z",
"started_at": "2024-09-01T17:25:27.050Z",
"updated_at": "2024-09-01T17:25:27.050Z",
"completed_at": "2024-09-01T17:25:32.050Z"
}
},
{
"model": "api.scan",
"pk": "01920573-aa9c-73c9-bcda-f2e35c9b19d2",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"provider": "b85601a8-4b45-4194-8135-03fb980ef428",
"trigger": "manual",
"name": "test aws scan 2",
"state": "completed",
"unique_resource_count": 1,
"duration": 20,
"scanner_args": {
"checks_to_execute": ["accessanalyzer_enabled"]
},
"inserted_at": "2024-09-02T17:24:27.050Z",
"started_at": "2024-09-02T17:24:27.050Z",
"updated_at": "2024-09-02T17:24:27.050Z",
"completed_at": "2024-09-01T17:24:37.050Z"
}
},
{
"model": "api.scan",
"pk": "01920573-ea5b-77fd-a93f-1ed2ae12f728",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"provider": "baa7b895-8bac-4f47-b010-4226d132856e",
"trigger": "manual",
"name": "test gcp scan",
"state": "completed",
"unique_resource_count": 10,
"duration": 10,
"scanner_args": {
"checks_to_execute": ["cloudsql_instance_automated_backups"]
},
"inserted_at": "2024-09-02T19:26:27.050Z",
"started_at": "2024-09-02T19:26:27.050Z",
"updated_at": "2024-09-02T19:26:27.050Z",
"completed_at": "2024-09-01T17:26:37.050Z"
}
},
{
"model": "api.scan",
"pk": "01920573-ea5b-77fd-a93f-1ed2ae12f728",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"provider": "b85601a8-4b45-4194-8135-03fb980ef428",
"trigger": "manual",
"name": "test aws scan",
"state": "completed",
"unique_resource_count": 1,
"duration": 35,
"scanner_args": {
"checks_to_execute": ["accessanalyzer_enabled"]
},
"inserted_at": "2024-09-02T19:27:27.050Z",
"started_at": "2024-09-02T19:27:27.050Z",
"updated_at": "2024-09-02T19:27:27.050Z",
"completed_at": "2024-09-01T17:27:37.050Z"
}
},
{
"model": "api.scan",
"pk": "c281c924-23f3-4fcc-ac63-73a22154b7de",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"provider": "b85601a8-4b45-4194-8135-03fb980ef428",
"trigger": "scheduled",
"name": "test scheduled aws scan",
"state": "available",
"scanner_args": {
"checks_to_execute": ["cloudformation_stack_outputs_find_secrets"]
},
"scheduled_at": "2030-09-02T19:20:27.050Z",
"inserted_at": "2024-09-02T19:24:27.050Z",
"updated_at": "2024-09-02T19:24:27.050Z"
}
},
{
"model": "api.scan",
"pk": "25c8907c-b26e-4ec0-966b-a1f53a39d8e6",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"provider": "b85601a8-4b45-4194-8135-03fb980ef428",
"trigger": "scheduled",
"name": "test scheduled aws scan 2",
"state": "available",
"scanner_args": {
"checks_to_execute": [
"accessanalyzer_enabled",
"cloudformation_stack_outputs_find_secrets"
]
},
"scheduled_at": "2030-08-02T19:31:27.050Z",
"inserted_at": "2024-09-02T19:38:27.050Z",
"updated_at": "2024-09-02T19:38:27.050Z"
}
},
{
"model": "api.scan",
"pk": "25c8907c-b26e-4ec0-966b-a1f53a39d8e6",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"provider": "baa7b895-8bac-4f47-b010-4226d132856e",
"trigger": "scheduled",
"name": "test scheduled gcp scan",
"state": "available",
"scanner_args": {
"checks_to_execute": [
"cloudsql_instance_automated_backups",
"iam_audit_logs_enabled"
]
},
"scheduled_at": "2030-07-02T19:30:27.050Z",
"inserted_at": "2024-09-02T19:29:27.050Z",
"updated_at": "2024-09-02T19:29:27.050Z"
}
},
{
"model": "api.scan",
"pk": "25c8907c-b26e-4ec0-966b-a1f53a39d8e6",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"provider": "1b59e032-3eb6-4694-93a5-df84cd9b3ce2",
"trigger": "scheduled",
"name": "test scheduled azure scan",
"state": "available",
"scanner_args": {
"checks_to_execute": [
"aks_cluster_rbac_enabled",
"defender_additional_email_configured_with_a_security_contact"
]
},
"scheduled_at": "2030-08-05T19:32:27.050Z",
"inserted_at": "2024-09-02T19:29:27.050Z",
"updated_at": "2024-09-02T19:29:27.050Z"
}
},
{
"model": "api.scan",
"pk": "01929f3b-ed2e-7623-ad63-7c37cd37828f",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"name": "real scan 1",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"trigger": "manual",
"state": "completed",
"unique_resource_count": 19,
"progress": 100,
"scanner_args": {
"checks_to_execute": ["accessanalyzer_enabled"]
},
"duration": 7,
"scheduled_at": null,
"inserted_at": "2024-10-18T10:45:57.678Z",
"updated_at": "2024-10-18T10:46:05.127Z",
"started_at": "2024-10-18T10:45:57.909Z",
"completed_at": "2024-10-18T10:46:05.127Z"
}
},
{
"model": "api.scan",
"pk": "6dd8925f-a52d-48de-a546-d2d90db30ab1",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"name": "real scan azure",
"provider": "1b59e032-3eb6-4694-93a5-df84cd9b3ce2",
"trigger": "manual",
"state": "completed",
"unique_resource_count": 20,
"progress": 100,
"scanner_args": {
"checks_to_execute": [
"accessanalyzer_enabled",
"account_security_contact_information_is_registered"
]
},
"duration": 4,
"scheduled_at": null,
"inserted_at": "2024-10-18T11:16:21.358Z",
"updated_at": "2024-10-18T11:16:26.060Z",
"started_at": "2024-10-18T11:16:21.593Z",
"completed_at": "2024-10-18T11:16:26.060Z"
}
},
{
"model": "api.scan",
"pk": "4ca7ce89-3236-41a8-a369-8937bc152af5",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"name": "real scan k8s",
"provider": "7791914f-d646-4fe2-b2ed-73f2c6499a36",
"trigger": "manual",
"state": "completed",
"unique_resource_count": 20,
"progress": 100,
"scanner_args": {
"checks_to_execute": [
"accessanalyzer_enabled",
"account_security_contact_information_is_registered"
]
},
"duration": 4,
"scheduled_at": null,
"inserted_at": "2024-10-18T11:16:21.358Z",
"updated_at": "2024-10-18T11:16:26.060Z",
"started_at": "2024-10-18T11:16:21.593Z",
"completed_at": "2024-10-18T11:16:26.060Z"
}
},
{
"model": "api.scan",
"pk": "01929f57-c0ee-7553-be0b-cbde006fb6f7",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"name": "real scan 2",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"trigger": "manual",
"state": "completed",
"unique_resource_count": 20,
"progress": 100,
"scanner_args": {
"checks_to_execute": [
"accessanalyzer_enabled",
"account_security_contact_information_is_registered"
]
},
"duration": 4,
"scheduled_at": null,
"inserted_at": "2024-10-18T11:16:21.358Z",
"updated_at": "2024-10-18T11:16:26.060Z",
"started_at": "2024-10-18T11:16:21.593Z",
"completed_at": "2024-10-18T11:16:26.060Z"
}
}
]

View File

@@ -1,322 +0,0 @@
[
{
"model": "api.resource",
"pk": "0234477d-0b8e-439f-87d3-ce38dff3a434",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.772Z",
"updated_at": "2024-10-18T11:16:24.466Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root",
"name": "",
"region": "eu-south-2",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'2':9C '112233445566':4A 'accessanalyzer':10 'arn':1A 'aws':2A 'eu':7C 'eu-south':6C 'iam':3A 'other':11 'root':5A 'south':8C"
}
},
{
"model": "api.resource",
"pk": "17ce30a3-6e77-42a5-bb08-29dfcad7396a",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.882Z",
"updated_at": "2024-10-18T11:16:24.533Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root2",
"name": "",
"region": "eu-west-1",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'1':9C '112233445566':4A 'accessanalyzer':10 'arn':1A 'aws':2A 'eu':7C 'eu-west':6C 'iam':3A 'other':11 'root':5A 'west':8C"
}
},
{
"model": "api.resource",
"pk": "1f9de587-ba5b-415a-b9b0-ceed4c6c9f32",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.091Z",
"updated_at": "2024-10-18T11:16:24.637Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root3",
"name": "",
"region": "ap-northeast-2",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'2':9C '112233445566':4A 'accessanalyzer':10 'ap':7C 'ap-northeast':6C 'arn':1A 'aws':2A 'iam':3A 'northeast':8C 'other':11 'root':5A"
}
},
{
"model": "api.resource",
"pk": "29b35668-6dad-411d-bfec-492311889892",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.008Z",
"updated_at": "2024-10-18T11:16:24.600Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root4",
"name": "",
"region": "us-west-2",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'2':9C '112233445566':4A 'accessanalyzer':10 'arn':1A 'aws':2A 'iam':3A 'other':11 'root':5A 'us':7C 'us-west':6C 'west':8C"
}
},
{
"model": "api.resource",
"pk": "30505514-01d4-42bb-8b0c-471bbab27460",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:26.014Z",
"updated_at": "2024-10-18T11:16:26.023Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root5",
"name": "",
"region": "us-east-1",
"service": "account",
"type": "Other",
"text_search": "'1':9C '112233445566':4A 'account':10 'arn':1A 'aws':2A 'east':8C 'iam':3A 'other':11 'root':5A 'us':7C 'us-east':6C"
}
},
{
"model": "api.resource",
"pk": "372932f0-e4df-4968-9721-bb4f6236fae4",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.848Z",
"updated_at": "2024-10-18T11:16:24.516Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root6",
"name": "",
"region": "eu-west-3",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'3':9C '112233445566':4A 'accessanalyzer':10 'arn':1A 'aws':2A 'eu':7C 'eu-west':6C 'iam':3A 'other':11 'root':5A 'west':8C"
}
},
{
"model": "api.resource",
"pk": "3a37d124-7637-43f6-9df7-e9aa7ef98c53",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.979Z",
"updated_at": "2024-10-18T11:16:24.585Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root7",
"name": "",
"region": "sa-east-1",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'1':9C '112233445566':4A 'accessanalyzer':10 'arn':1A 'aws':2A 'east':8C 'iam':3A 'other':11 'root':5A 'sa':7C 'sa-east':6C"
}
},
{
"model": "api.resource",
"pk": "3c49318e-03c6-4f12-876f-40451ce7de3d",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.072Z",
"updated_at": "2024-10-18T11:16:24.630Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root8",
"name": "",
"region": "ap-southeast-2",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'2':9C '112233445566':4A 'accessanalyzer':10 'ap':7C 'ap-southeast':6C 'arn':1A 'aws':2A 'iam':3A 'other':11 'root':5A 'southeast':8C"
}
},
{
"model": "api.resource",
"pk": "430bf313-8733-4bc5-ac70-5402adfce880",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.994Z",
"updated_at": "2024-10-18T11:16:24.593Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root9",
"name": "",
"region": "eu-north-1",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'1':9C '112233445566':4A 'accessanalyzer':10 'arn':1A 'aws':2A 'eu':7C 'eu-north':6C 'iam':3A 'north':8C 'other':11 'root':5A"
}
},
{
"model": "api.resource",
"pk": "78bd2a52-82f9-45df-90a9-4ad78254fdc4",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.055Z",
"updated_at": "2024-10-18T11:16:24.622Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root10",
"name": "",
"region": "ap-northeast-1",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'1':9C '112233445566':4A 'accessanalyzer':10 'ap':7C 'ap-northeast':6C 'arn':1A 'aws':2A 'iam':3A 'northeast':8C 'other':11 'root':5A"
}
},
{
"model": "api.resource",
"pk": "7973e332-795e-4a74-b4d4-a53a21c98c80",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.896Z",
"updated_at": "2024-10-18T11:16:24.542Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root11",
"name": "",
"region": "us-east-2",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'2':9C '112233445566':4A 'accessanalyzer':10 'arn':1A 'aws':2A 'east':8C 'iam':3A 'other':11 'root':5A 'us':7C 'us-east':6C"
}
},
{
"model": "api.resource",
"pk": "8ca0a188-5699-436e-80fd-e566edaeb259",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.938Z",
"updated_at": "2024-10-18T11:16:24.565Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root12",
"name": "",
"region": "ca-central-1",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'1':9C '112233445566':4A 'accessanalyzer':10 'arn':1A 'aws':2A 'ca':7C 'ca-central':6C 'central':8C 'iam':3A 'other':11 'root':5A"
}
},
{
"model": "api.resource",
"pk": "8fe4514f-71d7-46ab-b0dc-70cef23b4d13",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.965Z",
"updated_at": "2024-10-18T11:16:24.578Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root13",
"name": "",
"region": "eu-west-2",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'2':9C '112233445566':4A 'accessanalyzer':10 'arn':1A 'aws':2A 'eu':7C 'eu-west':6C 'iam':3A 'other':11 'root':5A 'west':8C"
}
},
{
"model": "api.resource",
"pk": "9ab35225-dc7c-4ebd-bbc0-d81fb5d9de77",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.909Z",
"updated_at": "2024-10-18T11:16:24.549Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root14",
"name": "",
"region": "ap-south-1",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'1':9C '112233445566':4A 'accessanalyzer':10 'ap':7C 'ap-south':6C 'arn':1A 'aws':2A 'iam':3A 'other':11 'root':5A 'south':8C"
}
},
{
"model": "api.resource",
"pk": "9be26c1d-adf0-4ba8-9ca9-c740f4a0dc4e",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.863Z",
"updated_at": "2024-10-18T11:16:24.524Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root15",
"name": "",
"region": "eu-central-2",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'2':9C '112233445566':4A 'accessanalyzer':10 'arn':1A 'aws':2A 'central':8C 'eu':7C 'eu-central':6C 'iam':3A 'other':11 'root':5A"
}
},
{
"model": "api.resource",
"pk": "ba108c01-bcad-44f1-b211-c1d8985da89d",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.110Z",
"updated_at": "2024-10-18T11:16:24.644Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root16",
"name": "",
"region": "ap-northeast-3",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'3':9C '112233445566':4A 'accessanalyzer':10 'ap':7C 'ap-northeast':6C 'arn':1A 'aws':2A 'iam':3A 'northeast':8C 'other':11 'root':5A"
}
},
{
"model": "api.resource",
"pk": "dc6cfb5d-6835-4c7b-9152-c18c734a6eaa",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.038Z",
"updated_at": "2024-10-18T11:16:24.615Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root17",
"name": "",
"region": "eu-central-1",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'1':9C '112233445566':4A 'accessanalyzer':10 'arn':1A 'aws':2A 'central':8C 'eu':7C 'eu-central':6C 'iam':3A 'other':11 'root':5A"
}
},
{
"model": "api.resource",
"pk": "e0664164-cfda-44a4-b743-acee1c69386c",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.924Z",
"updated_at": "2024-10-18T11:16:24.557Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root18",
"name": "",
"region": "us-west-1",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'1':9C '112233445566':4A 'accessanalyzer':10 'arn':1A 'aws':2A 'iam':3A 'other':11 'root':5A 'us':7C 'us-west':6C 'west':8C"
}
},
{
"model": "api.resource",
"pk": "e1929daa-a984-4116-8131-492a48321dba",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.023Z",
"updated_at": "2024-10-18T11:16:24.607Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:iam::112233445566:root19",
"name": "",
"region": "ap-southeast-1",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'1':9C '112233445566':4A 'accessanalyzer':10 'ap':7C 'ap-southeast':6C 'arn':1A 'aws':2A 'iam':3A 'other':11 'root':5A 'southeast':8C"
}
},
{
"model": "api.resource",
"pk": "e37bb1f1-1669-4bb3-be86-e3378ddfbcba",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.952Z",
"updated_at": "2024-10-18T11:16:24.571Z",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"uid": "arn:aws:access-analyzer:us-east-1:112233445566:analyzer/ConsoleAnalyzer-83b66ad7-d024-454e-b851-52d11cc1cf7c",
"name": "",
"region": "us-east-1",
"service": "accessanalyzer",
"type": "Other",
"text_search": "'1':9A,15C '112233445566':10A 'access':4A 'access-analyzer':3A 'accessanalyzer':16 'analyzer':5A 'analyzer/consoleanalyzer-83b66ad7-d024-454e-b851-52d11cc1cf7c':11A 'arn':1A 'aws':2A 'east':8A,14C 'other':17 'us':7A,13C 'us-east':6A,12C"
}
}
]

File diff suppressed because it is too large

View File

@@ -1,153 +0,0 @@
[
{
"model": "api.providergroup",
"pk": "3fe28fb8-e545-424c-9b8f-69aff638f430",
"fields": {
"name": "first_group",
"inserted_at": "2024-11-13T11:36:19.503Z",
"updated_at": "2024-11-13T11:36:19.503Z",
"tenant": "12646005-9067-4d2a-a098-8bb378604362"
}
},
{
"model": "api.providergroup",
"pk": "525e91e7-f3f3-4254-bbc3-27ce1ade86b1",
"fields": {
"name": "second_group",
"inserted_at": "2024-11-13T11:36:25.421Z",
"updated_at": "2024-11-13T11:36:25.421Z",
"tenant": "12646005-9067-4d2a-a098-8bb378604362"
}
},
{
"model": "api.providergroup",
"pk": "481769f5-db2b-447b-8b00-1dee18db90ec",
"fields": {
"name": "third_group",
"inserted_at": "2024-11-13T11:36:37.603Z",
"updated_at": "2024-11-13T11:36:37.603Z",
"tenant": "12646005-9067-4d2a-a098-8bb378604362"
}
},
{
"model": "api.providergroupmembership",
"pk": "13625bd3-f428-4021-ac1b-b0bd41b6e02f",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"provider": "1b59e032-3eb6-4694-93a5-df84cd9b3ce2",
"provider_group": "3fe28fb8-e545-424c-9b8f-69aff638f430",
"inserted_at": "2024-11-13T11:55:17.138Z"
}
},
{
"model": "api.providergroupmembership",
"pk": "54784ebe-42d2-4937-aa6a-e21c62879567",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"provider_group": "3fe28fb8-e545-424c-9b8f-69aff638f430",
"inserted_at": "2024-11-13T11:55:17.138Z"
}
},
{
"model": "api.providergroupmembership",
"pk": "c8bd52d5-42a5-48fe-8e0a-3eef154b8ebe",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"provider_group": "525e91e7-f3f3-4254-bbc3-27ce1ade86b1",
"inserted_at": "2024-11-13T11:55:41.237Z"
}
},
{
"model": "api.role",
"pk": "3f01e759-bdf9-4a99-8888-1ab805b79f93",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"name": "admin_test",
"manage_users": true,
"manage_account": true,
"manage_billing": true,
"manage_providers": true,
"manage_integrations": true,
"manage_scans": true,
"unlimited_visibility": true,
"inserted_at": "2024-11-20T15:32:42.402Z",
"updated_at": "2024-11-20T15:32:42.402Z"
}
},
{
"model": "api.role",
"pk": "845ff03a-87ef-42ba-9786-6577c70c4df0",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"name": "first_role",
"manage_users": true,
"manage_account": true,
"manage_billing": true,
"manage_providers": true,
"manage_integrations": false,
"manage_scans": false,
"unlimited_visibility": true,
"inserted_at": "2024-11-20T15:31:53.239Z",
"updated_at": "2024-11-20T15:31:53.239Z"
}
},
{
"model": "api.role",
"pk": "902d726c-4bd5-413a-a2a4-f7b4754b6b20",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"name": "third_role",
"manage_users": false,
"manage_account": false,
"manage_billing": false,
"manage_providers": false,
"manage_integrations": false,
"manage_scans": true,
"unlimited_visibility": false,
"inserted_at": "2024-11-20T15:34:05.440Z",
"updated_at": "2024-11-20T15:34:05.440Z"
}
},
{
"model": "api.roleprovidergrouprelationship",
"pk": "57fd024a-0a7f-49b4-a092-fa0979a07aaf",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"role": "3f01e759-bdf9-4a99-8888-1ab805b79f93",
"provider_group": "3fe28fb8-e545-424c-9b8f-69aff638f430",
"inserted_at": "2024-11-20T15:32:42.402Z"
}
},
{
"model": "api.roleprovidergrouprelationship",
"pk": "a3cd0099-1c13-4df1-a5e5-ecdfec561b35",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"role": "3f01e759-bdf9-4a99-8888-1ab805b79f93",
"provider_group": "481769f5-db2b-447b-8b00-1dee18db90ec",
"inserted_at": "2024-11-20T15:32:42.402Z"
}
},
{
"model": "api.roleprovidergrouprelationship",
"pk": "cfd84182-a058-40c2-af3c-0189b174940f",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"role": "3f01e759-bdf9-4a99-8888-1ab805b79f93",
"provider_group": "525e91e7-f3f3-4254-bbc3-27ce1ade86b1",
"inserted_at": "2024-11-20T15:32:42.402Z"
}
},
{
"model": "api.userrolerelationship",
"pk": "92339663-e954-4fd8-98fb-8bfe15949975",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"role": "3f01e759-bdf9-4a99-8888-1ab805b79f93",
"user": "8b38e2eb-6689-4f1e-a4ba-95b275130200",
"inserted_at": "2024-11-20T15:36:14.302Z"
}
}
]

File diff suppressed because one or more lines are too long

View File

@@ -1,237 +0,0 @@
import random
from datetime import datetime, timezone
from math import ceil
from uuid import uuid4
from django.core.management.base import BaseCommand
from tqdm import tqdm
from api.db_utils import rls_transaction
from api.models import (
Finding,
Provider,
Resource,
ResourceFindingMapping,
Scan,
StatusChoices,
)
from prowler.lib.check.models import CheckMetadata
class Command(BaseCommand):
help = "Populates the database with test data for performance testing."
def add_arguments(self, parser):
parser.add_argument(
"--tenant",
type=str,
required=True,
help="Tenant id for which the data will be populated.",
)
parser.add_argument(
"--resources",
type=int,
required=True,
help="The number of resources to create.",
)
parser.add_argument(
"--findings",
type=int,
required=True,
help="The number of findings to create.",
)
parser.add_argument(
"--batch", type=int, required=True, help="The batch size for bulk creation."
)
parser.add_argument(
"--alias",
type=str,
required=False,
help="Optional alias for the provider and scan",
)
def handle(self, *args, **options):
tenant_id = options["tenant"]
num_resources = options["resources"]
num_findings = options["findings"]
batch_size = options["batch"]
alias = options["alias"] or "Testing"
uid_token = str(uuid4())
self.stdout.write(self.style.NOTICE("Starting data population"))
self.stdout.write(self.style.NOTICE(f"\tTenant: {tenant_id}"))
self.stdout.write(self.style.NOTICE(f"\tAlias: {alias}"))
self.stdout.write(self.style.NOTICE(f"\tResources: {num_resources}"))
self.stdout.write(self.style.NOTICE(f"\tFindings: {num_findings}"))
self.stdout.write(self.style.NOTICE(f"\tBatch size: {batch_size}\n\n"))
# Resource metadata
possible_regions = [
"us-east-1",
"us-east-2",
"us-west-1",
"us-west-2",
"ca-central-1",
"eu-central-1",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"ap-southeast-1",
"ap-southeast-2",
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"sa-east-1",
]
possible_services = []
possible_types = []
bulk_check_metadata = CheckMetadata.get_bulk(provider="aws")
for check_metadata in bulk_check_metadata.values():
if check_metadata.ServiceName not in possible_services:
possible_services.append(check_metadata.ServiceName)
if (
check_metadata.ResourceType
and check_metadata.ResourceType not in possible_types
):
possible_types.append(check_metadata.ResourceType)
with rls_transaction(tenant_id):
provider, _ = Provider.all_objects.get_or_create(
tenant_id=tenant_id,
provider="aws",
connected=True,
uid=str(random.randint(100000000000, 999999999999)),
defaults={
"alias": alias,
},
)
with rls_transaction(tenant_id):
scan = Scan.all_objects.create(
tenant_id=tenant_id,
provider=provider,
name=alias,
trigger="manual",
state="executing",
progress=0,
started_at=datetime.now(timezone.utc),
)
scan_state = "completed"
try:
# Create resources
resources = []
for i in range(num_resources):
resources.append(
Resource(
tenant_id=tenant_id,
provider_id=provider.id,
uid=f"testing-{uid_token}-{i}",
name=f"Testing {uid_token}-{i}",
region=random.choice(possible_regions),
service=random.choice(possible_services),
type=random.choice(possible_types),
)
)
num_batches = ceil(len(resources) / batch_size)
self.stdout.write(self.style.WARNING("Creating resources..."))
for i in tqdm(range(0, len(resources), batch_size), total=num_batches):
with rls_transaction(tenant_id):
Resource.all_objects.bulk_create(resources[i : i + batch_size])
self.stdout.write(self.style.SUCCESS("Resources created successfully.\n\n"))
with rls_transaction(tenant_id):
scan.progress = 33
scan.save()
# Create Findings
findings = []
possible_deltas = ["new", "changed", None]
possible_severities = ["critical", "high", "medium", "low"]
findings_resources_mapping = []
for i in range(num_findings):
severity = random.choice(possible_severities)
check_id = random.randint(1, 1000)
assigned_resource_num = random.randint(0, len(resources) - 1)
assigned_resource = resources[assigned_resource_num]
findings_resources_mapping.append(assigned_resource_num)
findings.append(
Finding(
tenant_id=tenant_id,
scan=scan,
uid=f"testing-{uid_token}-{i}",
delta=random.choice(possible_deltas),
check_id=f"check-{check_id}",
status=random.choice(list(StatusChoices)),
severity=severity,
impact=severity,
raw_result={},
check_metadata={
"checktitle": f"Test title for check {check_id}",
"risk": f"Testing risk {uid_token}-{i}",
"provider": "aws",
"severity": severity,
"categories": ["category1", "category2", "category3"],
"description": "This is a random description that should not matter for testing purposes.",
"servicename": assigned_resource.service,
"resourcetype": assigned_resource.type,
},
)
)
num_batches = ceil(len(findings) / batch_size)
self.stdout.write(self.style.WARNING("Creating findings..."))
for i in tqdm(range(0, len(findings), batch_size), total=num_batches):
with rls_transaction(tenant_id):
Finding.all_objects.bulk_create(findings[i : i + batch_size])
self.stdout.write(self.style.SUCCESS("Findings created successfully.\n\n"))
with rls_transaction(tenant_id):
scan.progress = 66
scan.save()
# Create ResourceFindingMapping
mappings = []
for index, f in enumerate(findings):
mappings.append(
ResourceFindingMapping(
tenant_id=tenant_id,
resource=resources[findings_resources_mapping[index]],
finding=f,
)
)
num_batches = ceil(len(mappings) / batch_size)
self.stdout.write(
self.style.WARNING("Creating resource-finding mappings...")
)
for i in tqdm(range(0, len(mappings), batch_size), total=num_batches):
with rls_transaction(tenant_id):
ResourceFindingMapping.objects.bulk_create(
mappings[i : i + batch_size]
)
self.stdout.write(
self.style.SUCCESS(
"Resource-finding mappings created successfully.\n\n"
)
)
except Exception as e:
self.stdout.write(self.style.ERROR(f"Failed to populate test data: {e}"))
scan_state = "failed"
finally:
scan.completed_at = datetime.now(timezone.utc)
scan.duration = int(
(datetime.now(timezone.utc) - scan.started_at).total_seconds()
)
scan.progress = 100
scan.state = scan_state
scan.unique_resource_count = num_resources
with rls_transaction(tenant_id):
scan.save()
self.stdout.write(self.style.NOTICE("Successfully populated test data."))

View File

@@ -1,49 +0,0 @@
import logging
import time
from config.custom_logging import BackendLogger
def extract_auth_info(request) -> dict:
if getattr(request, "auth", None) is not None:
tenant_id = request.auth.get("tenant_id", "N/A")
user_id = request.auth.get("sub", "N/A")
else:
tenant_id, user_id = "N/A", "N/A"
return {"tenant_id": tenant_id, "user_id": user_id}
class APILoggingMiddleware:
"""
Middleware for logging API requests.
This middleware emits one log record per request, including the HTTP method, path, query parameters, response status code, request duration, and the tenant and user identifiers extracted from the authentication payload.
Args:
get_response (Callable): A callable to get the response, typically the next middleware or view.
"""
def __init__(self, get_response):
self.get_response = get_response
self.logger = logging.getLogger(BackendLogger.API)
def __call__(self, request):
request_start_time = time.time()
response = self.get_response(request)
duration = time.time() - request_start_time
auth_info = extract_auth_info(request)
self.logger.info(
"",
extra={
"user_id": auth_info["user_id"],
"tenant_id": auth_info["tenant_id"],
"method": request.method,
"path": request.path,
"query_params": request.GET.dict(),
"status_code": response.status_code,
"duration": duration,
},
)
return response
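
APILoggingMiddleware only takes effect once it is listed in the project's MIDDLEWARE setting. A sketch, assuming a dotted path of api.middleware; the module's real location is not visible in this diff.

# settings.py (illustrative)
MIDDLEWARE = [
    # ... Django's default middleware ...
    "api.middleware.APILoggingMiddleware",  # hypothetical dotted path
]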

File diff suppressed because it is too large

View File

@@ -1,23 +0,0 @@
from django.conf import settings
from django.db import migrations
from api.db_utils import DB_PROWLER_USER
DB_NAME = settings.DATABASES["default"]["NAME"]
class Migration(migrations.Migration):
dependencies = [
("api", "0001_initial"),
("token_blacklist", "0012_alter_outstandingtoken_user"),
]
operations = [
migrations.RunSQL(
f"""
GRANT SELECT, INSERT, UPDATE, DELETE ON token_blacklist_blacklistedtoken TO {DB_PROWLER_USER};
GRANT SELECT, INSERT, UPDATE, DELETE ON token_blacklist_outstandingtoken TO {DB_PROWLER_USER};
GRANT SELECT, DELETE ON django_admin_log TO {DB_PROWLER_USER};
"""
),
]

View File

@@ -1,23 +0,0 @@
# Generated by Django 5.1.1 on 2024-12-20 13:16
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0002_token_migrations"),
]
operations = [
migrations.RemoveConstraint(
model_name="provider",
name="unique_provider_uids",
),
migrations.AddConstraint(
model_name="provider",
constraint=models.UniqueConstraint(
fields=("tenant_id", "provider", "uid", "is_deleted"),
name="unique_provider_uids",
),
),
]

View File

@@ -1,248 +0,0 @@
# Generated by Django 5.1.1 on 2024-12-05 12:29
import uuid
import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0003_update_provider_unique_constraint_with_is_deleted"),
]
operations = [
migrations.CreateModel(
name="Role",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("name", models.CharField(max_length=255)),
("manage_users", models.BooleanField(default=False)),
("manage_account", models.BooleanField(default=False)),
("manage_billing", models.BooleanField(default=False)),
("manage_providers", models.BooleanField(default=False)),
("manage_integrations", models.BooleanField(default=False)),
("manage_scans", models.BooleanField(default=False)),
("unlimited_visibility", models.BooleanField(default=False)),
("inserted_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "roles",
},
),
migrations.CreateModel(
name="RoleProviderGroupRelationship",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "role_provider_group_relationship",
},
),
migrations.CreateModel(
name="UserRoleRelationship",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "role_user_relationship",
},
),
migrations.AddField(
model_name="roleprovidergrouprelationship",
name="provider_group",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.providergroup"
),
),
migrations.AddField(
model_name="roleprovidergrouprelationship",
name="role",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.role"
),
),
migrations.AddField(
model_name="role",
name="provider_groups",
field=models.ManyToManyField(
related_name="roles",
through="api.RoleProviderGroupRelationship",
to="api.providergroup",
),
),
migrations.AddField(
model_name="userrolerelationship",
name="role",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.role"
),
),
migrations.AddField(
model_name="userrolerelationship",
name="user",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL
),
),
migrations.AddField(
model_name="role",
name="users",
field=models.ManyToManyField(
related_name="roles",
through="api.UserRoleRelationship",
to=settings.AUTH_USER_MODEL,
),
),
migrations.AddConstraint(
model_name="roleprovidergrouprelationship",
constraint=models.UniqueConstraint(
fields=("role_id", "provider_group_id"),
name="unique_role_provider_group_relationship",
),
),
migrations.AddConstraint(
model_name="roleprovidergrouprelationship",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_roleprovidergrouprelationship",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
migrations.AddConstraint(
model_name="userrolerelationship",
constraint=models.UniqueConstraint(
fields=("role_id", "user_id"), name="unique_role_user_relationship"
),
),
migrations.AddConstraint(
model_name="userrolerelationship",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_userrolerelationship",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
migrations.AddConstraint(
model_name="role",
constraint=models.UniqueConstraint(
fields=("tenant_id", "name"), name="unique_role_per_tenant"
),
),
migrations.AddConstraint(
model_name="role",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_role",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
migrations.CreateModel(
name="InvitationRoleRelationship",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
(
"invitation",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.invitation"
),
),
(
"role",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.role"
),
),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "role_invitation_relationship",
},
),
migrations.AddConstraint(
model_name="invitationrolerelationship",
constraint=models.UniqueConstraint(
fields=("role_id", "invitation_id"),
name="unique_role_invitation_relationship",
),
),
migrations.AddConstraint(
model_name="invitationrolerelationship",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_invitationrolerelationship",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
migrations.AddField(
model_name="role",
name="invitations",
field=models.ManyToManyField(
related_name="roles",
through="api.InvitationRoleRelationship",
to="api.invitation",
),
),
]

View File

@@ -1,44 +0,0 @@
from django.db import migrations
from api.db_router import MainRouter
def create_admin_role(apps, schema_editor):
Tenant = apps.get_model("api", "Tenant")
Role = apps.get_model("api", "Role")
User = apps.get_model("api", "User")
UserRoleRelationship = apps.get_model("api", "UserRoleRelationship")
for tenant in Tenant.objects.using(MainRouter.admin_db).all():
admin_role, _ = Role.objects.using(MainRouter.admin_db).get_or_create(
name="admin",
tenant=tenant,
defaults={
"manage_users": True,
"manage_account": True,
"manage_billing": True,
"manage_providers": True,
"manage_integrations": True,
"manage_scans": True,
"unlimited_visibility": True,
},
)
users = User.objects.using(MainRouter.admin_db).filter(
membership__tenant=tenant
)
for user in users:
UserRoleRelationship.objects.using(MainRouter.admin_db).get_or_create(
user=user,
role=admin_role,
tenant=tenant,
)
class Migration(migrations.Migration):
dependencies = [
("api", "0004_rbac"),
]
operations = [
migrations.RunPython(create_admin_role),
]

View File

@@ -1,15 +0,0 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0005_rbac_missing_admin_roles"),
]
operations = [
migrations.AddField(
model_name="finding",
name="first_seen_at",
field=models.DateTimeField(editable=False, null=True),
),
]

View File

@@ -1,25 +0,0 @@
# Generated by Django 5.1.5 on 2025-01-28 15:03
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0006_findings_first_seen"),
]
operations = [
migrations.AddIndex(
model_name="scan",
index=models.Index(
fields=["tenant_id", "provider_id", "state", "inserted_at"],
name="scans_prov_state_insert_idx",
),
),
migrations.AddIndex(
model_name="scansummary",
index=models.Index(
fields=["tenant_id", "scan_id"], name="scan_summaries_tenant_scan_idx"
),
),
]


@@ -1,64 +0,0 @@
import json
from datetime import datetime, timedelta, timezone
import django.db.models.deletion
from django.db import migrations, models
from django_celery_beat.models import PeriodicTask
from api.db_utils import rls_transaction
from api.models import Scan, StateChoices
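# Data migration: for every existing "scan-perform-scheduled" beat task, compute the
# next daily run (today if the scheduled time is still ahead, otherwise tomorrow) and
# create a matching SCHEDULED Scan row for that tenant and provider inside an RLS
# transaction, linked back to the periodic task via scheduler_task_id.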
def migrate_daily_scheduled_scan_tasks(apps, schema_editor):
for daily_scheduled_scan_task in PeriodicTask.objects.filter(
task="scan-perform-scheduled"
):
task_kwargs = json.loads(daily_scheduled_scan_task.kwargs)
tenant_id = task_kwargs["tenant_id"]
provider_id = task_kwargs["provider_id"]
current_time = datetime.now(timezone.utc)
scheduled_time_today = datetime.combine(
current_time.date(),
daily_scheduled_scan_task.start_time.time(),
tzinfo=timezone.utc,
)
if current_time < scheduled_time_today:
next_scan_date = scheduled_time_today
else:
next_scan_date = scheduled_time_today + timedelta(days=1)
with rls_transaction(tenant_id):
Scan.objects.create(
tenant_id=tenant_id,
name="Daily scheduled scan",
provider_id=provider_id,
trigger=Scan.TriggerChoices.SCHEDULED,
state=StateChoices.SCHEDULED,
scheduled_at=next_scan_date,
scheduler_task_id=daily_scheduled_scan_task.id,
)
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0007_scan_and_scan_summaries_indexes"),
("django_celery_beat", "0019_alter_periodictasks_options"),
]
operations = [
migrations.AddField(
model_name="scan",
name="scheduler_task",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.CASCADE,
to="django_celery_beat.periodictask",
),
),
migrations.RunPython(migrate_daily_scheduled_scan_tasks),
]


@@ -1,22 +0,0 @@
# Generated by Django 5.1.5 on 2025-02-07 09:42
import django.core.validators
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0008_daily_scheduled_tasks_update"),
]
operations = [
migrations.AlterField(
model_name="provider",
name="uid",
field=models.CharField(
max_length=250,
validators=[django.core.validators.MinLengthValidator(3)],
verbose_name="Unique identifier for the provider, set by the provider",
),
),
]


@@ -1,109 +0,0 @@
from functools import partial
from django.db import connection, migrations
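# The findings table is partitioned, and PostgreSQL does not support building an
# index CONCURRENTLY on the partitioned parent, so these helpers look up the child
# partitions via pg_inherits and create or drop the index on each partition directly.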
def create_index_on_partitions(
apps, schema_editor, parent_table: str, index_name: str, index_details: str
):
with connection.cursor() as cursor:
cursor.execute(
"""
SELECT inhrelid::regclass::text
FROM pg_inherits
WHERE inhparent = %s::regclass;
""",
[parent_table],
)
partitions = [row[0] for row in cursor.fetchall()]
# Iterate over partitions and create index concurrently.
# Note: PostgreSQL does not allow CONCURRENTLY inside a transaction,
# so we need atomic = False for this migration.
for partition in partitions:
sql = (
f"CREATE INDEX CONCURRENTLY IF NOT EXISTS {partition.replace('.', '_')}_{index_name} ON {partition} "
f"{index_details};"
)
schema_editor.execute(sql)
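# As an illustration (the partition name is hypothetical): for a partition called
# findings_default, the first RunPython operation below would execute roughly
#   CREATE INDEX CONCURRENTLY IF NOT EXISTS findings_default_findings_tenant_and_id_idx
#   ON findings_default (tenant_id, id);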
def drop_index_on_partitions(apps, schema_editor, parent_table: str, index_name: str):
with schema_editor.connection.cursor() as cursor:
cursor.execute(
"""
SELECT inhrelid::regclass::text
FROM pg_inherits
WHERE inhparent = %s::regclass;
""",
[parent_table],
)
partitions = [row[0] for row in cursor.fetchall()]
# Iterate over partitions and drop index concurrently.
for partition in partitions:
partition_index = f"{partition.replace('.', '_')}_{index_name}"
sql = f"DROP INDEX CONCURRENTLY IF EXISTS {partition_index};"
schema_editor.execute(sql)
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0009_increase_provider_uid_maximum_length"),
]
operations = [
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="findings_tenant_and_id_idx",
index_details="(tenant_id, id)",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="findings_tenant_and_id_idx",
),
),
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="find_tenant_scan_idx",
index_details="(tenant_id, scan_id)",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="find_tenant_scan_idx",
),
),
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="find_tenant_scan_id_idx",
index_details="(tenant_id, scan_id, id)",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="find_tenant_scan_id_idx",
),
),
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="find_delta_new_idx",
index_details="(tenant_id, id) where delta = 'new'",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="find_delta_new_idx",
),
),
]


@@ -1,49 +0,0 @@
from django.db import migrations, models
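# Adds the finding indexes from 0010 on the partitioned parent table as well, so the
# Django model state matches the database; PostgreSQL attaches a matching existing
# index on each partition instead of rebuilding it. The resource and tag-mapping
# indexes are new, regular indexes on their tables.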
class Migration(migrations.Migration):
dependencies = [
("api", "0010_findings_performance_indexes_partitions"),
]
operations = [
migrations.AddIndex(
model_name="finding",
index=models.Index(
fields=["tenant_id", "id"], name="findings_tenant_and_id_idx"
),
),
migrations.AddIndex(
model_name="finding",
index=models.Index(
fields=["tenant_id", "scan_id"], name="find_tenant_scan_idx"
),
),
migrations.AddIndex(
model_name="finding",
index=models.Index(
fields=["tenant_id", "scan_id", "id"], name="find_tenant_scan_id_idx"
),
),
migrations.AddIndex(
model_name="finding",
index=models.Index(
condition=models.Q(("delta", "new")),
fields=["tenant_id", "id"],
name="find_delta_new_idx",
),
),
migrations.AddIndex(
model_name="resourcetagmapping",
index=models.Index(
fields=["tenant_id", "resource_id"], name="resource_tag_tenant_idx"
),
),
migrations.AddIndex(
model_name="resource",
index=models.Index(
fields=["tenant_id", "service", "region", "type"],
name="resource_tenant_metadata_idx",
),
),
]


@@ -1,15 +0,0 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0011_findings_performance_indexes_parent"),
]
operations = [
migrations.AddField(
model_name="scan",
name="output_location",
field=models.CharField(blank=True, max_length=200, null=True),
),
]


@@ -1,35 +0,0 @@
# Generated by Django 5.1.5 on 2025-03-03 15:46
from functools import partial
from django.db import migrations
from api.db_utils import IntegrationTypeEnum, PostgresEnumMigration, register_enum
from api.models import Integration
IntegrationTypeEnumMigration = PostgresEnumMigration(
enum_name="integration_type",
enum_values=tuple(
integration_type[0]
for integration_type in Integration.IntegrationChoices.choices
),
)
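# Creates the native PostgreSQL enum type "integration_type" with one label per
# Integration.IntegrationChoices value, then registers the corresponding Python enum
# with the database adapter, presumably so the IntegrationTypeEnumField introduced in
# the next migration (0014_integrations) can map to it.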
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0012_scan_report_output"),
]
operations = [
migrations.RunPython(
IntegrationTypeEnumMigration.create_enum_type,
reverse_code=IntegrationTypeEnumMigration.drop_enum_type,
),
migrations.RunPython(
partial(register_enum, enum_class=IntegrationTypeEnum),
reverse_code=migrations.RunPython.noop,
),
]


@@ -1,131 +0,0 @@
# Generated by Django 5.1.5 on 2025-03-03 15:46
import uuid
import django.db.models.deletion
from django.db import migrations, models
import api.db_utils
import api.rls
from api.rls import RowLevelSecurityConstraint
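# Introduces the Integration model (JSON configuration plus an opaque "_credentials"
# BinaryField stored in the "credentials" column) and an IntegrationProviderRelationship
# through table for the many-to-many link to providers, with row-level security
# constraints added on both new tables.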
class Migration(migrations.Migration):
dependencies = [
("api", "0013_integrations_enum"),
]
operations = [
migrations.CreateModel(
name="Integration",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
("enabled", models.BooleanField(default=False)),
("connected", models.BooleanField(blank=True, null=True)),
(
"connection_last_checked_at",
models.DateTimeField(blank=True, null=True),
),
(
"integration_type",
api.db_utils.IntegrationTypeEnumField(
choices=[
("amazon_s3", "Amazon S3"),
("saml", "SAML"),
("aws_security_hub", "AWS Security Hub"),
("jira", "JIRA"),
("slack", "Slack"),
]
),
),
("configuration", models.JSONField(default=dict)),
("_credentials", models.BinaryField(db_column="credentials")),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={"db_table": "integrations", "abstract": False},
),
migrations.AddConstraint(
model_name="integration",
constraint=RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_integration",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
migrations.CreateModel(
name="IntegrationProviderRelationship",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
(
"integration",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="api.integration",
),
),
(
"provider",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.provider"
),
),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "integration_provider_mappings",
"constraints": [
models.UniqueConstraint(
fields=("integration_id", "provider_id"),
name="unique_integration_provider_rel",
),
],
},
),
migrations.AddConstraint(
model_name="IntegrationProviderRelationship",
constraint=RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_integrationproviderrelationship",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
migrations.AddField(
model_name="integration",
name="providers",
field=models.ManyToManyField(
blank=True,
related_name="integrations",
through="api.IntegrationProviderRelationship",
to="api.provider",
),
),
]


@@ -1,26 +0,0 @@
# Generated by Django 5.1.5 on 2025-03-25 11:29
from django.db import migrations, models
import api.db_utils
class Migration(migrations.Migration):
dependencies = [
("api", "0014_integrations"),
]
operations = [
migrations.AddField(
model_name="finding",
name="muted",
field=models.BooleanField(default=False),
),
migrations.AlterField(
model_name="finding",
name="status",
field=api.db_utils.StatusEnumField(
choices=[("FAIL", "Fail"), ("PASS", "Pass"), ("MANUAL", "Manual")]
),
),
]


@@ -1,32 +0,0 @@
# Generated by Django 5.1.5 on 2025-03-31 10:46
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0015_finding_muted"),
]
operations = [
migrations.AddField(
model_name="finding",
name="compliance",
field=models.JSONField(blank=True, default=dict, null=True),
),
migrations.AddField(
model_name="resource",
name="details",
field=models.TextField(blank=True, null=True),
),
migrations.AddField(
model_name="resource",
name="metadata",
field=models.TextField(blank=True, null=True),
),
migrations.AddField(
model_name="resource",
name="partition",
field=models.TextField(blank=True, null=True),
),
]


@@ -1,28 +0,0 @@
# Generated by Django 5.1.7 on 2025-04-16 08:47
from django.db import migrations
import api.db_utils
class Migration(migrations.Migration):
dependencies = [
("api", "0016_finding_compliance_resource_details_and_more"),
]
operations = [
migrations.AlterField(
model_name="provider",
name="provider",
field=api.db_utils.ProviderEnumField(
choices=[
("aws", "AWS"),
("azure", "Azure"),
("gcp", "GCP"),
("kubernetes", "Kubernetes"),
("m365", "M365"),
],
default="aws",
),
),
]

Some files were not shown because too many files have changed in this diff.