Compare commits

...

93 Commits

Author SHA1 Message Date
Pepe Fagoaga 37d9da55d3 Merge branch 'master' of github.com:prowler-cloud/prowler into showdown-demo 2025-06-05 11:07:08 +02:00
Pepe Fagoaga 23389d011a fix(ui): call API with lighthouse-configuration 2025-06-05 10:03:39 +02:00
Pepe Fagoaga 47d5de2b82 chore: docker compose dev for demo 2025-06-05 10:03:11 +02:00
Pepe Fagoaga b827ed9267 fix: api json body type 2025-06-05 07:54:51 +02:00
Pepe Fagoaga 3dff1a9d7a Merge branch 'showdown-demo' of github.com:prowler-cloud/prowler into showdown-demo 2025-06-04 21:40:52 +02:00
César Arroba 370da650ee chore: add missing migrations 2025-06-04 19:43:53 +02:00
Pepe Fagoaga 6694736ff9 fix: lock 2025-06-04 18:04:42 +02:00
Pepe Fagoaga 3b36ac63c5 fix: version 2025-06-04 16:01:33 +02:00
Pepe Fagoaga 65e0b94e21 fix: deps 2025-06-04 14:58:49 +02:00
Pepe Fagoaga 40fda374a8 fix: rename payload type 2025-06-04 14:45:18 +02:00
Pepe Fagoaga 4489d2c7f8 chore: rename api endpoints for lighthouse 2025-06-04 14:25:44 +02:00
Pepe Fagoaga 5e64618189 chore: rename migration 2025-06-04 13:44:31 +02:00
Pepe Fagoaga e0819be942 Merge branch 'master' of github.com:prowler-cloud/prowler into showdown-demo 2025-06-04 13:44:15 +02:00
Chandrapal Badshah baaf3fea13 Add resources file 2025-06-04 13:16:50 +02:00
Pepe Fagoaga 78de7354cf chore(ui): lint 2025-06-04 13:12:20 +02:00
Pepe Fagoaga 4210b0fab0 Merge branch 'master' of github.com:prowler-cloud/prowler into showdown-demo 2025-06-04 13:01:23 +02:00
Pepe Fagoaga d98a28e7f9 chore(ui): lint 2025-06-04 12:46:14 +02:00
Pepe Fagoaga 5ea74fbeb2 chore: ignore log error 2025-06-04 12:38:29 +02:00
Chandrapal Badshah 6259df4f8c Add actions to fetch data from Prowler Hub 2025-06-04 12:17:55 +02:00
Chandrapal Badshah 788b092dff Use CustomTextarea component 2025-06-04 12:17:48 +02:00
Chandrapal Badshah b8538e6ac5 Use CustomButton component 2025-06-04 12:17:39 +02:00
Pepe Fagoaga d89cb3cced fix: comment not found modules 2025-06-04 11:52:14 +02:00
Pepe Fagoaga 14c8afd131 Merge remote-tracking branch 'origin/PRWLR-7300-create-ia-c-provider' into showdown-demo 2025-06-04 11:47:29 +02:00
Pepe Fagoaga 15ad7f1284 Merge remote-tracking branch 'origin/PRWLR-7160-findings-page-scan-id-filter-improvement' into showdown-demo 2025-06-04 11:47:14 +02:00
Pepe Fagoaga dfecc6eed7 Merge remote-tracking branch 'origin/feature/add-resource-inventory' into showdown-demo 2025-06-04 11:43:26 +02:00
Pepe Fagoaga b100043c8b Merge remote-tracking branch 'origin/PRWLR-7297-django-endpoint' into showdown-demo 2025-06-04 11:41:36 +02:00
Pepe Fagoaga a71271f429 Merge remote-tracking branch 'origin/PRWLR-7302-nextjs-analyst' into showdown-demo 2025-06-04 11:39:48 +02:00
Chandrapal Badshah 8b41dceb1c Revert changes to package-lock 2025-06-04 15:03:04 +05:30
Pablo Lara 8a8586f298 chore: scan attributes are always available in compliance-scan-info component 2025-06-04 11:26:41 +02:00
Chandrapal Badshah 15f98d79e0 Integrate Prowler Hub 2025-06-04 13:44:53 +05:30
Chandrapal Badshah 67fe87cfd4 Move layout to lighthouse config page 2025-06-04 13:18:42 +05:30
Pablo Lara e9a22c1513 Merge branch 'master' into PRWLR-7160-findings-page-scan-id-filter-improvement 2025-06-03 15:59:52 +02:00
Pablo Lara 5b22dcb7ac chore: remove unused types 2025-06-03 12:19:13 +02:00
sumit_chaturvedi d8683630a9 srn Lint fix 2025-06-03 12:18:53 +05:30
sumit_chaturvedi d2d68ded2b Merge branch 'master' into PRWLR-7160-findings-page-scan-id-filter-improvement
# Conflicts:
#	ui/CHANGELOG.md
#	ui/app/(prowler)/compliance/page.tsx
#	ui/types/components.ts
2025-06-03 12:17:29 +05:30
sumit_chaturvedi fe19fdf1fa srn Lint fixes 2025-06-03 12:06:22 +05:30
sumit_chaturvedi caef85e771 srn Provider details UI update 2025-06-03 12:03:52 +05:30
sumit_chaturvedi eebb64a25b srn Updated chips to be round 2025-06-03 11:57:23 +05:30
sumit_chaturvedi cf6e6ac03d srn updated check 2025-06-03 11:51:00 +05:30
sumit_chaturvedi f8d485e868 Merge branch 'master' into feature/add-resource-inventory
# Conflicts:
#	ui/types/index.ts
2025-06-03 11:44:58 +05:30
Chandrapal Badshah 430939c02e Update lighthouse serializer 2025-06-03 11:31:30 +05:30
Chandrapal Badshah a9ed8d1fd0 Fix view tests 2025-06-02 21:02:12 +05:30
Chandrapal Badshah 0689326c28 Fix Lighthouse PR review comments 2025-06-02 20:31:10 +05:30
Chandrapal Badshah 9e62a5398f Add Lighthouse chat interface 2025-06-02 13:50:28 +05:30
sumit_chaturvedi 9c19f9bc0c srn build fix 2025-06-02 09:03:31 +05:30
sumit_chaturvedi c17001bb4f srn refactoring 2025-06-02 08:53:29 +05:30
sumit_chaturvedi 1dc8636276 Merge branch 'master' into PRWLR-7160-findings-page-scan-id-filter-improvement
# Conflicts:
#	ui/CHANGELOG.md
#	ui/components/ui/custom/custom-dropdown-filter.tsx
2025-06-02 08:33:33 +05:30
Andoni A. 330903dc27 fix(iac): fix tests 2025-05-29 19:02:51 +02:00
Andoni A. d24f395f44 fix(iac): return checkov stderr as error log 2025-05-29 17:58:34 +02:00
MrCloudSec 469bc13e5f fix: iac test 2025-05-29 14:28:30 +02:00
MrCloudSec 0928f47b35 chore: improve checkov error 2025-05-29 12:49:58 +02:00
Andoni Alonso 7ec9f6572f chore(iac): add iac provider tests (#7874) 2025-05-29 12:39:58 +02:00
MrCloudSec e6ec364bdb Merge branch 'master' into PRWLR-7300-create-ia-c-provider 2025-05-29 12:39:19 +02:00
pedrooot d79ee1c5f6 feat(compliance): update profile validation from CIS 2025-05-29 10:57:59 +02:00
MrCloudSec fdff4f4cde fix: use v1 basemodel of pydantic 2025-05-29 10:50:46 +02:00
MrCloudSec 66e0b6a0b2 fix: m365 test 2025-05-29 10:47:11 +02:00
MrCloudSec cafb7d544e chore: add github action for iac 2025-05-29 10:43:48 +02:00
MrCloudSec 34078df65b fix: nhn tests 2025-05-29 10:42:23 +02:00
MrCloudSec 0a30ecaaaf fix: azure tests 2025-05-29 10:35:13 +02:00
MrCloudSec d6fbd14baa fix: tests k8s 2025-05-29 10:23:08 +02:00
MrCloudSec 00f031e99b fix: gcp tests 2025-05-29 10:21:32 +02:00
MrCloudSec ba225b3882 chore: add preview label 2025-05-28 17:43:29 +02:00
MrCloudSec e12edc7ea3 chore: handle checkov errors 2025-05-28 16:19:43 +02:00
Andoni Alonso 3393908208 feat(IaC): handling output for IaC Provider POC (#7861)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-28 15:54:33 +02:00
sumit_chaturvedi 97d216c49f Merge branch 'master' into PRWLR-7160-findings-page-scan-id-filter-improvement
# Conflicts:
#	ui/CHANGELOG.md
#	ui/app/(prowler)/findings/page.tsx
#	ui/components/ui/custom/custom-dropdown-filter.tsx
#	ui/types/filters.ts
2025-05-28 17:58:39 +05:30
MrCloudSec c0fe90b0c9 chore: add label 2025-05-27 17:53:29 +02:00
MrCloudSec d6566f2da7 fix: poetry lock 2025-05-27 17:51:57 +02:00
MrCloudSec a71ec4711b chore: revision 2025-05-27 17:43:27 +02:00
MrCloudSec 4ed803331a fix: tests 2025-05-27 17:41:51 +02:00
MrCloudSec 01e47d6c8b fix: tests 2025-05-27 17:15:10 +02:00
MrCloudSec a75452df03 fix: models 2025-05-27 16:48:56 +02:00
MrCloudSec 290a54ce21 fix: remove unnecessary file 2025-05-27 16:20:48 +02:00
MrCloudSec daaa26bb20 feat(IaC): POC for IaC Provider 2025-05-27 16:17:43 +02:00
Chandrapal Badshah f4febf9e08 Fix regex check in creation of config 2025-05-27 16:28:57 +05:30
Chandrapal Badshah 57acb43251 Add Lighthouse django endpoints 2025-05-27 16:28:57 +05:30
sumit_chaturvedi f87eaeac6b docs: changelog update 2025-05-23 13:39:29 +05:30
sumit_chaturvedi ff995e7750 fix(ui): Improved the findings page scan Id filter 2025-05-23 13:17:26 +05:30
Pablo Lara 50f5efb462 feat: add feedback-banner component 2025-05-22 10:41:01 +02:00
Pablo Lara a75788921c chore: tweak styles 2025-05-22 09:45:56 +02:00
Pablo Lara 6bbf651895 chore: tweak styles 2025-05-22 09:24:23 +02:00
sumit_chaturvedi 13fae4a300 Merge branch 'master' into feature/add-resource-inventory 2025-05-21 21:56:57 +05:30
sumit_chaturvedi c1380e397c srn Helper method correction 2025-05-21 21:56:32 +05:30
sumit_chaturvedi 0206674fc9 srn Updated failed findings count 2025-05-21 21:39:48 +05:30
sumit_chaturvedi b9c5fa71a7 srn lint fix 2025-05-21 20:51:53 +05:30
sumit_chaturvedi 8afd3f9b2b srn Fixed PR comments 2025-05-21 20:46:56 +05:30
sumit_chaturvedi 38a72da08b Merge branch 'master' into feature/add-resource-inventory 2025-05-21 08:52:59 +05:30
sumit_chaturvedi 7545bba889 Merge branch 'master' into feature/add-resource-inventory 2025-05-16 22:59:31 +05:30
Pablo Lara 72137590fe refactor: remove getScansByFields and use getScans with flexible filters 2025-05-16 13:36:16 +02:00
Pablo Lara 9365eaf3da fix: fix import for ProviderType 2025-05-16 11:58:32 +02:00
Pablo Lara ac68119705 Merge branch 'master' into feature/add-resource-inventory 2025-05-16 11:53:47 +02:00
sumit_chaturvedi 1a7ae2c7bb Merge branch 'upstream/master' into feature/add-resource-inventory 2025-05-06 09:44:33 +05:30
sumit_chaturvedi 9639082e20 srn Fix lint 2025-04-28 10:25:04 +05:30
sumit_chaturvedi a2f9f54edd feat(ui): Add resources view as inventory 2025-04-28 08:53:00 +05:30
246 changed files with 9287 additions and 960 deletions
+6 -1
@@ -127,7 +127,7 @@ SENTRY_ENVIRONMENT=local
SENTRY_RELEASE=local
#### Prowler release version ####
-NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.6.0
+NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.8.0 - DEMO
# Social login credentials
SOCIAL_GOOGLE_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/google"
@@ -137,3 +137,8 @@ SOCIAL_GOOGLE_OAUTH_CLIENT_SECRET=""
SOCIAL_GITHUB_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/github"
SOCIAL_GITHUB_OAUTH_CLIENT_ID=""
SOCIAL_GITHUB_OAUTH_CLIENT_SECRET=""
LANGSMITH_TRACING=false
LANGSMITH_ENDPOINT="https://api.smith.langchain.com"
LANGSMITH_API_KEY=""
LANGCHAIN_PROJECT=""
+5
@@ -27,6 +27,11 @@ provider/github:
- any-glob-to-any-file: "prowler/providers/github/**"
- any-glob-to-any-file: "tests/providers/github/**"
provider/iac:
- changed-files:
- any-glob-to-any-file: "prowler/providers/iac/**"
- any-glob-to-any-file: "tests/providers/iac/**"
github_actions:
- changed-files:
- any-glob-to-any-file: ".github/workflows/*"
+15
@@ -212,6 +212,21 @@ jobs:
run: |
poetry run pytest -n auto --cov=./prowler/providers/m365 --cov-report=xml:m365_coverage.xml tests/providers/m365
# Test IaC
- name: IaC - Check if any file has changed
id: iac-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/iac/**
./tests/providers/iac/**
./poetry.lock
- name: IaC - Test
if: steps.iac-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/iac --cov-report=xml:iac_coverage.xml tests/providers/iac
# Common Tests
- name: Lib - Test
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
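The workflow step above only runs the IaC tests when `tj-actions/changed-files` reports `any_changed == 'true'` for the listed globs. As a rough, illustrative stand-in for that gate (the action itself does considerably more, and the `./` prefixes are dropped here), the matching logic can be sketched in Python:

```python
import fnmatch

# Patterns from the workflow step above; "**" is flattened to "*" because
# fnmatch has no globstar semantics and its "*" already crosses "/" boundaries.
PATTERNS = [
    "prowler/providers/iac/**",
    "tests/providers/iac/**",
    "poetry.lock",
]

def any_changed(changed_files, patterns=PATTERNS):
    """Return True if any changed file matches one of the glob patterns."""
    flat = [p.replace("**", "*") for p in patterns]
    return any(
        fnmatch.fnmatch(path, pat) for path in changed_files for pat in flat
    )

print(any_changed(["prowler/providers/iac/iac_provider.py"]))  # True
print(any_changed(["docs/index.md"]))                          # False
```

Gating the pytest invocation on this check keeps unrelated PRs from paying the cost of the IaC provider test suite.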
+1 -1
@@ -115,7 +115,7 @@ repos:
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
-entry: bash -c 'safety check --ignore 70612,66963,74429'
+entry: bash -c 'safety check --ignore 70612,66963,74429,76352,76353'
language: system
- id: vulture
+133 -2
@@ -1448,6 +1448,18 @@ files = [
graph = ["objgraph (>=1.7.2)"]
profile = ["gprof2dot (>=2022.7.29)"]
[[package]]
name = "distro"
version = "1.9.0"
description = "Distro - an OS platform information API"
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "distro-1.9.0-py3-none-any.whl", hash = "sha256:7bffd925d65168f85027d8da9af6bddab658135b840670a223589bc0c8ef02b2"},
{file = "distro-1.9.0.tar.gz", hash = "sha256:2fa77c6fd8940f116ee1d6b94a2f90b13b5ea8d019b98bc8bafdcabcdd9bdbed"},
]
[[package]]
name = "dj-rest-auth"
version = "7.0.1"
@@ -2470,6 +2482,93 @@ MarkupSafe = ">=2.0"
[package.extras]
i18n = ["Babel (>=2.7)"]
[[package]]
name = "jiter"
version = "0.10.0"
description = "Fast iterable JSON parser."
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "jiter-0.10.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:cd2fb72b02478f06a900a5782de2ef47e0396b3e1f7d5aba30daeb1fce66f303"},
{file = "jiter-0.10.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:32bb468e3af278f095d3fa5b90314728a6916d89ba3d0ffb726dd9bf7367285e"},
{file = "jiter-0.10.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa8b3e0068c26ddedc7abc6fac37da2d0af16b921e288a5a613f4b86f050354f"},
{file = "jiter-0.10.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:286299b74cc49e25cd42eea19b72aa82c515d2f2ee12d11392c56d8701f52224"},
{file = "jiter-0.10.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6ed5649ceeaeffc28d87fb012d25a4cd356dcd53eff5acff1f0466b831dda2a7"},
{file = "jiter-0.10.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2ab0051160cb758a70716448908ef14ad476c3774bd03ddce075f3c1f90a3d6"},
{file = "jiter-0.10.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:03997d2f37f6b67d2f5c475da4412be584e1cec273c1cfc03d642c46db43f8cf"},
{file = "jiter-0.10.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c404a99352d839fed80d6afd6c1d66071f3bacaaa5c4268983fc10f769112e90"},
{file = "jiter-0.10.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:66e989410b6666d3ddb27a74c7e50d0829704ede652fd4c858e91f8d64b403d0"},
{file = "jiter-0.10.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:b532d3af9ef4f6374609a3bcb5e05a1951d3bf6190dc6b176fdb277c9bbf15ee"},
{file = "jiter-0.10.0-cp310-cp310-win32.whl", hash = "sha256:da9be20b333970e28b72edc4dff63d4fec3398e05770fb3205f7fb460eb48dd4"},
{file = "jiter-0.10.0-cp310-cp310-win_amd64.whl", hash = "sha256:f59e533afed0c5b0ac3eba20d2548c4a550336d8282ee69eb07b37ea526ee4e5"},
{file = "jiter-0.10.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:3bebe0c558e19902c96e99217e0b8e8b17d570906e72ed8a87170bc290b1e978"},
{file = "jiter-0.10.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:558cc7e44fd8e507a236bee6a02fa17199ba752874400a0ca6cd6e2196cdb7dc"},
{file = "jiter-0.10.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4d613e4b379a07d7c8453c5712ce7014e86c6ac93d990a0b8e7377e18505e98d"},
{file = "jiter-0.10.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f62cf8ba0618eda841b9bf61797f21c5ebd15a7a1e19daab76e4e4b498d515b2"},
{file = "jiter-0.10.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:919d139cdfa8ae8945112398511cb7fca58a77382617d279556b344867a37e61"},
{file = "jiter-0.10.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:13ddbc6ae311175a3b03bd8994881bc4635c923754932918e18da841632349db"},
{file = "jiter-0.10.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4c440ea003ad10927a30521a9062ce10b5479592e8a70da27f21eeb457b4a9c5"},
{file = "jiter-0.10.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:dc347c87944983481e138dea467c0551080c86b9d21de6ea9306efb12ca8f606"},
{file = "jiter-0.10.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:13252b58c1f4d8c5b63ab103c03d909e8e1e7842d302473f482915d95fefd605"},
{file = "jiter-0.10.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:7d1bbf3c465de4a24ab12fb7766a0003f6f9bce48b8b6a886158c4d569452dc5"},
{file = "jiter-0.10.0-cp311-cp311-win32.whl", hash = "sha256:db16e4848b7e826edca4ccdd5b145939758dadf0dc06e7007ad0e9cfb5928ae7"},
{file = "jiter-0.10.0-cp311-cp311-win_amd64.whl", hash = "sha256:9c9c1d5f10e18909e993f9641f12fe1c77b3e9b533ee94ffa970acc14ded3812"},
{file = "jiter-0.10.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:1e274728e4a5345a6dde2d343c8da018b9d4bd4350f5a472fa91f66fda44911b"},
{file = "jiter-0.10.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7202ae396446c988cb2a5feb33a543ab2165b786ac97f53b59aafb803fef0744"},
{file = "jiter-0.10.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23ba7722d6748b6920ed02a8f1726fb4b33e0fd2f3f621816a8b486c66410ab2"},
{file = "jiter-0.10.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:371eab43c0a288537d30e1f0b193bc4eca90439fc08a022dd83e5e07500ed026"},
{file = "jiter-0.10.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6c675736059020365cebc845a820214765162728b51ab1e03a1b7b3abb70f74c"},
{file = "jiter-0.10.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0c5867d40ab716e4684858e4887489685968a47e3ba222e44cde6e4a2154f959"},
{file = "jiter-0.10.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:395bb9a26111b60141757d874d27fdea01b17e8fac958b91c20128ba8f4acc8a"},
{file = "jiter-0.10.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6842184aed5cdb07e0c7e20e5bdcfafe33515ee1741a6835353bb45fe5d1bd95"},
{file = "jiter-0.10.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:62755d1bcea9876770d4df713d82606c8c1a3dca88ff39046b85a048566d56ea"},
{file = "jiter-0.10.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:533efbce2cacec78d5ba73a41756beff8431dfa1694b6346ce7af3a12c42202b"},
{file = "jiter-0.10.0-cp312-cp312-win32.whl", hash = "sha256:8be921f0cadd245e981b964dfbcd6fd4bc4e254cdc069490416dd7a2632ecc01"},
{file = "jiter-0.10.0-cp312-cp312-win_amd64.whl", hash = "sha256:a7c7d785ae9dda68c2678532a5a1581347e9c15362ae9f6e68f3fdbfb64f2e49"},
{file = "jiter-0.10.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:e0588107ec8e11b6f5ef0e0d656fb2803ac6cf94a96b2b9fc675c0e3ab5e8644"},
{file = "jiter-0.10.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cafc4628b616dc32530c20ee53d71589816cf385dd9449633e910d596b1f5c8a"},
{file = "jiter-0.10.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:520ef6d981172693786a49ff5b09eda72a42e539f14788124a07530f785c3ad6"},
{file = "jiter-0.10.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:554dedfd05937f8fc45d17ebdf298fe7e0c77458232bcb73d9fbbf4c6455f5b3"},
{file = "jiter-0.10.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5bc299da7789deacf95f64052d97f75c16d4fc8c4c214a22bf8d859a4288a1c2"},
{file = "jiter-0.10.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5161e201172de298a8a1baad95eb85db4fb90e902353b1f6a41d64ea64644e25"},
{file = "jiter-0.10.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2e2227db6ba93cb3e2bf67c87e594adde0609f146344e8207e8730364db27041"},
{file = "jiter-0.10.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:15acb267ea5e2c64515574b06a8bf393fbfee6a50eb1673614aa45f4613c0cca"},
{file = "jiter-0.10.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:901b92f2e2947dc6dfcb52fd624453862e16665ea909a08398dde19c0731b7f4"},
{file = "jiter-0.10.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:d0cb9a125d5a3ec971a094a845eadde2db0de85b33c9f13eb94a0c63d463879e"},
{file = "jiter-0.10.0-cp313-cp313-win32.whl", hash = "sha256:48a403277ad1ee208fb930bdf91745e4d2d6e47253eedc96e2559d1e6527006d"},
{file = "jiter-0.10.0-cp313-cp313-win_amd64.whl", hash = "sha256:75f9eb72ecb640619c29bf714e78c9c46c9c4eaafd644bf78577ede459f330d4"},
{file = "jiter-0.10.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:28ed2a4c05a1f32ef0e1d24c2611330219fed727dae01789f4a335617634b1ca"},
{file = "jiter-0.10.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:14a4c418b1ec86a195f1ca69da8b23e8926c752b685af665ce30777233dfe070"},
{file = "jiter-0.10.0-cp313-cp313t-win_amd64.whl", hash = "sha256:d7bfed2fe1fe0e4dda6ef682cee888ba444b21e7a6553e03252e4feb6cf0adca"},
{file = "jiter-0.10.0-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:5e9251a5e83fab8d87799d3e1a46cb4b7f2919b895c6f4483629ed2446f66522"},
{file = "jiter-0.10.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:023aa0204126fe5b87ccbcd75c8a0d0261b9abdbbf46d55e7ae9f8e22424eeb8"},
{file = "jiter-0.10.0-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3c189c4f1779c05f75fc17c0c1267594ed918996a231593a21a5ca5438445216"},
{file = "jiter-0.10.0-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:15720084d90d1098ca0229352607cd68256c76991f6b374af96f36920eae13c4"},
{file = "jiter-0.10.0-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e4f2fb68e5f1cfee30e2b2a09549a00683e0fde4c6a2ab88c94072fc33cb7426"},
{file = "jiter-0.10.0-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ce541693355fc6da424c08b7edf39a2895f58d6ea17d92cc2b168d20907dee12"},
{file = "jiter-0.10.0-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:31c50c40272e189d50006ad5c73883caabb73d4e9748a688b216e85a9a9ca3b9"},
{file = "jiter-0.10.0-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fa3402a2ff9815960e0372a47b75c76979d74402448509ccd49a275fa983ef8a"},
{file = "jiter-0.10.0-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:1956f934dca32d7bb647ea21d06d93ca40868b505c228556d3373cbd255ce853"},
{file = "jiter-0.10.0-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:fcedb049bdfc555e261d6f65a6abe1d5ad68825b7202ccb9692636c70fcced86"},
{file = "jiter-0.10.0-cp314-cp314-win32.whl", hash = "sha256:ac509f7eccca54b2a29daeb516fb95b6f0bd0d0d8084efaf8ed5dfc7b9f0b357"},
{file = "jiter-0.10.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:5ed975b83a2b8639356151cef5c0d597c68376fc4922b45d0eb384ac058cfa00"},
{file = "jiter-0.10.0-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3aa96f2abba33dc77f79b4cf791840230375f9534e5fac927ccceb58c5e604a5"},
{file = "jiter-0.10.0-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:bd6292a43c0fc09ce7c154ec0fa646a536b877d1e8f2f96c19707f65355b5a4d"},
{file = "jiter-0.10.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:39de429dcaeb6808d75ffe9effefe96a4903c6a4b376b2f6d08d77c1aaee2f18"},
{file = "jiter-0.10.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:52ce124f13a7a616fad3bb723f2bfb537d78239d1f7f219566dc52b6f2a9e48d"},
{file = "jiter-0.10.0-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:166f3606f11920f9a1746b2eea84fa2c0a5d50fd313c38bdea4edc072000b0af"},
{file = "jiter-0.10.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:28dcecbb4ba402916034fc14eba7709f250c4d24b0c43fc94d187ee0580af181"},
{file = "jiter-0.10.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:86c5aa6910f9bebcc7bc4f8bc461aff68504388b43bfe5e5c0bd21efa33b52f4"},
{file = "jiter-0.10.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ceeb52d242b315d7f1f74b441b6a167f78cea801ad7c11c36da77ff2d42e8a28"},
{file = "jiter-0.10.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ff76d8887c8c8ee1e772274fcf8cc1071c2c58590d13e33bd12d02dc9a560397"},
{file = "jiter-0.10.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:a9be4d0fa2b79f7222a88aa488bd89e2ae0a0a5b189462a12def6ece2faa45f1"},
{file = "jiter-0.10.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:9ab7fd8738094139b6c1ab1822d6f2000ebe41515c537235fd45dabe13ec9324"},
{file = "jiter-0.10.0-cp39-cp39-win32.whl", hash = "sha256:5f51e048540dd27f204ff4a87f5d79294ea0aa3aa552aca34934588cf27023cf"},
{file = "jiter-0.10.0-cp39-cp39-win_amd64.whl", hash = "sha256:1b28302349dc65703a9e4ead16f163b1c339efffbe1049c30a44b001a2a4fff9"},
{file = "jiter-0.10.0.tar.gz", hash = "sha256:07a7142c38aacc85194391108dc91b5b57093c978a9932bd86a36862759d9500"},
]
[[package]]
name = "jmespath"
version = "1.0.1"
@@ -3221,6 +3320,33 @@ rsa = ["cryptography (>=3.0.0)"]
signals = ["blinker (>=1.4.0)"]
signedtoken = ["cryptography (>=3.0.0)", "pyjwt (>=2.0.0,<3)"]
[[package]]
name = "openai"
version = "1.82.0"
description = "The official Python library for the openai API"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "openai-1.82.0-py3-none-any.whl", hash = "sha256:8c40647fea1816516cb3de5189775b30b5f4812777e40b8768f361f232b61b30"},
{file = "openai-1.82.0.tar.gz", hash = "sha256:b0a009b9a58662d598d07e91e4219ab4b1e3d8ba2db3f173896a92b9b874d1a7"},
]
[package.dependencies]
anyio = ">=3.5.0,<5"
distro = ">=1.7.0,<2"
httpx = ">=0.23.0,<1"
jiter = ">=0.4.0,<1"
pydantic = ">=1.9.0,<3"
sniffio = "*"
tqdm = ">4"
typing-extensions = ">=4.11,<5"
[package.extras]
datalib = ["numpy (>=1)", "pandas (>=1.2.3)", "pandas-stubs (>=1.1.0.11)"]
realtime = ["websockets (>=13,<16)"]
voice-helpers = ["numpy (>=2.0.2)", "sounddevice (>=0.5.1)"]
[[package]]
name = "opentelemetry-api"
version = "1.32.1"
@@ -4637,6 +4763,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f66efbc1caa63c088dead1c4170d148eabc9b80d95fb75b6c92ac0aad2437d76"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:22353049ba4181685023b25b5b51a574bce33e7f51c759371a7422dcae5402a6"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:932205970b9f9991b34f55136be327501903f7c66830e9760a8ffb15b07f05cd"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:a52d48f4e7bf9005e8f0a89209bf9a73f7190ddf0489eee5eb51377385f59f2a"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-win32.whl", hash = "sha256:3eac5a91891ceb88138c113f9db04f3cebdae277f5d44eaa3651a4f573e6a5da"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-win_amd64.whl", hash = "sha256:ab007f2f5a87bd08ab1499bdf96f3d5c6ad4dcfa364884cb4549aa0154b13a28"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-macosx_13_0_arm64.whl", hash = "sha256:4a6679521a58256a90b0d89e03992c15144c5f3858f40d7c18886023d7943db6"},
@@ -4645,6 +4772,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:811ea1594b8a0fb466172c384267a4e5e367298af6b228931f273b111f17ef52"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:cf12567a7b565cbf65d438dec6cfbe2917d3c1bdddfce84a9930b7d35ea59642"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:7dd5adc8b930b12c8fc5b99e2d535a09889941aa0d0bd06f4749e9a9397c71d2"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1492a6051dab8d912fc2adeef0e8c72216b24d57bd896ea607cb90bb0c4981d3"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-win32.whl", hash = "sha256:bd0a08f0bab19093c54e18a14a10b4322e1eacc5217056f3c063bd2f59853ce4"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-win_amd64.whl", hash = "sha256:a274fb2cb086c7a3dea4322ec27f4cb5cc4b6298adb583ab0e211a4682f241eb"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:20b0f8dc160ba83b6dcc0e256846e1a02d044e13f7ea74a3d1d56ede4e48c632"},
@@ -4653,6 +4781,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:749c16fcc4a2b09f28843cda5a193e0283e47454b63ec4b81eaa2242f50e4ccd"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:bf165fef1f223beae7333275156ab2022cffe255dcc51c27f066b4370da81e31"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:32621c177bbf782ca5a18ba4d7af0f1082a3f6e517ac2a18b3974d4edf349680"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:b82a7c94a498853aa0b272fd5bc67f29008da798d4f93a2f9f289feb8426a58d"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-win32.whl", hash = "sha256:e8c4ebfcfd57177b572e2040777b8abc537cdef58a2120e830124946aa9b42c5"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-win_amd64.whl", hash = "sha256:0467c5965282c62203273b838ae77c0d29d7638c8a4e3a1c8bdd3602c10904e4"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:4c8c5d82f50bb53986a5e02d1b3092b03622c02c2eb78e29bec33fd9593bae1a"},
@@ -4661,6 +4790,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:96777d473c05ee3e5e3c3e999f5d23c6f4ec5b0c38c098b3a5229085f74236c6"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_1_i686.whl", hash = "sha256:3bc2a80e6420ca8b7d3590791e2dfc709c88ab9152c00eeb511c9875ce5778bf"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:e188d2699864c11c36cdfdada94d781fd5d6b0071cd9c427bceb08ad3d7c70e1"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4f6f3eac23941b32afccc23081e1f50612bdbe4e982012ef4f5797986828cd01"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-win32.whl", hash = "sha256:6442cb36270b3afb1b4951f060eccca1ce49f3d087ca1ca4563a6eb479cb3de6"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-win_amd64.whl", hash = "sha256:e5b8daf27af0b90da7bb903a876477a9e6d7270be6146906b276605997c7e9a3"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:fc4b630cd3fa2cf7fce38afa91d7cfe844a9f75d7f0f36393fa98815e911d987"},
@@ -4669,6 +4799,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e2f1c3765db32be59d18ab3953f43ab62a761327aafc1594a2a1fbe038b8b8a7"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:d85252669dc32f98ebcd5d36768f5d4faeaeaa2d655ac0473be490ecdae3c285"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:e143ada795c341b56de9418c58d028989093ee611aa27ffb9b7f609c00d813ed"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:2c59aa6170b990d8d2719323e628aaf36f3bfbc1c26279c0eeeb24d05d2d11c7"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-win32.whl", hash = "sha256:beffaed67936fbbeffd10966a4eb53c402fafd3d6833770516bf7314bc6ffa12"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-win_amd64.whl", hash = "sha256:040ae85536960525ea62868b642bdb0c2cc6021c9f9d507810c0c604e66f5a7b"},
{file = "ruamel.yaml.clib-0.2.12.tar.gz", hash = "sha256:6c8fbb13ec503f99a91901ab46e0b07ae7941cd527393187039aec586fdfd36f"},
@@ -5050,7 +5181,7 @@ version = "4.67.1"
description = "Fast, Extensible Progress Meter"
optional = false
python-versions = ">=3.7"
-groups = ["dev"]
+groups = ["main", "dev"]
files = [
{file = "tqdm-4.67.1-py3-none-any.whl", hash = "sha256:26445eca388f82e72884e0d580d5464cd801a3ea01e63e5601bdff9ba6a48de2"},
{file = "tqdm-4.67.1.tar.gz", hash = "sha256:f8aef9c52c08c13a65f30ea34f4e5aac3fd1a34959879d7e59e63027286627f2"},
@@ -5483,4 +5614,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.1"
python-versions = ">=3.11,<3.13"
-content-hash = "051924735a7069c8393fefc18fc2c310b196ea24ad41b8c984dc5852683d0407"
+content-hash = "bd227969f24255e5b641c79c727abe98a4aab73b2689b3e6f8cfa6fae8fa5bed"
+2 -1
@@ -27,7 +27,8 @@ dependencies = [
"psycopg2-binary==2.9.9",
"pytest-celery[redis] (>=1.0.1,<2.0.0)",
"sentry-sdk[django] (>=2.20.0,<3.0.0)",
-"uuid6==2024.7.10"
+"uuid6==2024.7.10",
+"openai (>=1.82.0,<2.0.0)"
]
description = "Prowler's API (Django/DRF)"
license = "Apache-2.0"
@@ -0,0 +1,82 @@
# Generated by Django 5.1.7 on 2025-04-10 14:54
import uuid
import django.core.validators
import django.db.models.deletion
from django.db import migrations, models
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0029_findings_check_index_parent"),
]
operations = [
migrations.CreateModel(
name="LighthouseConfig",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
(
"name",
models.CharField(
max_length=100,
validators=[django.core.validators.MinLengthValidator(3)],
),
),
("api_key", models.BinaryField()),
("model", models.CharField(default="gpt-4o", max_length=50)),
("temperature", models.FloatField(default=0.7)),
("max_tokens", models.IntegerField(default=4000)),
(
"business_context",
models.TextField(
blank=True,
help_text="Additional business context for this AI model configuration",
null=True,
),
),
("is_active", models.BooleanField(default=True)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "lighthouse_configurations",
"abstract": False,
"indexes": [
models.Index(fields=["name"], name="lighthouse_config_name_idx"),
models.Index(
fields=["is_active"], name="lighthouse_config_active_idx"
),
],
"constraints": [
models.UniqueConstraint(
fields=("tenant_id",),
name="unique_lighthouse_config_per_tenant",
),
],
},
),
migrations.AddConstraint(
model_name="lighthouseconfig",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_lighthouseconfig",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
]
+131 -1
@@ -1,8 +1,10 @@
import json
import logging
import re
from uuid import UUID, uuid4
from cryptography.fernet import Fernet
from config.custom_logging import BackendLogger
from cryptography.fernet import Fernet, InvalidToken
from django.conf import settings
from django.contrib.auth.models import AbstractBaseUser
from django.contrib.postgres.fields import ArrayField
@@ -49,6 +51,8 @@ fernet = Fernet(settings.SECRETS_ENCRYPTION_KEY.encode())
# Convert Prowler Severity enum to Django TextChoices
SeverityChoices = enum_to_choices(Severity)
logger = logging.getLogger(BackendLogger.API)
class StatusChoices(models.TextChoices):
"""
@@ -1408,3 +1412,129 @@ class ResourceScanSummary(RowLevelSecurityProtectedModel):
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
class LighthouseConfiguration(RowLevelSecurityProtectedModel):
"""
Stores configuration and API keys for LLM services.
"""
class ModelChoices(models.TextChoices):
GPT_4O_2024_11_20 = "gpt-4o-2024-11-20", _("GPT-4o v2024-11-20")
GPT_4O_2024_08_06 = "gpt-4o-2024-08-06", _("GPT-4o v2024-08-06")
GPT_4O_2024_05_13 = "gpt-4o-2024-05-13", _("GPT-4o v2024-05-13")
GPT_4O = "gpt-4o", _("GPT-4o Default")
GPT_4O_MINI_2024_07_18 = "gpt-4o-mini-2024-07-18", _("GPT-4o Mini v2024-07-18")
GPT_4O_MINI = "gpt-4o-mini", _("GPT-4o Mini Default")
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
updated_at = models.DateTimeField(auto_now=True, editable=False)
name = models.CharField(
max_length=100,
validators=[MinLengthValidator(3)],
blank=False,
null=False,
help_text="Name of the configuration",
)
api_key = models.BinaryField(
blank=False, null=False, help_text="Encrypted API key for the LLM service"
)
model = models.CharField(
max_length=50,
choices=ModelChoices.choices,
blank=False,
null=False,
help_text="Must be one of the supported model names",
)
temperature = models.FloatField(default=0, help_text="Must be between 0 and 1")
max_tokens = models.IntegerField(
default=4000, help_text="Must be between 500 and 5000"
)
business_context = models.TextField(
blank=True,
null=False,
default="",
help_text="Additional business context for this AI model configuration",
)
is_active = models.BooleanField(default=True)
def __str__(self):
return self.name
def clean(self):
super().clean()
# Validate temperature
if not 0 <= self.temperature <= 1:
raise ModelValidationError(
detail="Temperature must be between 0 and 1",
code="invalid_temperature",
pointer="/data/attributes/temperature",
)
# Validate max_tokens
if not 500 <= self.max_tokens <= 5000:
raise ModelValidationError(
detail="Max tokens must be between 500 and 5000",
code="invalid_max_tokens",
pointer="/data/attributes/max_tokens",
)
@property
def api_key_decoded(self):
"""Return the decrypted API key, or None if unavailable or invalid."""
if not self.api_key:
return None
try:
decrypted_key = fernet.decrypt(bytes(self.api_key))
return decrypted_key.decode()
except InvalidToken:
logger.warning("Invalid token while decrypting API key.")
except Exception as e:
logger.exception("Unexpected error while decrypting API key: %s", e)
@api_key_decoded.setter
def api_key_decoded(self, value):
"""Store the encrypted API key."""
if not value:
raise ModelValidationError(
detail="API key is required",
code="invalid_api_key",
pointer="/data/attributes/api_key",
)
# Validate OpenAI API key format
openai_key_pattern = r"^sk-[\w-]+T3BlbkFJ[\w-]+$"
if not re.match(openai_key_pattern, value):
raise ModelValidationError(
detail="Invalid OpenAI API key format.",
code="invalid_api_key",
pointer="/data/attributes/api_key",
)
self.api_key = fernet.encrypt(value.encode())
def save(self, *args, **kwargs):
self.full_clean()
super().save(*args, **kwargs)
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "lighthouse_configurations"
constraints = [
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
# Enforce a single Lighthouse configuration per tenant
models.UniqueConstraint(
fields=["tenant_id"], name="unique_lighthouse_config_per_tenant"
),
]
class JSONAPIMeta:
resource_name = "lighthouse-configuration"
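The key-format guard in the `api_key_decoded` setter can be exercised in isolation. A minimal sketch, with the pattern copied from the model above (the sample keys are fabricated and the helper name is ours):

```python
import re

# Pattern used by LighthouseConfiguration.api_key_decoded to reject
# values that do not look like an OpenAI secret key.
OPENAI_KEY_PATTERN = r"^sk-[\w-]+T3BlbkFJ[\w-]+$"


def is_valid_openai_key(value: str) -> bool:
    """Return True when the value matches the expected OpenAI secret-key shape."""
    return bool(re.match(OPENAI_KEY_PATTERN, value))


print(is_valid_openai_key("sk-test1234567890T3BlbkFJtest1234567890"))  # True
print(is_valid_openai_key("invalid-key"))  # False
```

Note the pattern requires at least one word character or hyphen on each side of the `T3BlbkFJ` marker, so bare `sk-` prefixes are rejected as well.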
+334
@@ -5525,3 +5525,337 @@ class TestIntegrationViewSet:
{f"filter[{filter_name}]": "whatever"},
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
@pytest.mark.django_db
class TestLighthouseConfigViewSet:
@pytest.fixture
def valid_config_payload(self):
return {
"data": {
"type": "lighthouse-configuration",
"attributes": {
"name": "OpenAI",
"api_key": "sk-test1234567890T3BlbkFJtest1234567890",
"model": "gpt-4o",
"temperature": 0.7,
"max_tokens": 4000,
"business_context": "Test business context",
"is_active": True,
},
}
}
@pytest.fixture
def invalid_config_payload(self):
return {
"data": {
"type": "lighthouse-configuration",
"attributes": {
"name": "T", # Too short
"api_key": "invalid-key", # Invalid format
"model": "invalid-model",
"temperature": 2.0, # Invalid range
"max_tokens": -1, # Invalid value
},
}
}
def test_lighthouse_config_list(self, authenticated_client):
response = authenticated_client.get(reverse("lighthouseconfiguration-list"))
assert response.status_code == status.HTTP_200_OK
assert response.json()["data"] == []
def test_lighthouse_config_create(self, authenticated_client, valid_config_payload):
response = authenticated_client.post(
reverse("lighthouseconfiguration-list"),
data=valid_config_payload,
content_type=API_JSON_CONTENT_TYPE,
)
assert response.status_code == status.HTTP_201_CREATED
data = response.json()["data"]
assert (
data["attributes"]["name"]
== valid_config_payload["data"]["attributes"]["name"]
)
assert (
data["attributes"]["model"]
== valid_config_payload["data"]["attributes"]["model"]
)
assert (
data["attributes"]["temperature"]
== valid_config_payload["data"]["attributes"]["temperature"]
)
assert (
data["attributes"]["max_tokens"]
== valid_config_payload["data"]["attributes"]["max_tokens"]
)
assert (
data["attributes"]["business_context"]
== valid_config_payload["data"]["attributes"]["business_context"]
)
assert (
data["attributes"]["is_active"]
== valid_config_payload["data"]["attributes"]["is_active"]
)
# Check that API key is masked with asterisks only
masked_api_key = data["attributes"]["api_key"]
assert all(
c == "*" for c in masked_api_key
), "API key should contain only asterisks"
def test_lighthouse_config_create_invalid_name_too_short(
self, authenticated_client, valid_config_payload
):
"""Test that name validation fails when too short"""
payload = valid_config_payload.copy()
payload["data"]["attributes"]["name"] = "T" # Too short
response = authenticated_client.post(
reverse("lighthouseconfiguration-list"),
data=payload,
content_type=API_JSON_CONTENT_TYPE,
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
errors = response.json()["errors"]
assert any("name" in error["source"]["pointer"] for error in errors)
def test_lighthouse_config_create_invalid_api_key_format(
self, authenticated_client, valid_config_payload
):
"""Test that API key validation fails with invalid format"""
payload = valid_config_payload.copy()
payload["data"]["attributes"]["api_key"] = "invalid-key"
response = authenticated_client.post(
reverse("lighthouseconfiguration-list"),
data=payload,
content_type=API_JSON_CONTENT_TYPE,
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
errors = response.json()["errors"]
assert any(
"Invalid OpenAI API key format." in error["detail"] for error in errors
)
def test_lighthouse_config_create_invalid_model(
self, authenticated_client, valid_config_payload
):
"""Test that model validation fails with invalid model name"""
payload = valid_config_payload.copy()
payload["data"]["attributes"]["model"] = "invalid-model"
response = authenticated_client.post(
reverse("lighthouseconfiguration-list"),
data=payload,
content_type=API_JSON_CONTENT_TYPE,
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
errors = response.json()["errors"]
assert any("model" in error["source"]["pointer"] for error in errors)
def test_lighthouse_config_create_invalid_temperature_range(
self, authenticated_client, valid_config_payload
):
"""Test that temperature validation fails when out of range"""
payload = valid_config_payload.copy()
payload["data"]["attributes"]["temperature"] = 2.0 # Out of range
response = authenticated_client.post(
reverse("lighthouseconfiguration-list"),
data=payload,
content_type=API_JSON_CONTENT_TYPE,
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
errors = response.json()["errors"]
assert any("temperature" in error["source"]["pointer"] for error in errors)
def test_lighthouse_config_create_invalid_max_tokens(
self, authenticated_client, valid_config_payload
):
"""Test that max_tokens validation fails with invalid value"""
payload = valid_config_payload.copy()
payload["data"]["attributes"]["max_tokens"] = -1 # Invalid value
response = authenticated_client.post(
reverse("lighthouseconfiguration-list"),
data=payload,
content_type=API_JSON_CONTENT_TYPE,
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
errors = response.json()["errors"]
assert any("max_tokens" in error["source"]["pointer"] for error in errors)
def test_lighthouse_config_create_missing_required_fields(
self, authenticated_client
):
"""Test that validation fails when required fields are missing"""
payload = {"data": {"type": "lighthouse-configuration", "attributes": {}}}
response = authenticated_client.post(
reverse("lighthouseconfiguration-list"),
data=payload,
content_type=API_JSON_CONTENT_TYPE,
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
errors = response.json()["errors"]
# Check for required fields
required_fields = ["name", "api_key", "model"]
for field in required_fields:
assert any(field in error["source"]["pointer"] for error in errors)
def test_lighthouse_config_create_duplicate(
self, authenticated_client, valid_config_payload
):
# Create first config
response = authenticated_client.post(
reverse("lighthouseconfiguration-list"),
data=valid_config_payload,
content_type=API_JSON_CONTENT_TYPE,
)
assert response.status_code == status.HTTP_201_CREATED
# Try to create second config for same tenant
response = authenticated_client.post(
reverse("lighthouseconfiguration-list"),
data=valid_config_payload,
content_type=API_JSON_CONTENT_TYPE,
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert (
"Lighthouse configuration already exists for this tenant"
in response.json()["errors"][0]["detail"]
)
def test_lighthouse_config_retrieve(
self, authenticated_client, lighthouse_config_fixture
):
"""Test retrieving a lighthouse config"""
response = authenticated_client.get(
reverse(
"lighthouseconfiguration-detail",
kwargs={"pk": lighthouse_config_fixture.id},
)
)
assert response.status_code == status.HTTP_200_OK
data = response.json()["data"]
assert data["attributes"]["name"] == lighthouse_config_fixture.name
assert data["attributes"]["api_key"] == "*" * len(
lighthouse_config_fixture.api_key
)
def test_lighthouse_config_update(
self, authenticated_client, lighthouse_config_fixture
):
update_payload = {
"data": {
"type": "lighthouse-configuration",
"id": str(lighthouse_config_fixture.id),
"attributes": {
"name": "Updated Config",
"model": "gpt-4o-mini",
"temperature": 0.5,
},
}
}
response = authenticated_client.patch(
reverse(
"lighthouseconfiguration-detail",
kwargs={"pk": lighthouse_config_fixture.id},
),
data=update_payload,
content_type=API_JSON_CONTENT_TYPE,
)
assert response.status_code == status.HTTP_200_OK
data = response.json()["data"]
assert data["attributes"]["name"] == "Updated Config"
assert data["attributes"]["model"] == "gpt-4o-mini"
assert data["attributes"]["temperature"] == 0.5
def test_lighthouse_config_delete(
self, authenticated_client, lighthouse_config_fixture
):
config_id = lighthouse_config_fixture.id
response = authenticated_client.delete(
reverse("lighthouseconfiguration-detail", kwargs={"pk": config_id})
)
assert response.status_code == status.HTTP_204_NO_CONTENT
# Verify deletion
response = authenticated_client.get(
reverse("lighthouseconfiguration-detail", kwargs={"pk": config_id})
)
assert response.status_code == status.HTTP_404_NOT_FOUND
@patch("api.v1.views.openai.OpenAI")
def test_lighthouse_config_check_connection(
self, mock_openai, authenticated_client, lighthouse_config_fixture
):
config_id = lighthouse_config_fixture.id
# Mock successful API call
mock_client = Mock()
mock_client.models.list.return_value = Mock(
data=[Mock(id="gpt-4o"), Mock(id="gpt-4o-mini")]
)
mock_openai.return_value = mock_client
# Check connection
response = authenticated_client.get(
reverse(
"lighthouseconfiguration-check-connection", kwargs={"pk": config_id}
)
)
assert response.status_code == status.HTTP_200_OK
assert response.json()["data"]["detail"] == "Connection successful!"
assert "gpt-4o" in response.json()["data"]["available_models"]
assert "gpt-4o-mini" in response.json()["data"]["available_models"]
def test_lighthouse_config_get_key(
self, authenticated_client, lighthouse_config_fixture, valid_config_payload
):
config_id = lighthouse_config_fixture.id
expected_api_key = valid_config_payload["data"]["attributes"]["api_key"]
response = authenticated_client.get(
reverse("lighthouseconfiguration-detail", kwargs={"pk": config_id})
+ "?fields[lighthouse-config]=api_key"
)
assert response.status_code == status.HTTP_200_OK
assert response.json()["data"]["attributes"]["api_key"] == expected_api_key
def test_lighthouse_config_filters(
self, authenticated_client, lighthouse_config_fixture
):
# Test name filter
response = authenticated_client.get(
reverse("lighthouseconfiguration-list")
+ "?filter[name]="
+ lighthouse_config_fixture.name
)
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == 1
# Test model filter
response = authenticated_client.get(
reverse("lighthouseconfiguration-list") + "?filter[model]=gpt-4o"
)
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == 1
# Test is_active filter
response = authenticated_client.get(
reverse("lighthouseconfiguration-list") + "?filter[is_active]=true"
)
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == 1
def test_lighthouse_config_sorting(
self, authenticated_client, lighthouse_config_fixture
):
# Test sorting by name
response = authenticated_client.get(
reverse("lighthouseconfiguration-list") + "?sort=name"
)
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == 1
# Test sorting by inserted_at
response = authenticated_client.get(
reverse("lighthouseconfiguration-list") + "?sort=-inserted_at"
)
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == 1
+136
@@ -1,4 +1,5 @@
import json
import re
from datetime import datetime, timedelta, timezone
from django.conf import settings
@@ -19,6 +20,7 @@ from api.models import (
IntegrationProviderRelationship,
Invitation,
InvitationRoleRelationship,
LighthouseConfiguration,
Membership,
Provider,
ProviderGroup,
@@ -2059,3 +2061,137 @@ class IntegrationUpdateSerializer(BaseWriteIntegrationSerializer):
IntegrationProviderRelationship.objects.bulk_create(new_relationships)
return super().update(instance, validated_data)
class LighthouseConfigSerializer(RLSSerializer):
"""
Serializer for the LighthouseConfig model.
"""
api_key = serializers.CharField(required=False)
class Meta:
model = LighthouseConfiguration
fields = [
"id",
"name",
"api_key",
"model",
"temperature",
"max_tokens",
"business_context",
"is_active",
"inserted_at",
"updated_at",
"url",
]
extra_kwargs = {
"id": {"read_only": True},
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
}
def to_representation(self, instance):
data = super().to_representation(instance)
# Check if api_key is specifically requested in fields param
request = self.context.get("request")
fields_param = request.query_params.get("fields[lighthouse-config]", "") if request else ""
if fields_param == "api_key":
# Return decrypted key if specifically requested
data["api_key"] = instance.api_key_decoded if instance.api_key else None
else:
# Return masked key for general requests
data["api_key"] = "*" * len(instance.api_key) if instance.api_key else None
return data
class LighthouseConfigCreateSerializer(RLSSerializer, BaseWriteSerializer):
"""Serializer for creating new Lighthouse configurations."""
api_key = serializers.CharField(write_only=True, required=True)
class Meta:
model = LighthouseConfiguration
fields = [
"id",
"name",
"api_key",
"model",
"temperature",
"max_tokens",
"business_context",
"is_active",
"inserted_at",
"updated_at",
]
extra_kwargs = {
"id": {"read_only": True},
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
}
def validate(self, attrs):
tenant_id = self.context.get("request").tenant_id
if LighthouseConfiguration.objects.filter(tenant_id=tenant_id).exists():
raise serializers.ValidationError(
{
"tenant_id": "Lighthouse configuration already exists for this tenant."
}
)
return super().validate(attrs)
def create(self, validated_data):
api_key = validated_data.pop("api_key")
# Validate API key format before creating the instance
if not api_key:
raise serializers.ValidationError({"api_key": "API key is required"})
# Validate OpenAI API key format
openai_key_pattern = r"^sk-[\w-]+T3BlbkFJ[\w-]+$"
if not re.match(openai_key_pattern, api_key):
raise serializers.ValidationError(
{"api_key": "Invalid OpenAI API key format."}
)
instance = super().create(validated_data)
instance.api_key_decoded = api_key
instance.save()
return instance
class LighthouseConfigUpdateSerializer(BaseWriteSerializer):
"""
Serializer for updating LighthouseConfig instances.
"""
api_key = serializers.CharField(write_only=True, required=False)
class Meta:
model = LighthouseConfiguration
fields = [
"id",
"name",
"api_key",
"model",
"temperature",
"max_tokens",
"business_context",
"is_active",
]
extra_kwargs = {
"id": {"read_only": True},
"name": {"required": False},
"model": {"required": False},
"temperature": {"required": False},
"max_tokens": {"required": False},
}
def update(self, instance, validated_data):
api_key = validated_data.pop("api_key", None)
instance = super().update(instance, validated_data)
if api_key:
instance.api_key_decoded = api_key
instance.save()
return instance
+6
@@ -13,6 +13,7 @@ from api.v1.views import (
IntegrationViewSet,
InvitationAcceptViewSet,
InvitationViewSet,
LighthouseConfigViewSet,
MembershipViewSet,
OverviewViewSet,
ProviderGroupProvidersRelationshipView,
@@ -49,6 +50,11 @@ router.register(
router.register(r"overviews", OverviewViewSet, basename="overview")
router.register(r"schedules", ScheduleViewSet, basename="schedule")
router.register(r"integrations", IntegrationViewSet, basename="integration")
router.register(
r"lighthouse-configuration",
LighthouseConfigViewSet,
basename="lighthouseconfiguration",
)
tenants_router = routers.NestedSimpleRouter(router, r"tenants", lookup="tenant")
tenants_router.register(
+107
@@ -2,6 +2,7 @@ import glob
import os
from datetime import datetime, timedelta, timezone
import openai
import sentry_sdk
from allauth.socialaccount.providers.github.views import GitHubOAuth2Adapter
from allauth.socialaccount.providers.google.views import GoogleOAuth2Adapter
@@ -89,6 +90,7 @@ from api.models import (
Finding,
Integration,
Invitation,
LighthouseConfiguration,
Membership,
Provider,
ProviderGroup,
@@ -132,6 +134,9 @@ from api.v1.serializers import (
InvitationCreateSerializer,
InvitationSerializer,
InvitationUpdateSerializer,
LighthouseConfigCreateSerializer,
LighthouseConfigSerializer,
LighthouseConfigUpdateSerializer,
MembershipSerializer,
OverviewFindingSerializer,
OverviewProviderSerializer,
@@ -3208,3 +3213,105 @@ class IntegrationViewSet(BaseRLSViewSet):
context = super().get_serializer_context()
context["allowed_providers"] = self.allowed_providers
return context
@extend_schema_view(
list=extend_schema(
tags=["Lighthouse"],
summary="List all Lighthouse configurations",
description="Retrieve a list of all Lighthouse configurations.",
),
create=extend_schema(
tags=["Lighthouse"],
summary="Create a new Lighthouse configuration",
description="Create a new Lighthouse configuration with the specified details.",
),
partial_update=extend_schema(
tags=["Lighthouse"],
summary="Partially update a Lighthouse configuration",
description="Update certain fields of an existing Lighthouse configuration.",
),
retrieve=extend_schema(
tags=["Lighthouse"],
summary="Retrieve a Lighthouse configuration",
description="Fetch detailed information about a specific Lighthouse configuration by its ID. Add query param `fields[lighthouse-config]=api_key` to get API key.",
),
destroy=extend_schema(
tags=["Lighthouse"],
summary="Delete a Lighthouse configuration",
description="Remove a Lighthouse configuration by its ID.",
),
check_connection=extend_schema(
tags=["Lighthouse"],
summary="Check the connection to the OpenAI API",
description="Verify the connection to the OpenAI API for a specific Lighthouse configuration.",
),
)
class LighthouseConfigViewSet(BaseRLSViewSet):
"""
API endpoint for managing Lighthouse configuration.
"""
filterset_fields = {
"name": ["exact", "icontains"],
"model": ["exact", "icontains"],
"is_active": ["exact"],
"inserted_at": ["gte", "lte"],
}
ordering_fields = ["name", "inserted_at", "updated_at", "is_active"]
ordering = ["-inserted_at"]
def get_queryset(self):
return LighthouseConfiguration.objects.filter(tenant_id=self.request.tenant_id)
def get_serializer_class(self):
if self.action == "create":
return LighthouseConfigCreateSerializer
elif self.action == "partial_update":
return LighthouseConfigUpdateSerializer
return LighthouseConfigSerializer
@action(detail=True, methods=["get"])
def check_connection(self, request, pk=None):
"""
Check the connection to the OpenAI API.
"""
instance = self.get_object()
if not instance.api_key_decoded:
return Response(
{"detail": "API key is invalid or missing."},
status=status.HTTP_400_BAD_REQUEST,
)
try:
client = openai.OpenAI(
api_key=instance.api_key_decoded,
)
models = client.models.list()
return Response(
{
"detail": "Connection successful!",
"available_models": [model.id for model in models.data],
},
status=status.HTTP_200_OK,
)
except Exception as e:
return Response(
{"detail": f"Connection failed: {str(e)}"},
status=status.HTTP_400_BAD_REQUEST,
)
def create(self, request, *args, **kwargs):
"""Create new Lighthouse configuration"""
serializer = self.get_serializer(data=request.data)
serializer.is_valid(raise_exception=True)
instance = serializer.save()
headers = self.get_success_headers(serializer.data)
return Response(
LighthouseConfigSerializer(
instance, context=self.get_serializer_context()
).data,
status=status.HTTP_201_CREATED,
headers=headers,
)
+15
@@ -20,6 +20,7 @@ from api.models import (
Integration,
IntegrationProviderRelationship,
Invitation,
LighthouseConfiguration,
Membership,
Provider,
ProviderGroup,
@@ -1039,6 +1040,20 @@ def backfill_scan_metadata_fixture(scans_fixture, findings_fixture):
backfill_resource_scan_summaries(tenant_id=tenant_id, scan_id=scan_id)
@pytest.fixture
def lighthouse_config_fixture(authenticated_client, tenants_fixture):
return LighthouseConfiguration.objects.create(
tenant_id=tenants_fixture[0].id,
name="OpenAI",
api_key_decoded="sk-test1234567890T3BlbkFJtest1234567890",
model="gpt-4o",
temperature=0,
max_tokens=4000,
business_context="Test business context",
is_active=True,
)
@pytest.fixture(scope="function")
def latest_scan_finding(authenticated_client, providers_fixture, resources_fixture):
provider = providers_fixture[0]
+11 -11
@@ -4,9 +4,9 @@ services:
build:
context: ./api
dockerfile: Dockerfile
target: dev
# target: dev
environment:
- DJANGO_SETTINGS_MODULE=config.django.devel
- DJANGO_SETTINGS_MODULE=config.django.production
- DJANGO_LOGGING_FORMATTER=${LOGGING_FORMATTER:-human_readable}
env_file:
- path: .env
@@ -24,21 +24,21 @@ services:
condition: service_healthy
entrypoint:
- "/home/prowler/docker-entrypoint.sh"
- "dev"
- "prod"
ui-dev:
build:
context: ./ui
dockerfile: Dockerfile
target: dev
target: prod
env_file:
- path: .env
required: false
ports:
- 3000:3000
volumes:
- "./ui:/app"
- "/app/node_modules"
# volumes:
# - "./ui:/app"
# - "/app/node_modules"
postgres:
image: postgres:16.3-alpine3.20
@@ -80,9 +80,9 @@ services:
build:
context: ./api
dockerfile: Dockerfile
target: dev
# target: dev
environment:
- DJANGO_SETTINGS_MODULE=config.django.devel
- DJANGO_SETTINGS_MODULE=config.django.production
env_file:
- path: .env
required: false
@@ -101,9 +101,9 @@ services:
build:
context: ./api
dockerfile: Dockerfile
target: dev
# target: dev
environment:
- DJANGO_SETTINGS_MODULE=config.django.devel
- DJANGO_SETTINGS_MODULE=config.django.production
env_file:
- path: ./.env
required: false
+1150 -126
File diff suppressed because it is too large.
+49 -35
@@ -98,6 +98,7 @@ from prowler.providers.common.provider import Provider
from prowler.providers.common.quick_inventory import run_provider_quick_inventory
from prowler.providers.gcp.models import GCPOutputOptions
from prowler.providers.github.models import GithubOutputOptions
from prowler.providers.iac.models import IACOutputOptions
from prowler.providers.kubernetes.models import KubernetesOutputOptions
from prowler.providers.m365.models import M365OutputOptions
from prowler.providers.nhn.models import NHNOutputOptions
@@ -175,11 +176,13 @@ def prowler():
# Load compliance frameworks
logger.debug("Loading compliance frameworks from .json files")
bulk_compliance_frameworks = Compliance.get_bulk(provider)
# Complete checks metadata with the compliance framework specification
bulk_checks_metadata = update_checks_metadata_with_compliance(
bulk_compliance_frameworks, bulk_checks_metadata
)
# Skip compliance frameworks for IAC provider
if provider != "iac":
bulk_compliance_frameworks = Compliance.get_bulk(provider)
# Complete checks metadata with the compliance framework specification
bulk_checks_metadata = update_checks_metadata_with_compliance(
bulk_compliance_frameworks, bulk_checks_metadata
)
# Update checks metadata if the --custom-checks-metadata-file is present
custom_checks_metadata = None
@@ -231,39 +234,45 @@ def prowler():
if not args.only_logs:
global_provider.print_credentials()
# Import custom checks from folder
if checks_folder:
custom_checks = parse_checks_from_folder(global_provider, checks_folder)
# Workaround to be able to execute custom checks alongside all checks if nothing is explicitly set
if (
not checks_file
and not checks
and not services
and not severities
and not compliance_framework
and not categories
):
checks_to_execute.update(custom_checks)
# Skip service and check loading for IAC provider
if provider != "iac":
# Import custom checks from folder
if checks_folder:
custom_checks = parse_checks_from_folder(global_provider, checks_folder)
# Workaround to be able to execute custom checks alongside all checks if nothing is explicitly set
if (
not checks_file
and not checks
and not services
and not severities
and not compliance_framework
and not categories
):
checks_to_execute.update(custom_checks)
# Exclude checks if -e/--excluded-checks
if excluded_checks:
checks_to_execute = exclude_checks_to_run(checks_to_execute, excluded_checks)
# Exclude checks if -e/--excluded-checks
if excluded_checks:
checks_to_execute = exclude_checks_to_run(
checks_to_execute, excluded_checks
)
# Exclude services if --excluded-services
if excluded_services:
checks_to_execute = exclude_services_to_run(
checks_to_execute, excluded_services, provider
# Exclude services if --excluded-services
if excluded_services:
checks_to_execute = exclude_services_to_run(
checks_to_execute, excluded_services, provider
)
# Once the provider is set and we have the eventual checks based on the resource identifier,
# it is time to check what Prowler's checks are going to be executed
checks_from_resources = (
global_provider.get_checks_to_execute_by_audit_resources()
)
# Intersect checks from resources with checks to execute so we only run the checks that apply to the resources with the specified ARNs or tags
if getattr(args, "resource_arn", None) or getattr(args, "resource_tag", None):
checks_to_execute = checks_to_execute.intersection(checks_from_resources)
# Once the provider is set and we have the eventual checks based on the resource identifier,
# it is time to check what Prowler's checks are going to be executed
checks_from_resources = global_provider.get_checks_to_execute_by_audit_resources()
# Intersect checks from resources with checks to execute so we only run the checks that apply to the resources with the specified ARNs or tags
if getattr(args, "resource_arn", None) or getattr(args, "resource_tag", None):
checks_to_execute = checks_to_execute.intersection(checks_from_resources)
# Sort final check list
checks_to_execute = sorted(checks_to_execute)
# Sort final check list
checks_to_execute = sorted(checks_to_execute)
# Setup Output Options
if provider == "aws":
@@ -294,6 +303,8 @@ def prowler():
output_options = NHNOutputOptions(
args, bulk_checks_metadata, global_provider.identity
)
elif provider == "iac":
output_options = IACOutputOptions(args, bulk_checks_metadata)
# Run the quick inventory for the provider if available
if hasattr(args, "quick_inventory") and args.quick_inventory:
@@ -303,7 +314,10 @@ def prowler():
# Execute checks
findings = []
if len(checks_to_execute):
if provider == "iac":
# For IAC provider, run the scan directly
findings = global_provider.run()
elif len(checks_to_execute):
findings = execute_checks(
checks_to_execute,
global_provider,
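The new dispatch condenses to a small sketch: the IaC provider produces findings directly (Checkov selects its own checks), while every other provider executes the precomputed check list. The callables below are stand-ins for the real `global_provider.run` and `execute_checks`:

```python
def collect_findings(provider, checks_to_execute, run_iac_scan, execute_checks):
    # IaC scans delegate to Checkov, which drives its own check selection;
    # other providers run the sorted Prowler check list.
    if provider == "iac":
        return run_iac_scan()
    if checks_to_execute:
        return execute_checks(checks_to_execute)
    return []


print(collect_findings("iac", [], lambda: ["iac-finding"], lambda c: []))  # ['iac-finding']
print(collect_findings("aws", ["check1"], lambda: [], lambda c: list(c)))  # ['check1']
print(collect_findings("aws", [], lambda: [], lambda c: list(c)))          # []
```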
File diff suppressed because it is too large.
File diff suppressed because it is too large.
File diff suppressed because it is too large.
+1
@@ -30,6 +30,7 @@ class Provider(str, Enum):
KUBERNETES = "kubernetes"
M365 = "m365"
GITHUB = "github"
IAC = "iac"
NHN = "nhn"
+4
@@ -20,6 +20,10 @@ def load_checks_to_execute(
) -> set:
"""Generate the list of checks to execute based on the cloud provider and the input arguments given"""
try:
# Bypass check loading for IAC provider since it uses Checkov directly
if provider == "iac":
return set()
# Local subsets
checks_to_execute = set()
check_aliases = {}
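The early return makes check loading a deliberate no-op for the IaC provider. A reduced sketch with a simplified signature (the real function takes several more arguments):

```python
def load_checks_to_execute(provider: str, requested_checks: set[str]) -> set[str]:
    # Checkov drives IaC scanning directly, so Prowler loads no checks for it.
    if provider == "iac":
        return set()
    return set(requested_checks)


print(load_checks_to_execute("iac", {"s3_bucket_public_access"}))  # set()
print(load_checks_to_execute("aws", {"s3_bucket_public_access"}))  # {'s3_bucket_public_access'}
```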
+22 -18
@@ -3,7 +3,7 @@ import sys
from enum import Enum
from typing import Optional, Union
from pydantic import BaseModel, ValidationError, root_validator
from pydantic.v1 import BaseModel, ValidationError, root_validator
from prowler.lib.check.utils import list_compliance_modules
from prowler.lib.logger import logger
@@ -56,22 +56,26 @@ class ENS_Requirement_Attribute(BaseModel):
class Generic_Compliance_Requirement_Attribute(BaseModel):
"""Generic Compliance Requirement Attribute"""
ItemId: Optional[str]
Section: Optional[str]
SubSection: Optional[str]
SubGroup: Optional[str]
Service: Optional[str]
Type: Optional[str]
ItemId: Optional[str] = None
Section: Optional[str] = None
SubSection: Optional[str] = None
SubGroup: Optional[str] = None
Service: Optional[str] = None
Type: Optional[str] = None
class CIS_Requirement_Attribute_Profile(str):
class CIS_Requirement_Attribute_Profile(str, Enum):
"""CIS Requirement Attribute Profile"""
Level_1 = "Level 1"
Level_2 = "Level 2"
E3_Level_1 = "E3 Level 1"
E3_Level_2 = "E3 Level 2"
E5_Level_1 = "E5 Level 1"
E5_Level_2 = "E5 Level 2"
class CIS_Requirement_Attribute_AssessmentStatus(str):
class CIS_Requirement_Attribute_AssessmentStatus(str, Enum):
"""CIS Requirement Attribute Assessment Status"""
Manual = "Manual"
@@ -83,7 +87,7 @@ class CIS_Requirement_Attribute(BaseModel):
"""CIS Requirement Attribute"""
Section: str
SubSection: Optional[str]
SubSection: Optional[str] = None
Profile: CIS_Requirement_Attribute_Profile
AssessmentStatus: CIS_Requirement_Attribute_AssessmentStatus
Description: str
@@ -92,7 +96,7 @@ class CIS_Requirement_Attribute(BaseModel):
RemediationProcedure: str
AuditProcedure: str
AdditionalInformation: str
DefaultValue: Optional[str]
DefaultValue: Optional[str] = None
References: str
@@ -104,7 +108,7 @@ class AWS_Well_Architected_Requirement_Attribute(BaseModel):
WellArchitectedQuestionId: str
WellArchitectedPracticeId: str
Section: str
SubSection: Optional[str]
SubSection: Optional[str] = None
LevelOfRisk: str
AssessmentMethod: str
Description: str
@@ -177,10 +181,10 @@ class KISA_ISMSP_Requirement_Attribute(BaseModel):
Domain: str
Subdomain: str
Section: str
AuditChecklist: Optional[list[str]]
RelatedRegulations: Optional[list[str]]
AuditEvidence: Optional[list[str]]
NonComplianceCases: Optional[list[str]]
AuditChecklist: Optional[list[str]] = None
RelatedRegulations: Optional[list[str]] = None
AuditEvidence: Optional[list[str]] = None
NonComplianceCases: Optional[list[str]] = None
# Prowler ThreatScore Requirement Attribute
@@ -203,7 +207,7 @@ class Compliance_Requirement(BaseModel):
Id: str
Description: str
Name: Optional[str]
Name: Optional[str] = None
Attributes: list[
Union[
CIS_Requirement_Attribute,
@@ -224,7 +228,7 @@ class Compliance(BaseModel):
Framework: str
Provider: str
Version: Optional[str]
Version: Optional[str] = None
Description: str
Requirements: list[
Union[
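The `CIS_Requirement_Attribute_Profile(str)` to `(str, Enum)` change in this file is a real fix, not cosmetics: a bare `str` subclass with class attributes accepts any string, while a `str`-backed `Enum` restricts values to the declared levels (which is what pydantic relies on for validation). A stdlib illustration of the difference:

```python
from enum import Enum

# Mirrors CIS_Requirement_Attribute_Profile after the fix: only the declared
# levels are valid values; anything else raises ValueError on lookup.
class Profile(str, Enum):
    Level_1 = "Level 1"
    Level_2 = "Level 2"

assert Profile("Level 1") is Profile.Level_1
try:
    Profile("Level 99")  # not a declared level
except ValueError:
    pass  # a plain `class Profile(str)` would have accepted this silently
```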
+26 -3
@@ -5,9 +5,9 @@ import sys
from abc import ABC, abstractmethod
from dataclasses import asdict, dataclass, is_dataclass
from enum import Enum
from typing import Any, Dict, Set
from typing import Any, Dict, Optional, Set
from pydantic import BaseModel, ValidationError, validator
from pydantic.v1 import BaseModel, ValidationError, validator
from prowler.config.config import Provider
from prowler.lib.check.compliance_models import Compliance
@@ -118,7 +118,7 @@ class CheckMetadata(BaseModel):
Notes: str
# We set the compliance to None to
# store the compliance later if supplied
Compliance: list = None
Compliance: Optional[list[Any]] = []
@validator("Categories", each_item=True, pre=True, always=True)
def valid_category(value):
@@ -607,6 +607,29 @@ class CheckReportM365(Check_Report):
self.location = resource_location
@dataclass
class CheckReportIAC(Check_Report):
"""Contains the IAC Check's finding information using Checkov."""
resource_name: str
resource_path: str
resource_line_range: str
def __init__(self, metadata: dict = {}, finding: dict = {}) -> None:
"""
Initialize the IAC Check's finding information from a Checkov failed_check dict.
Args:
metadata (dict): Optional check metadata (can be None).
finding (dict): A single failed_check result from Checkov's JSON output.
"""
super().__init__(metadata, finding)
self.resource_name = getattr(finding, "resource", "")
self.resource_path = getattr(finding, "file_path", "")
self.resource_line_range = getattr(finding, "file_line_range", "")
@dataclass
class CheckReportNHN(Check_Report):
"""Contains the NHN Check's finding information."""
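Note that `CheckReportIAC.__init__` reads fields with `getattr`, which works on attribute-bearing objects such as Checkov's `Record` instances; on a plain dict (as the type hint suggests) `getattr` would always fall back to the empty-string defaults. A stand-in sketch of the extraction, with `SimpleNamespace` mimicking a Checkov record:

```python
from types import SimpleNamespace

def extract_iac_fields(finding) -> dict:
    # getattr() reads object attributes; a plain dict has no such
    # attributes, so every field would silently fall back to "".
    return {
        "resource_name": getattr(finding, "resource", ""),
        "resource_path": getattr(finding, "file_path", ""),
        "resource_line_range": getattr(finding, "file_line_range", ""),
    }

# Hypothetical Checkov-like record for illustration.
record = SimpleNamespace(
    resource="aws_s3_bucket.b", file_path="main.tf", file_line_range=[1, 9]
)
```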
+8
@@ -14,6 +14,10 @@ def recover_checks_from_provider(
Returns a list of tuples with the following format (check_name, check_path)
"""
try:
# Bypass check loading for IAC provider since it uses Checkov directly
if provider == "iac":
return []
checks = []
modules = list_modules(provider, service)
for module_name in modules:
@@ -59,6 +63,10 @@ def recover_checks_from_service(service_list: list, provider: str) -> set:
Returns a set of checks from the given services
"""
try:
# Bypass check loading for IAC provider since it uses Checkov directly
if provider == "iac":
return set()
checks = set()
service_list = [
"awslambda" if service == "lambda" else service for service in service_list
+3 -2
@@ -26,16 +26,17 @@ class ProwlerArgumentParser:
self.parser = argparse.ArgumentParser(
prog="prowler",
formatter_class=RawTextHelpFormatter,
usage="prowler [-h] [--version] {aws,azure,gcp,kubernetes,m365,nhn,dashboard} ...",
usage="prowler [-h] [--version] {aws,azure,gcp,kubernetes,m365,nhn,dashboard,iac} ...",
epilog="""
Available Cloud Providers:
{aws,azure,gcp,kubernetes,m365,nhn}
{aws,azure,gcp,kubernetes,m365,nhn,iac}
aws AWS Provider
azure Azure Provider
gcp GCP Provider
kubernetes Kubernetes Provider
github GitHub Provider
m365 Microsoft 365 Provider
iac IaC Provider (Preview)
nhn NHN Provider (Unofficial)
Available components:
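Beyond the usage and epilog strings changed above, registering a new provider means wiring an `iac` subcommand into the argparse tree. A minimal stdlib sketch of how that slots in (the `--scan-path` argument name is a hypothetical example, not taken from this diff):

```python
import argparse

# Minimal stand-in for ProwlerArgumentParser's subcommand registration.
parser = argparse.ArgumentParser(prog="prowler")
subparsers = parser.add_subparsers(dest="provider")
iac = subparsers.add_parser("iac", help="IaC Provider (Preview)")
iac.add_argument("--scan-path", default=".", help="directory to scan with Checkov")

args = parser.parse_args(["iac", "--scan-path", "./terraform"])
```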
+2 -2
@@ -2,7 +2,7 @@ from json import dump
from os import SEEK_SET
from typing import Optional
from pydantic import BaseModel, validator
from pydantic.v1 import BaseModel, validator
from prowler.config.config import prowler_version, timestamp_utc
from prowler.lib.logger import logger
@@ -279,7 +279,7 @@ class Resource(BaseModel):
Id: str
Partition: str
Region: str
Tags: Optional[dict]
Tags: Optional[dict] = None
@validator("Tags", pre=True, always=True)
def tags_cannot_be_empty_dict(tags):
@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
class AWSWellArchitectedModel(BaseModel):
@@ -19,7 +19,7 @@ class AWSWellArchitectedModel(BaseModel):
Requirements_Attributes_WellArchitectedQuestionId: str
Requirements_Attributes_WellArchitectedPracticeId: str
Requirements_Attributes_Section: str
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_SubSection: Optional[str] = None
Requirements_Attributes_LevelOfRisk: str
Requirements_Attributes_AssessmentMethod: str
Requirements_Attributes_Description: str
+10 -10
@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
class AWSCISModel(BaseModel):
@@ -16,7 +16,7 @@ class AWSCISModel(BaseModel):
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Section: str
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_SubSection: Optional[str] = None
Requirements_Attributes_Profile: str
Requirements_Attributes_AssessmentStatus: str
Requirements_Attributes_Description: str
@@ -25,9 +25,9 @@ class AWSCISModel(BaseModel):
Requirements_Attributes_RemediationProcedure: str
Requirements_Attributes_AuditProcedure: str
Requirements_Attributes_AdditionalInformation: str
Requirements_Attributes_DefaultValue: Optional[
str
] # TODO Optional for now since it's not present in the CIS 1.5, 2.0 and 3.0 AWS benchmark
Requirements_Attributes_DefaultValue: Optional[str] = (
None # TODO Optional for now since it's not present in the CIS 1.5, 2.0 and 3.0 AWS benchmark
)
Requirements_Attributes_References: str
Status: str
StatusExtended: str
@@ -50,7 +50,7 @@ class AzureCISModel(BaseModel):
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Section: str
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_SubSection: Optional[str] = None
Requirements_Attributes_Profile: str
Requirements_Attributes_AssessmentStatus: str
Requirements_Attributes_Description: str
@@ -82,7 +82,7 @@ class M365CISModel(BaseModel):
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Section: str
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_SubSection: Optional[str] = None
Requirements_Attributes_Profile: str
Requirements_Attributes_AssessmentStatus: str
Requirements_Attributes_Description: str
@@ -114,7 +114,7 @@ class GCPCISModel(BaseModel):
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Section: str
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_SubSection: Optional[str] = None
Requirements_Attributes_Profile: str
Requirements_Attributes_AssessmentStatus: str
Requirements_Attributes_Description: str
@@ -145,8 +145,8 @@ class KubernetesCISModel(BaseModel):
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Section: str
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_Profile: str
Requirements_Attributes_SubSection: Optional[str] = None
Requirements_Attributes_Profile: Optional[str] = None
Requirements_Attributes_AssessmentStatus: str
Requirements_Attributes_Description: str
Requirements_Attributes_RationaleStatement: str
+1 -1
@@ -1,4 +1,4 @@
from pydantic import BaseModel
from pydantic.v1 import BaseModel
class AWSENSModel(BaseModel):
@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
class GenericComplianceModel(BaseModel):
@@ -15,11 +15,11 @@ class GenericComplianceModel(BaseModel):
AssessmentDate: str
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Section: Optional[str]
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_SubGroup: Optional[str]
Requirements_Attributes_Service: Optional[str]
Requirements_Attributes_Type: Optional[str]
Requirements_Attributes_Section: Optional[str] = None
Requirements_Attributes_SubSection: Optional[str] = None
Requirements_Attributes_SubGroup: Optional[str] = None
Requirements_Attributes_Service: Optional[str] = None
Requirements_Attributes_Type: Optional[str] = None
Status: str
StatusExtended: str
ResourceId: str
@@ -1,4 +1,4 @@
from pydantic import BaseModel
from pydantic.v1 import BaseModel
class AWSISO27001Model(BaseModel):
@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
class AWSKISAISMSPModel(BaseModel):
@@ -19,10 +19,10 @@ class AWSKISAISMSPModel(BaseModel):
Requirements_Attributes_Domain: str
Requirements_Attributes_Subdomain: str
Requirements_Attributes_Section: str
Requirements_Attributes_AuditChecklist: Optional[list[str]]
Requirements_Attributes_RelatedRegulations: Optional[list[str]]
Requirements_Attributes_AuditEvidence: Optional[list[str]]
Requirements_Attributes_NonComplianceCases: Optional[list[str]]
Requirements_Attributes_AuditChecklist: Optional[list[str]] = None
Requirements_Attributes_RelatedRegulations: Optional[list[str]] = None
Requirements_Attributes_AuditEvidence: Optional[list[str]] = None
Requirements_Attributes_NonComplianceCases: Optional[list[str]] = None
Status: str
StatusExtended: str
ResourceId: str
@@ -1,4 +1,4 @@
from pydantic import BaseModel
from pydantic.v1 import BaseModel
class AWSMitreAttackModel(BaseModel):
@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
class ProwlerThreatScoreAWSModel(BaseModel):
@@ -17,7 +17,7 @@ class ProwlerThreatScoreAWSModel(BaseModel):
Requirements_Description: str
Requirements_Attributes_Title: str
Requirements_Attributes_Section: str
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_SubSection: Optional[str] = None
Requirements_Attributes_AttributeDescription: str
Requirements_Attributes_AdditionalInformation: str
Requirements_Attributes_LevelOfRisk: int
@@ -44,7 +44,7 @@ class ProwlerThreatScoreAzureModel(BaseModel):
Requirements_Description: str
Requirements_Attributes_Title: str
Requirements_Attributes_Section: str
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_SubSection: Optional[str] = None
Requirements_Attributes_AttributeDescription: str
Requirements_Attributes_AdditionalInformation: str
Requirements_Attributes_LevelOfRisk: int
@@ -71,7 +71,7 @@ class ProwlerThreatScoreGCPModel(BaseModel):
Requirements_Description: str
Requirements_Attributes_Title: str
Requirements_Attributes_Section: str
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_SubSection: Optional[str] = None
Requirements_Attributes_AttributeDescription: str
Requirements_Attributes_AdditionalInformation: str
Requirements_Attributes_LevelOfRisk: int
@@ -98,7 +98,7 @@ class ProwlerThreatScoreM365Model(BaseModel):
Requirements_Description: str
Requirements_Attributes_Title: str
Requirements_Attributes_Section: str
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_SubSection: Optional[str] = None
Requirements_Attributes_AttributeDescription: str
Requirements_Attributes_AdditionalInformation: str
Requirements_Attributes_LevelOfRisk: int
+15 -3
@@ -3,7 +3,7 @@ from datetime import datetime
from types import SimpleNamespace
from typing import Optional, Union
from pydantic import BaseModel, Field, ValidationError
from pydantic.v1 import BaseModel, Field, ValidationError
from prowler.config.config import prowler_version
from prowler.lib.check.models import (
@@ -38,7 +38,7 @@ class Finding(BaseModel):
account_organization_uid: Optional[str] = None
account_organization_name: Optional[str] = None
metadata: CheckMetadata
account_tags: dict = {}
account_tags: dict = Field(default_factory=dict)
uid: str
status: Status
status_extended: str
@@ -50,7 +50,7 @@ class Finding(BaseModel):
resource_tags: dict = Field(default_factory=dict)
partition: Optional[str] = None
region: str
compliance: dict
compliance: dict = Field(default_factory=dict)
prowler_version: str = prowler_version
raw: dict = Field(default_factory=dict)
@@ -282,6 +282,18 @@ class Finding(BaseModel):
output_data["resource_uid"] = check_output.resource_id
output_data["region"] = check_output.location
elif provider.type == "iac":
output_data["auth_method"] = "local" # Until we support remote repos
output_data["account_uid"] = "iac"
output_data["account_name"] = "iac"
output_data["resource_name"] = check_output.resource["resource"]
output_data["resource_uid"] = check_output.resource["resource"]
output_data["region"] = check_output.resource_path
output_data["resource_line_range"] = check_output.resource_line_range
output_data["framework"] = (
check_output.check_metadata.ServiceName
) # TODO: can we get the framework from the check_output?
# check_output Unique ID
# TODO: move this to a function
# TODO: in Azure, GCP and K8s there are findings without resource_name
+47 -2
@@ -41,7 +41,7 @@ class HTML(Output):
<td>{finding_status}</td>
<td>{finding.metadata.Severity.value}</td>
<td>{finding.metadata.ServiceName}</td>
<td>{finding.region.lower()}</td>
<td>{":".join([finding.resource_metadata['file_path'], "-".join(map(str, finding.resource_metadata['file_line_range']))]) if finding.metadata.Provider == "iac" else finding.region.lower()}</td>
<td>{finding.metadata.CheckID.replace("_", "<wbr />_")}</td>
<td>{finding.metadata.CheckTitle}</td>
<td>{finding.resource_uid.replace("<", "&lt;").replace(">", "&gt;").replace("_", "<wbr />_")}</td>
@@ -204,7 +204,7 @@ class HTML(Output):
<th scope="col">Status</th>
<th scope="col">Severity</th>
<th scope="col">Service Name</th>
<th scope="col">Region</th>
<th scope="col">{"File" if provider.type == "iac" else "Region"}</th>
<th style="width:20%" scope="col">Check ID</th>
<th style="width:20%" scope="col">Check Title</th>
<th scope="col">Resource ID</th>
@@ -689,6 +689,51 @@ class HTML(Output):
)
return ""
@staticmethod
def get_iac_assessment_summary(provider: Provider) -> str:
"""
get_iac_assessment_summary gets the HTML assessment summary for the provider
Args:
provider (Provider): the provider object
Returns:
str: the HTML assessment summary
"""
try:
return f"""
<div class="col-md-2">
<div class="card">
<div class="card-header">
IAC Assessment Summary
</div>
<ul class="list-group
list-group-flush">
<li class="list-group-item">
<b>IAC path:</b> {provider.scan_path}
</li>
</ul>
</div>
</div>
<div class="col-md-4">
<div class="card">
<div class="card-header">
IAC Credentials
</div>
<ul class="list-group
list-group-flush">
<li class="list-group-item">
<b>IAC authentication method:</b> local
</li>
</ul>
</div>
</div>"""
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
return ""
@staticmethod
def get_assessment_summary(provider: Provider) -> str:
"""
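The table cell changed earlier in this file packs the IaC finding location into one string: the file path joined by `:` to a dash-separated line range. Recreated as a standalone helper for clarity (the helper name is hypothetical; the expression comes from the hunk):

```python
def iac_location(file_path: str, file_line_range: list) -> str:
    # e.g. ("main.tf", [3, 17]) -> "main.tf:3-17"
    return ":".join([file_path, "-".join(map(str, file_line_range))])
```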
+2 -2
@@ -3,9 +3,9 @@ from datetime import datetime
from typing import List
from py_ocsf_models.events.base_event import SeverityID, StatusID
from py_ocsf_models.events.findings.detection_finding import DetectionFinding
from py_ocsf_models.events.findings.detection_finding import (
TypeID as DetectionFindingTypeID,
DetectionFinding,
DetectionFindingTypeID,
)
from py_ocsf_models.events.findings.finding import ActivityID, FindingInformation
from py_ocsf_models.objects.account import Account, TypeID
+3
@@ -54,6 +54,9 @@ def display_summary_table(
elif provider.type == "nhn":
entity_type = "Tenant Domain"
audited_entities = provider.identity.tenant_domain
elif provider.type == "iac":
entity_type = "Directory"
audited_entities = provider.scan_path
# Check if there are findings and that they are not all MANUAL
if findings and not all(finding.status == "MANUAL" for finding in findings):
+2 -2
@@ -1,7 +1,7 @@
import os
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.providers.aws.exceptions.exceptions import AWSIAMRoleARNMissingFieldsError
@@ -10,7 +10,7 @@ class ARN(BaseModel):
arn: str
partition: str
service: str
region: Optional[str] # In IAM ARN's do not have region
region: Optional[str] = None # In IAM ARN's do not have region
account_id: str
resource: str
resource_type: str
@@ -1,7 +1,7 @@
from typing import Optional
from botocore.exceptions import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -2,7 +2,7 @@ from typing import Optional
from venv import logger
from botocore.client import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.providers.aws.lib.service.service import AWSService
@@ -101,6 +101,6 @@ class Account(AWSService):
class Contact(BaseModel):
type: str
email: Optional[str]
name: Optional[str]
phone_number: Optional[str]
email: Optional[str] = None
name: Optional[str] = None
phone_number: Optional[str] = None
@@ -1,7 +1,7 @@
from datetime import datetime
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -111,5 +111,5 @@ class Certificate(BaseModel):
tags: Optional[list] = []
expiration_days: int
in_use: bool
transparency_logging: Optional[bool]
transparency_logging: Optional[bool] = None
region: str
@@ -1,7 +1,7 @@
from typing import Optional
from botocore.exceptions import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -224,11 +224,11 @@ class Stage(BaseModel):
arn: str
logging: bool
client_certificate: bool
waf: Optional[str]
waf: Optional[str] = None
tags: Optional[list] = []
tracing_enabled: Optional[bool]
cache_enabled: Optional[bool]
cache_data_encrypted: Optional[bool]
tracing_enabled: Optional[bool] = None
cache_enabled: Optional[bool] = None
cache_data_encrypted: Optional[bool] = None
class PathResourceMethods(BaseModel):
@@ -1,7 +1,7 @@
from typing import Optional
from botocore.exceptions import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import BaseModel, Field
from pydantic.v1 import BaseModel, Field
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,4 +1,4 @@
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -7,7 +7,7 @@ from typing import Any, Optional
import requests
from botocore.client import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -196,12 +196,12 @@ class Function(BaseModel):
name: str
arn: str
security_groups: list
runtime: Optional[str]
runtime: Optional[str] = None
environment: dict = None
region: str
policy: dict = None
policy: dict = {}
code: LambdaCode = None
url_config: URLConfig = None
vpc_id: Optional[str]
subnet_ids: Optional[set]
vpc_id: Optional[str] = None
subnet_ids: Optional[set] = None
tags: Optional[list] = []
@@ -2,7 +2,7 @@ from datetime import datetime
from typing import Optional
from botocore.client import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -227,7 +227,7 @@ class BackupVault(BaseModel):
locked: bool
min_retention_days: int = None
max_retention_days: int = None
tags: Optional[list]
tags: Optional[list] = None
class BackupPlan(BaseModel):
@@ -236,17 +236,17 @@ class BackupPlan(BaseModel):
region: str
name: str
version_id: str
last_execution_date: Optional[datetime]
last_execution_date: Optional[datetime] = None
advanced_settings: list
tags: Optional[list]
tags: Optional[list] = None
class BackupReportPlan(BaseModel):
arn: str
region: str
name: str
last_attempted_execution_date: Optional[datetime]
last_successful_execution_date: Optional[datetime]
last_attempted_execution_date: Optional[datetime] = None
last_successful_execution_date: Optional[datetime] = None
class RecoveryPoint(BaseModel):
@@ -256,4 +256,4 @@ class RecoveryPoint(BaseModel):
backup_vault_name: str
encrypted: bool
backup_vault_region: str
tags: Optional[list]
tags: Optional[list] = None
@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -116,7 +116,7 @@ class Guardrail(BaseModel):
region: str
tags: Optional[list] = []
sensitive_information_filter: bool = False
prompt_attack_filter_strength: Optional[str]
prompt_attack_filter_strength: Optional[str] = None
class BedrockAgent(AWSService):
@@ -169,6 +169,6 @@ class Agent(BaseModel):
id: str
name: str
arn: str
guardrail_id: Optional[str]
guardrail_id: Optional[str] = None
region: str
tags: Optional[list] = []
@@ -1,7 +1,7 @@
from typing import Optional
from botocore.client import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,7 +1,7 @@
from enum import Enum
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -186,7 +186,7 @@ class SSLSupportMethod(Enum):
class DefaultCacheConfigBehaviour(BaseModel):
realtime_log_config_arn: Optional[str]
realtime_log_config_arn: Optional[str] = None
viewer_protocol_policy: ViewerProtocolPolicy
field_level_encryption_id: str
@@ -196,8 +196,8 @@ class Origin(BaseModel):
domain_name: str
origin_protocol_policy: str
origin_ssl_protocols: list[str]
origin_access_control: Optional[str]
s3_origin_config: Optional[dict]
origin_access_control: Optional[str] = None
s3_origin_config: Optional[dict] = None
class Distribution(BaseModel):
@@ -207,14 +207,14 @@ class Distribution(BaseModel):
id: str
region: str
logging_enabled: bool = False
default_cache_config: Optional[DefaultCacheConfigBehaviour]
geo_restriction_type: Optional[GeoRestrictionType]
default_cache_config: Optional[DefaultCacheConfigBehaviour] = None
geo_restriction_type: Optional[GeoRestrictionType] = None
origins: list[Origin]
web_acl_id: str = ""
default_certificate: Optional[bool]
default_root_object: Optional[str]
viewer_protocol_policy: Optional[str]
default_certificate: Optional[bool] = None
default_root_object: Optional[str] = None
viewer_protocol_policy: Optional[str] = None
tags: Optional[list] = []
origin_failover: Optional[bool]
ssl_support_method: Optional[SSLSupportMethod]
certificate: Optional[str]
origin_failover: Optional[bool] = None
ssl_support_method: Optional[SSLSupportMethod] = None
certificate: Optional[str] = None
@@ -2,7 +2,7 @@ from datetime import datetime, timedelta
from typing import Optional
from botocore.client import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -3,7 +3,7 @@ from datetime import datetime, timezone
from typing import Optional
from botocore.exceptions import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -278,8 +278,8 @@ class Logs(AWSService):
class MetricAlarm(BaseModel):
arn: str
name: str
metric: Optional[str]
name_space: Optional[str]
metric: Optional[str] = None
name_space: Optional[str] = None
region: str
tags: Optional[list] = []
alarm_actions: list
@@ -310,7 +310,7 @@ class MetricFilter(BaseModel):
name: str
metric: str
pattern: str
log_group: Optional[LogGroup]
log_group: Optional[LogGroup] = None
region: str
@@ -2,7 +2,7 @@ from enum import Enum
from typing import Optional
from botocore.exceptions import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -246,7 +246,7 @@ class Package(BaseModel):
"""Details of a package"""
name: str
namespace: Optional[str]
namespace: Optional[str] = None
format: str
origin_configuration: OriginConfiguration
latest_version: LatestPackageVersion
@@ -1,7 +1,7 @@
import datetime
from typing import List, Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -211,15 +211,15 @@ class Project(BaseModel):
name: str
arn: str
region: str
last_build: Optional[Build]
last_invoked_time: Optional[datetime.datetime]
buildspec: Optional[str]
source: Optional[Source]
last_build: Optional[Build] = None
last_invoked_time: Optional[datetime.datetime] = None
buildspec: Optional[str] = None
source: Optional[Source] = None
secondary_sources: Optional[list[Source]] = []
environment_variables: Optional[List[EnvironmentVariable]]
s3_logs: Optional[s3Logs]
cloudwatch_logs: Optional[CloudWatchLogs]
tags: Optional[list]
environment_variables: Optional[List[EnvironmentVariable]] = []
s3_logs: Optional[s3Logs] = None
cloudwatch_logs: Optional[CloudWatchLogs] = None
tags: Optional[list] = []
class ExportConfig(BaseModel):
@@ -233,6 +233,6 @@ class ReportGroup(BaseModel):
arn: str
name: str
region: str
status: Optional[str]
export_config: Optional[ExportConfig]
tags: Optional[list]
status: Optional[str] = None
export_config: Optional[ExportConfig] = None
tags: Optional[list] = []
@@ -1,7 +1,7 @@
from datetime import datetime
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,7 +1,7 @@
from typing import Dict, List, Optional
from botocore.exceptions import ClientError
from pydantic import BaseModel, Field
from pydantic.v1 import BaseModel, Field
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,7 +1,7 @@
from typing import Optional
from botocore.exceptions import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -3,7 +3,7 @@ from enum import Enum
from typing import Optional, Union
from botocore.client import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,4 +1,4 @@
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.providers.aws.lib.service.service import AWSService
@@ -1,7 +1,7 @@
import json
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,7 +1,7 @@
from typing import Optional
from botocore.client import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,5 +1,5 @@
from botocore.client import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -2,7 +2,7 @@ import json
from typing import Optional
from botocore.client import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -3,7 +3,7 @@ from ipaddress import IPv4Address, IPv6Address, ip_address
from typing import Optional, Union
from botocore.client import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -3,7 +3,7 @@ from json import loads
from typing import Optional
from botocore.exceptions import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,7 +1,7 @@
from re import sub
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -2,7 +2,7 @@ import json
from typing import Optional
from botocore.client import ClientError
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import BaseModel
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,6 +1,6 @@
from typing import Optional
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,6 +1,6 @@
from typing import Optional
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,7 +1,7 @@
from typing import Dict, Optional
from botocore.client import ClientError
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -2,7 +2,7 @@ from enum import Enum
from typing import Optional
from botocore.client import ClientError
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -2,7 +2,7 @@ import json
from typing import Optional
from botocore.exceptions import ClientError
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -2,7 +2,7 @@ from enum import Enum
from typing import Dict, List, Optional
from botocore.client import ClientError
-from pydantic import BaseModel, Field
+from pydantic.v1 import BaseModel, Field
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,5 +1,5 @@
from botocore.client import ClientError
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,6 +1,6 @@
from typing import Optional
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -2,7 +2,7 @@ import json
from typing import Optional
from botocore.client import ClientError
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,6 +1,6 @@
from typing import Optional
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -2,7 +2,7 @@ import json
from typing import Dict, List, Optional
from botocore.exceptions import ClientError
-from pydantic import BaseModel, Field
+from pydantic.v1 import BaseModel, Field
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,6 +1,6 @@
from typing import Optional
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -3,7 +3,7 @@ from datetime import datetime
from typing import Optional
from botocore.client import ClientError
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.config.config import encoding_format_utf_8
from prowler.lib.logger import logger
@@ -1,4 +1,4 @@
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.providers.aws.lib.service.service import AWSService
@@ -1,4 +1,4 @@
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,7 +1,7 @@
from enum import Enum
from typing import Dict, List, Optional
-from pydantic import BaseModel, Field
+from pydantic.v1 import BaseModel, Field
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,7 +1,7 @@
import json
from typing import Optional
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,6 +1,6 @@
from typing import Dict, List
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,4 +1,4 @@
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.providers.aws.lib.service.service import AWSService
@@ -1,4 +1,4 @@
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,7 +1,7 @@
from enum import Enum
from typing import Dict, List
-from pydantic import BaseModel, Field
+from pydantic.v1 import BaseModel, Field
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,7 +1,7 @@
from typing import Optional
from botocore.client import ClientError
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,7 +1,7 @@
from enum import Enum
from typing import Optional
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -1,7 +1,7 @@
from json import JSONDecodeError, loads
from typing import Optional
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -2,7 +2,7 @@ import json
from typing import Optional
from botocore.client import ClientError
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -2,7 +2,7 @@ from datetime import datetime
from typing import Optional
from botocore.client import ClientError
-from pydantic import BaseModel
+from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
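Every hunk above makes the same one-line change: swapping `from pydantic import ...` for Pydantic v2's bundled `pydantic.v1` compatibility namespace, so the existing v1-style models keep working after the dependency upgrade. A rewrite this mechanical is usually scripted; the regex and helper below are an illustrative stdlib sketch of that rewrite, not the actual migration tooling used for this branch:

```python
import re

# Match a bare "from pydantic import ..." statement. Because "\s+" cannot
# match the dot in "pydantic.v1", lines already migrated are left alone,
# which makes the rewrite safe to re-run.
_IMPORT_RE = re.compile(r"^(\s*from\s+)pydantic(\s+import\s+)", re.MULTILINE)


def migrate_pydantic_imports(source: str) -> str:
    """Rewrite v1-style pydantic imports to the pydantic.v1 namespace."""
    return _IMPORT_RE.sub(r"\1pydantic.v1\2", source)


print(migrate_pydantic_imports("from pydantic import BaseModel, Field\n"))
# -> from pydantic.v1 import BaseModel, Field
```

Note that submodule imports such as `from pydantic.types import ...` are intentionally not matched; the diff above only touches bare `from pydantic import` statements.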

Some files were not shown because too many files have changed in this diff.