Compare commits


61 Commits
5.2.3 ... 5.3.0

Author SHA1 Message Date
Mario Rodriguez Lopez
45d359c84a feat(microsof365): Add documentation and compliance file (#6195)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-02-10 17:18:43 +01:00
Pablo Lara
6049e5d4e8 chore: Update prowler api version (#6877)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-02-10 17:39:08 +05:45
Víctor Fernández Poyatos
dfd377f89e chore(api): Update changelog and specs (#6876) 2025-02-10 12:25:04 +01:00
Víctor Fernández Poyatos
37e6c52c14 chore: Add needed steps for API in PR template (#6875) 2025-02-10 12:24:51 +01:00
Pepe Fagoaga
d6a7f4d88f fix(kubernetes): Change UID validation (#6869)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-10 11:24:34 +01:00
Pepe Fagoaga
239cda0a90 chore: Rename dashboard table latest findings (#6873)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-10 11:24:27 +01:00
dependabot[bot]
4a821e425b chore(deps-dev): bump mkdocs-material from 9.6.2 to 9.6.3 (#6871)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:24:21 +01:00
Sergio Garcia
e1a2f0c204 docs(eks): add documentation about EKS onboarding (#6853)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-02-10 11:24:16 +01:00
Prowler Bot
c70860c733 chore(regions_update): Changes in regions for AWS services (#6858)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-10 11:24:09 +01:00
Víctor Fernández Poyatos
05e71e033f feat(findings): Use ArrayAgg and subqueries on metadata endpoint (#6863)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-02-10 11:24:03 +01:00
Pablo Lara
5164ec2eb9 feat: implement new functionality with inserted_at__gte in findings a… (#6864) 2025-02-10 11:23:58 +01:00
Víctor Fernández Poyatos
be18dac4f9 docs: Add details about user creation in Prowler app (#6862) 2025-02-10 11:23:52 +01:00
dependabot[bot]
bb126c242f chore(deps): bump microsoft-kiota-abstractions from 1.9.1 to 1.9.2 (#6856)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:23:47 +01:00
Víctor Fernández Poyatos
e27780a856 feat(findings): Require date filters for findings endpoints (#6800) 2025-02-10 11:23:41 +01:00
Pranay Girase
196ec51751 fix(typo): typos in Dashboard and Report in HTML (#6847) 2025-02-10 11:23:33 +01:00
Prowler Bot
86abf9e64c chore(regions_update): Changes in regions for AWS services (#6848)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-10 11:23:28 +01:00
dependabot[bot]
9d8be578e3 chore(deps): bump trufflesecurity/trufflehog from 3.88.4 to 3.88.5 (#6844)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:23:19 +01:00
Mario Rodriguez Lopez
b3aa800082 feat(entra): add new check entra_thirdparty_integrated_apps_not_allowed (#6357)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:23:13 +01:00
Ogonna Iwunze
501674a778 feat(kms): add kms_cmk_not_multi_region AWS check (#6794)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:23:01 +01:00
Prowler Bot
5ff6ae79d8 chore(regions_update): Changes in regions for AWS services (#6821)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-10 11:21:08 +01:00
Mario Rodriguez Lopez
e518a869ab feat(entra): add new entra service for Microsoft365 (#6326)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:20:58 +01:00
Mario Rodriguez Lopez
43927a62f3 feat(microsoft365): add new check admincenter_settings_password_never_expire (#6023)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:20:52 +01:00
dependabot[bot]
335980c8d8 chore(deps): bump kubernetes from 31.0.0 to 32.0.0 (#6678)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:20:42 +01:00
Pablo Lara
ca3ee378db style(forms): improve spacing consistency (#6814) 2025-02-10 11:17:11 +01:00
Pablo Lara
c05bc1068a chore(forms): improvements to the sign-in and sign-up forms (#6813) 2025-02-10 11:17:03 +01:00
Drew Kerrigan
2e3164636d docs(): add description of changed and new delta values to prowler app tutorial (#6801) 2025-02-10 11:16:56 +01:00
dependabot[bot]
c34e07fc40 chore(deps): bump pytz from 2024.2 to 2025.1 (#6765)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:50 +01:00
dependabot[bot]
6022122a61 chore(deps-dev): bump mkdocs-material from 9.5.50 to 9.6.2 (#6799)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:44 +01:00
dependabot[bot]
f65f5e4b46 chore(deps-dev): bump pylint from 3.3.3 to 3.3.4 (#6721)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:37 +01:00
Pablo Lara
dee17733a0 feat(scans): show scan details right after launch (#6791) 2025-02-10 11:16:30 +01:00
dependabot[bot]
cddda1e64e chore(deps): bump trufflesecurity/trufflehog from 3.88.2 to 3.88.4 (#6760)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:23 +01:00
dependabot[bot]
f7b873db03 chore(deps): bump google-api-python-client from 2.159.0 to 2.160.0 (#6720)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:17 +01:00
Víctor Fernández Poyatos
792bc70d0a feat(schedules): Rework daily schedule to always show the next scan (#6700) 2025-02-10 11:16:11 +01:00
Hugo Pereira Brito
185491b061 fix: microsoft365 mutelist (#6724) 2025-02-10 11:16:05 +01:00
dependabot[bot]
3af8a43480 chore(deps): bump microsoft-kiota-abstractions from 1.6.8 to 1.9.1 (#6734)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:15:57 +01:00
Pablo Lara
fd78406b29 fix: Enable hot reloading when using Docker Compose for UI (#6750) 2025-02-10 11:15:49 +01:00
Matt Johnson
4758b258a3 chore: Update Google Analytics ID across all docs.prowler.com sites. (#6730) 2025-02-10 11:15:41 +01:00
dependabot[bot]
015e2b3b88 chore(deps): bump uuid from 10.0.0 to 11.0.5 in /ui (#6516)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:15:33 +01:00
Mario Rodriguez Lopez
e184c9cb61 feat(m365): add Microsoft 365 provider (#5902)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:15:24 +01:00
dependabot[bot]
9004a01183 chore(deps): bump azure-mgmt-web from 7.3.1 to 8.0.0 (#6680)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:15:18 +01:00
dependabot[bot]
dd65ba3d9e chore(deps): bump msgraph-sdk from 1.17.0 to 1.18.0 (#6679)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:15:08 +01:00
dependabot[bot]
bba616a18f chore(deps): bump azure-storage-blob from 12.24.0 to 12.24.1 (#6666)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:14:33 +01:00
Pepe Fagoaga
aa0f8d2981 chore: bump for next minor (#6672) 2025-02-10 11:13:42 +01:00
Paolo Frigo
2511d6ffa9 docs: update # of checks, services, frameworks and categories (#6528)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:12:02 +01:00
Prowler Bot
27329457be fix(dashboard): adjust the bar chart display (#6868)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-07 11:04:23 -05:00
Prowler Bot
7189f3d526 fix(aws): key error for detect-secrets (#6865)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-02-07 10:04:54 -05:00
Prowler Bot
58e7589c9d fix(kms): handle error in DescribeKey function (#6842)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-05 15:27:43 -05:00
Prowler Bot
d60f4b5ded fix(cloudfront): fix false positive in s3 origins (#6838)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-02-05 13:36:49 -05:00
Prowler Bot
4c2ec094f6 fix(findings): Spelling mistakes correction (#6834)
Co-authored-by: Gary Mclean <gary.mclean@krrv.io>
2025-02-05 11:53:17 -05:00
Prowler Bot
395ecaff5b fix(directoryservice): handle ClientException (#6828)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-02-05 11:20:13 -05:00
Prowler Bot
c39506ef7d fix(aws) wording of report.status_extended in awslambda_function_not_publicly_accessible (#6831)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-02-05 11:18:27 -05:00
Prowler Bot
eb90d479e2 chore(aws_audit_manager_control_tower_guardrails): add checks to reqs (#6803)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 19:47:04 -05:00
Prowler Bot
b92a73f5ea fix(elasticache): InvalidReplicationGroupStateFault error (#6820)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-02-04 16:08:26 -05:00
Prowler Bot
ad121f3059 chore(deps-dev): bump moto from 5.0.27 to 5.0.28 (#6817)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 14:25:04 -05:00
Pepe Fagoaga
70e4c5a44e chore: bump for next patch (#6764) 2025-02-03 15:25:23 -05:00
Prowler Bot
b5a46b7b59 fix(cis_1.5_aws): add checks to needed reqs (#6798)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-03 11:37:53 -05:00
Prowler Bot
f1a97cd166 fix(cis_1.4_aws): add checks to needed reqs (#6796)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-03 11:37:39 -05:00
Prowler Bot
0774508093 fix(cis_2.0_aws): add checks to needed reqs (#6787)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-03 11:37:24 -05:00
Prowler Bot
0664ce6b94 fix(gcp): fix wrong provider value in check (#6789)
Co-authored-by: secretcod3r <101349794+secretcod3r@users.noreply.github.com>
2025-02-03 10:32:53 -05:00
Prowler Bot
407c779c52 fix(findings): remove default status filtering (#6785)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-03 15:25:11 +01:00
Prowler Bot
c60f13f23f fix(findings): order findings by inserted_at DESC (#6783)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-03 11:57:45 +01:00
143 changed files with 5486 additions and 790 deletions


@@ -17,6 +17,11 @@ Please include a summary of the change and which issue is fixed. List any depend
- [ ] Review if backport is needed.
- [ ] Review if is needed to change the [Readme.md](https://github.com/prowler-cloud/prowler/blob/master/README.md)
+#### API
+- [ ] Verify if API specs need to be regenerated.
+- [ ] Check if version updates are required (e.g., specs, Poetry, etc.).
+- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/api/CHANGELOG.md), if applicable.
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.


@@ -11,7 +11,7 @@ jobs:
with:
fetch-depth: 0
- name: TruffleHog OSS
-uses: trufflesecurity/trufflehog@v3.88.2
+uses: trufflesecurity/trufflehog@v3.88.5
with:
path: ./
base: ${{ github.event.repository.default_branch }}


@@ -71,10 +71,13 @@ It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, Fe
| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
|---|---|---|---|---|
-| AWS | 561 | 81 -> `prowler aws --list-services` | 30 -> `prowler aws --list-compliance` | 9 -> `prowler aws --list-categories` |
-| GCP | 77 | 13 -> `prowler gcp --list-services` | 4 -> `prowler gcp --list-compliance` | 2 -> `prowler gcp --list-categories`|
-| Azure | 139 | 18 -> `prowler azure --list-services` | 5 -> `prowler azure --list-compliance` | 2 -> `prowler azure --list-categories` |
-| Kubernetes | 83 | 7 -> `prowler kubernetes --list-services` | 2 -> `prowler kubernetes --list-compliance` | 7 -> `prowler kubernetes --list-categories` |
+| AWS | 564 | 82 | 30 | 10 |
+| GCP | 77 | 13 | 4 | 3 |
+| Azure | 140 | 18 | 5 | 3 |
+| Kubernetes | 83 | 7 | 2 | 7 |
+| Microsoft365 | 5 | 2 | 1 | 0 |
+> You can list the checks, services, compliance frameworks and categories with `prowler <provider> --list-checks`, `prowler <provider> --list-services`, `prowler <provider> --list-compliance` and `prowler <provider> --list-categories`.
# 💻 Installation

api/CHANGELOG.md (new file, 20 lines)

@@ -0,0 +1,20 @@
# Prowler API Changelog
All notable changes to the **Prowler API** are documented in this file.
---
## [Unreleased]
---
## [v1.4.0] (Prowler v5.3.0) - 2025-02-10
### Changed
- Daily scheduled scan instances are now created beforehand with `SCHEDULED` state [(#6700)](https://github.com/prowler-cloud/prowler/pull/6700).
- Findings endpoints now require at least one date filter [(#6800)](https://github.com/prowler-cloud/prowler/pull/6800).
- Findings metadata endpoint received a performance improvement [(#6863)](https://github.com/prowler-cloud/prowler/pull/6863).
- Increase the allowed length of the provider UID for Kubernetes providers [(#6869)](https://github.com/prowler-cloud/prowler/pull/6869).
---
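The "require at least one date filter" change ([#6800](https://github.com/prowler-cloud/prowler/pull/6800)) can be sketched as a small validation helper. This is a minimal illustration, not the actual Prowler API code; the filter names mirror the `inserted_at__gte`-style filters mentioned in the changelog and in PR #6864.

```python
# Illustrative sketch of requiring at least one date filter on a
# findings-style endpoint. NOT the real Prowler API implementation;
# filter names are assumed from the changelog entries above.
DATE_FILTERS = {"inserted_at", "inserted_at__gte", "inserted_at__lte"}

def validate_findings_filters(params: dict) -> None:
    """Raise ValueError unless at least one date filter is present."""
    if not DATE_FILTERS & params.keys():
        raise ValueError(
            "At least one date filter is required (e.g. inserted_at__gte)."
        )

# A request filtered only by status would now be rejected, while one
# carrying a date filter passes:
validate_findings_filters({"inserted_at__gte": "2025-02-01"})  # OK
```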

api/poetry.lock (generated, 243 diff lines)

@@ -620,13 +620,13 @@ msrest = ">=0.7.1"
[[package]]
name = "azure-mgmt-web"
-version = "7.3.1"
+version = "8.0.0"
description = "Microsoft Azure Web Apps Management Client Library for Python"
optional = false
python-versions = ">=3.8"
files = [
-{file = "azure-mgmt-web-7.3.1.tar.gz", hash = "sha256:87b771436bc99a7a8df59d0ad185b96879a06dce14764a06b3fc3dafa8fcb56b"},
-{file = "azure_mgmt_web-7.3.1-py3-none-any.whl", hash = "sha256:ccf881e3ab31c3fdbf9cbff32773d9c0006b5dcd621ea074d7ec89e51049fb72"},
+{file = "azure_mgmt_web-8.0.0-py3-none-any.whl", hash = "sha256:0536aac05bfc673b56ed930f2966b77856e84df675d376e782a7af6bb92449af"},
+{file = "azure_mgmt_web-8.0.0.tar.gz", hash = "sha256:c8d9c042c09db7aacb20270a9effed4d4e651e365af32d80897b84dc7bf35098"},
]
[package.dependencies]
@@ -637,13 +637,13 @@ typing-extensions = ">=4.6.0"
[[package]]
name = "azure-storage-blob"
-version = "12.24.0"
+version = "12.24.1"
description = "Microsoft Azure Blob Storage Client Library for Python"
optional = false
python-versions = ">=3.8"
files = [
-{file = "azure_storage_blob-12.24.0-py3-none-any.whl", hash = "sha256:4f0bb4592ea79a2d986063696514c781c9e62be240f09f6397986e01755bc071"},
-{file = "azure_storage_blob-12.24.0.tar.gz", hash = "sha256:eaaaa1507c8c363d6e1d1342bd549938fdf1adec9b1ada8658c8f5bf3aea844e"},
+{file = "azure_storage_blob-12.24.1-py3-none-any.whl", hash = "sha256:77fb823fdbac7f3c11f7d86a5892e2f85e161e8440a7489babe2195bf248f09e"},
+{file = "azure_storage_blob-12.24.1.tar.gz", hash = "sha256:052b2a1ea41725ba12e2f4f17be85a54df1129e13ea0321f5a2fcc851cbf47d4"},
]
[package.dependencies]
@@ -1943,13 +1943,13 @@ grpcio-gcp = ["grpcio-gcp (>=0.2.2,<1.0.dev0)"]
[[package]]
name = "google-api-python-client"
-version = "2.159.0"
+version = "2.160.0"
description = "Google API Client Library for Python"
optional = false
python-versions = ">=3.7"
files = [
-{file = "google_api_python_client-2.159.0-py2.py3-none-any.whl", hash = "sha256:baef0bb631a60a0bd7c0bf12a5499e3a40cd4388484de7ee55c1950bf820a0cf"},
-{file = "google_api_python_client-2.159.0.tar.gz", hash = "sha256:55197f430f25c907394b44fa078545ffef89d33fd4dca501b7db9f0d8e224bd6"},
+{file = "google_api_python_client-2.160.0-py2.py3-none-any.whl", hash = "sha256:63d61fb3e4cf3fb31a70a87f45567c22f6dfe87bbfa27252317e3e2c42900db4"},
+{file = "google_api_python_client-2.160.0.tar.gz", hash = "sha256:a8ccafaecfa42d15d5b5c3134ced8de08380019717fc9fb1ed510ca58eca3b7e"},
]
[package.dependencies]
@@ -2361,13 +2361,13 @@ zookeeper = ["kazoo (>=2.8.0)"]
[[package]]
name = "kubernetes"
-version = "31.0.0"
+version = "32.0.0"
description = "Kubernetes python client"
optional = false
python-versions = ">=3.6"
files = [
-{file = "kubernetes-31.0.0-py2.py3-none-any.whl", hash = "sha256:bf141e2d380c8520eada8b351f4e319ffee9636328c137aa432bc486ca1200e1"},
-{file = "kubernetes-31.0.0.tar.gz", hash = "sha256:28945de906c8c259c1ebe62703b56a03b714049372196f854105afe4e6d014c0"},
+{file = "kubernetes-32.0.0-py2.py3-none-any.whl", hash = "sha256:60fd8c29e8e43d9c553ca4811895a687426717deba9c0a66fb2dcc3f5ef96692"},
+{file = "kubernetes-32.0.0.tar.gz", hash = "sha256:319fa840345a482001ac5d6062222daeb66ec4d1bcb3087402aed685adf0aecb"},
]
[package.dependencies]
@@ -2523,13 +2523,13 @@ files = [
[[package]]
name = "microsoft-kiota-abstractions"
-version = "1.6.8"
+version = "1.9.2"
description = "Core abstractions for kiota generated libraries in Python"
optional = false
-python-versions = "<4.0,>=3.8"
+python-versions = "<4.0,>=3.9"
files = [
-{file = "microsoft_kiota_abstractions-1.6.8-py3-none-any.whl", hash = "sha256:12819dee24d5aaa31e99683d938f65e50cbc446de087df244cd26c3326ec4e15"},
-{file = "microsoft_kiota_abstractions-1.6.8.tar.gz", hash = "sha256:7070affabfa7182841646a0c8491cbb240af366aff2b9132f0caa45c4837dd78"},
+{file = "microsoft_kiota_abstractions-1.9.2-py3-none-any.whl", hash = "sha256:a8853d272a84da59d6a2fe11a76c28e9c55bdab268a345ba48e918cb6822b607"},
+{file = "microsoft_kiota_abstractions-1.9.2.tar.gz", hash = "sha256:29cdafe8d0672f23099556e0b120dca6231c752cca9393e1e0092fa9ca594572"},
]
[package.dependencies]
@@ -2539,99 +2539,94 @@ std-uritemplate = ">=2.0.0"
[[package]]
name = "microsoft-kiota-authentication-azure"
-version = "1.6.8"
+version = "1.9.2"
description = "Core abstractions for kiota generated libraries in Python"
optional = false
-python-versions = "<4.0,>=3.8"
+python-versions = "<4.0,>=3.9"
files = [
-{file = "microsoft_kiota_authentication_azure-1.6.8-py3-none-any.whl", hash = "sha256:50455789b7133e27fbccec839d93e40d2637d18593a93921ae1338880c5b5b3b"},
-{file = "microsoft_kiota_authentication_azure-1.6.8.tar.gz", hash = "sha256:fef23f43cd4d3b9ef839c8b3d1f675ec4a1120c150f963d8c4551c5e19ac3b36"},
+{file = "microsoft_kiota_authentication_azure-1.9.2-py3-none-any.whl", hash = "sha256:56840f8b15df8aedfd143fb2deb7cc7fae4ac0bafb1a50546b7313a7b3ab4ca0"},
+{file = "microsoft_kiota_authentication_azure-1.9.2.tar.gz", hash = "sha256:171045f522a93d9340fbddc4cabb218f14f1d9d289e82e535b3d9291986c3d5a"},
]
[package.dependencies]
aiohttp = ">=3.8.0"
azure-core = ">=1.21.1"
-microsoft-kiota-abstractions = ">=1.6.8,<1.7.0"
+microsoft-kiota-abstractions = ">=1.9.2,<1.10.0"
opentelemetry-api = ">=1.27.0"
opentelemetry-sdk = ">=1.27.0"
[[package]]
name = "microsoft-kiota-http"
-version = "1.6.8"
+version = "1.9.2"
description = "Core abstractions for kiota generated libraries in Python"
optional = false
-python-versions = "<4.0,>=3.8"
+python-versions = "<4.0,>=3.9"
files = [
-{file = "microsoft_kiota_http-1.6.8-py3-none-any.whl", hash = "sha256:7ff76a308351d885453185d6a6538c47a64ebdc7661cce46a904e89e2ceb9a1d"},
-{file = "microsoft_kiota_http-1.6.8.tar.gz", hash = "sha256:67242690b79a30c0cadf823675249269e4bc020283e3d65b33af7d771df64df8"},
+{file = "microsoft_kiota_http-1.9.2-py3-none-any.whl", hash = "sha256:3a2d930a70d0184d9f4848473f929ee892462cae1acfaf33b2d193f1828c76c2"},
+{file = "microsoft_kiota_http-1.9.2.tar.gz", hash = "sha256:2ba3d04a3d1d5d600736eebc1e33533d54d87799ac4fbb92c9cce4a97809af61"},
]
[package.dependencies]
-httpx = {version = ">=0.28", extras = ["http2"]}
-microsoft-kiota-abstractions = ">=1.6.8,<1.7.0"
+httpx = {version = ">=0.25,<1.0.0", extras = ["http2"]}
+microsoft-kiota-abstractions = ">=1.9.2,<1.10.0"
opentelemetry-api = ">=1.27.0"
opentelemetry-sdk = ">=1.27.0"
urllib3 = ">=2.2.2,<3.0.0"
[[package]]
name = "microsoft-kiota-serialization-form"
-version = "1.6.8"
+version = "1.9.2"
description = "Core abstractions for kiota generated libraries in Python"
optional = false
-python-versions = "<4.0,>=3.8"
+python-versions = "<4.0,>=3.9"
files = [
-{file = "microsoft_kiota_serialization_form-1.6.8-py3-none-any.whl", hash = "sha256:ca7dd19e173aa87c68b38c5056cc0921570c8c86f3ba5511d1616cf97e2f0f67"},
-{file = "microsoft_kiota_serialization_form-1.6.8.tar.gz", hash = "sha256:bb9eb98b3abf596b4bfe208014dff948361ff48a757316ac58e19c31ab8d640a"},
+{file = "microsoft_kiota_serialization_form-1.9.2-py3-none-any.whl", hash = "sha256:7b997efb2c8750b1d4fbc00878ba2a3e6e1df3fcefc8815226c90fcc9c54f218"},
+{file = "microsoft_kiota_serialization_form-1.9.2.tar.gz", hash = "sha256:badfbe65d8ec3369bd58b01022d13ef590edf14babeef94188efe3f4ec24fe41"},
]
[package.dependencies]
-microsoft-kiota-abstractions = ">=1.6.8,<1.7.0"
-pendulum = ">=3.0.0b1"
+microsoft-kiota-abstractions = ">=1.9.2,<1.10.0"
[[package]]
name = "microsoft-kiota-serialization-json"
-version = "1.6.8"
+version = "1.9.2"
description = "Core abstractions for kiota generated libraries in Python"
optional = false
-python-versions = "<4.0,>=3.8"
+python-versions = "<4.0,>=3.9"
files = [
-{file = "microsoft_kiota_serialization_json-1.6.8-py3-none-any.whl", hash = "sha256:2734c2ad64cc089441279e4962f6fedf41af040730f6eab1533890cd5377aff5"},
-{file = "microsoft_kiota_serialization_json-1.6.8.tar.gz", hash = "sha256:89e2dd0eb4eaaa6ab74fa89ab5d84c5a53464e73b85eb7085f0aa4560a2b8183"},
+{file = "microsoft_kiota_serialization_json-1.9.2-py3-none-any.whl", hash = "sha256:8f4ecf485607fff3df5ce8fa9b9c957bc7f4bff1658b183703e180af753098e3"},
+{file = "microsoft_kiota_serialization_json-1.9.2.tar.gz", hash = "sha256:19f7beb69c67b2cb77ca96f77824ee78a693929e20237bb5476ea54f69118bf1"},
]
[package.dependencies]
-microsoft-kiota-abstractions = ">=1.6.8,<1.7.0"
-pendulum = ">=3.0.0b1"
+microsoft-kiota-abstractions = ">=1.9.2,<1.10.0"
[[package]]
name = "microsoft-kiota-serialization-multipart"
-version = "1.6.8"
+version = "1.9.2"
description = "Core abstractions for kiota generated libraries in Python"
optional = false
-python-versions = "<4.0,>=3.8"
+python-versions = "<4.0,>=3.9"
files = [
-{file = "microsoft_kiota_serialization_multipart-1.6.8-py3-none-any.whl", hash = "sha256:1ecdd15dd1f78aed031d7d1828b6fbc00c633542d863c23f96fdd0a61bfb189a"},
-{file = "microsoft_kiota_serialization_multipart-1.6.8.tar.gz", hash = "sha256:3d95c6d7186588af7a1d3aa852ce42077f80487b8b3c60e36fe109a8b4918c03"},
+{file = "microsoft_kiota_serialization_multipart-1.9.2-py3-none-any.whl", hash = "sha256:641ad374046f1c7adff90d110bdc68d77418adb1e479a716f4ffea3647f0ead6"},
+{file = "microsoft_kiota_serialization_multipart-1.9.2.tar.gz", hash = "sha256:b1851409205668d83f5c7a35a8b6fca974b341985b4a92841e95aaec93b7ca0a"},
]
[package.dependencies]
-microsoft-kiota-abstractions = ">=1.6.8,<1.7.0"
-pendulum = ">=3.0.0b1"
+microsoft-kiota-abstractions = ">=1.9.2,<1.10.0"
[[package]]
name = "microsoft-kiota-serialization-text"
-version = "1.6.8"
+version = "1.9.2"
description = "Core abstractions for kiota generated libraries in Python"
optional = false
-python-versions = "<4.0,>=3.8"
+python-versions = "<4.0,>=3.9"
files = [
-{file = "microsoft_kiota_serialization_text-1.6.8-py3-none-any.whl", hash = "sha256:4e5e287a614d362f864b5061dca0861c3f70b8792ec72967d1bff23944da1e80"},
-{file = "microsoft_kiota_serialization_text-1.6.8.tar.gz", hash = "sha256:687d4858337eaf4f351b12ed1c6c934d869560f54ee3855bfdde589660e07208"},
+{file = "microsoft_kiota_serialization_text-1.9.2-py3-none-any.whl", hash = "sha256:6e63129ea29eb9b976f4ed56fc6595d204e29fc309958b639299e9f9f4e5edb4"},
+{file = "microsoft_kiota_serialization_text-1.9.2.tar.gz", hash = "sha256:4289508ebac0cefdc4fa21c545051769a9409913972355ccda9116b647f978f2"},
]
[package.dependencies]
-microsoft-kiota-abstractions = ">=1.6.8,<1.7.0"
-python-dateutil = "2.9.0.post0"
+microsoft-kiota-abstractions = ">=1.9.2,<1.10.0"
[[package]]
name = "msal"
@@ -2689,13 +2684,13 @@ dev = ["bumpver", "isort", "mypy", "pylint", "pytest", "yapf"]
[[package]]
name = "msgraph-sdk"
-version = "1.17.0"
+version = "1.18.0"
description = "The Microsoft Graph Python SDK"
optional = false
python-versions = ">=3.9"
files = [
-{file = "msgraph_sdk-1.17.0-py3-none-any.whl", hash = "sha256:5582a258ded19a486ab407a67b5f65d666758a63864da77bd20c2581d1c00fba"},
-{file = "msgraph_sdk-1.17.0.tar.gz", hash = "sha256:577e41942b0f794b8cf2f54db030bc039a750a81b515dcd0ba1d66fd961fa7bf"},
+{file = "msgraph_sdk-1.18.0-py3-none-any.whl", hash = "sha256:f09b015bb9d7690bc6f30c9a28f9a414107aaf06be4952c27b3653dcdf33f2a3"},
+{file = "msgraph_sdk-1.18.0.tar.gz", hash = "sha256:ef49166ada7b459b5a843ceb3d253c1ab99d8987ebf3112147eb6cbcaa101793"},
]
[package.dependencies]
@@ -3151,105 +3146,6 @@ files = [
{file = "pbr-6.1.0.tar.gz", hash = "sha256:788183e382e3d1d7707db08978239965e8b9e4e5ed42669bf4758186734d5f24"},
]
-[[package]]
-name = "pendulum"
-version = "3.0.0"
-description = "Python datetimes made easy"
-optional = false
-python-versions = ">=3.8"
-files = [
-{file = "pendulum-3.0.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2cf9e53ef11668e07f73190c805dbdf07a1939c3298b78d5a9203a86775d1bfd"},
-{file = "pendulum-3.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:fb551b9b5e6059377889d2d878d940fd0bbb80ae4810543db18e6f77b02c5ef6"},
-{file = "pendulum-3.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c58227ac260d5b01fc1025176d7b31858c9f62595737f350d22124a9a3ad82d"},
-{file = "pendulum-3.0.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:60fb6f415fea93a11c52578eaa10594568a6716602be8430b167eb0d730f3332"},
-{file = "pendulum-3.0.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b69f6b4dbcb86f2c2fe696ba991e67347bcf87fe601362a1aba6431454b46bde"},
-{file = "pendulum-3.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:138afa9c373ee450ede206db5a5e9004fd3011b3c6bbe1e57015395cd076a09f"},
-{file = "pendulum-3.0.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:83d9031f39c6da9677164241fd0d37fbfc9dc8ade7043b5d6d62f56e81af8ad2"},
-{file = "pendulum-3.0.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0c2308af4033fa534f089595bcd40a95a39988ce4059ccd3dc6acb9ef14ca44a"},
-{file = "pendulum-3.0.0-cp310-none-win_amd64.whl", hash = "sha256:9a59637cdb8462bdf2dbcb9d389518c0263799189d773ad5c11db6b13064fa79"},
-{file = "pendulum-3.0.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:3725245c0352c95d6ca297193192020d1b0c0f83d5ee6bb09964edc2b5a2d508"},
-{file = "pendulum-3.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6c035f03a3e565ed132927e2c1b691de0dbf4eb53b02a5a3c5a97e1a64e17bec"},
-{file = "pendulum-3.0.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:597e66e63cbd68dd6d58ac46cb7a92363d2088d37ccde2dae4332ef23e95cd00"},
-{file = "pendulum-3.0.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99a0f8172e19f3f0c0e4ace0ad1595134d5243cf75985dc2233e8f9e8de263ca"},
-{file = "pendulum-3.0.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:77d8839e20f54706aed425bec82a83b4aec74db07f26acd039905d1237a5e1d4"},
-{file = "pendulum-3.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:afde30e8146292b059020fbc8b6f8fd4a60ae7c5e6f0afef937bbb24880bdf01"},
-{file = "pendulum-3.0.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:660434a6fcf6303c4efd36713ca9212c753140107ee169a3fc6c49c4711c2a05"},
-{file = "pendulum-3.0.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:dee9e5a48c6999dc1106eb7eea3e3a50e98a50651b72c08a87ee2154e544b33e"},
-{file = "pendulum-3.0.0-cp311-none-win_amd64.whl", hash = "sha256:d4cdecde90aec2d67cebe4042fd2a87a4441cc02152ed7ed8fb3ebb110b94ec4"},
-{file = "pendulum-3.0.0-cp311-none-win_arm64.whl", hash = "sha256:773c3bc4ddda2dda9f1b9d51fe06762f9200f3293d75c4660c19b2614b991d83"},
-{file = "pendulum-3.0.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:409e64e41418c49f973d43a28afe5df1df4f1dd87c41c7c90f1a63f61ae0f1f7"},
-{file = "pendulum-3.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a38ad2121c5ec7c4c190c7334e789c3b4624798859156b138fcc4d92295835dc"},
-{file = "pendulum-3.0.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fde4d0b2024b9785f66b7f30ed59281bd60d63d9213cda0eb0910ead777f6d37"},
-{file = "pendulum-3.0.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4b2c5675769fb6d4c11238132962939b960fcb365436b6d623c5864287faa319"},
-{file = "pendulum-3.0.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8af95e03e066826f0f4c65811cbee1b3123d4a45a1c3a2b4fc23c4b0dff893b5"},
-{file = "pendulum-3.0.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2165a8f33cb15e06c67070b8afc87a62b85c5a273e3aaa6bc9d15c93a4920d6f"},
-{file = "pendulum-3.0.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:ad5e65b874b5e56bd942546ea7ba9dd1d6a25121db1c517700f1c9de91b28518"},
-{file = "pendulum-3.0.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:17fe4b2c844bbf5f0ece69cfd959fa02957c61317b2161763950d88fed8e13b9"},
-{file = "pendulum-3.0.0-cp312-none-win_amd64.whl", hash = "sha256:78f8f4e7efe5066aca24a7a57511b9c2119f5c2b5eb81c46ff9222ce11e0a7a5"},
-{file = "pendulum-3.0.0-cp312-none-win_arm64.whl", hash = "sha256:28f49d8d1e32aae9c284a90b6bb3873eee15ec6e1d9042edd611b22a94ac462f"},
-{file = "pendulum-3.0.0-cp37-cp37m-macosx_10_12_x86_64.whl", hash = "sha256:d4e2512f4e1a4670284a153b214db9719eb5d14ac55ada5b76cbdb8c5c00399d"},
-{file = "pendulum-3.0.0-cp37-cp37m-macosx_11_0_arm64.whl", hash = "sha256:3d897eb50883cc58d9b92f6405245f84b9286cd2de6e8694cb9ea5cb15195a32"},
-{file = "pendulum-3.0.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e169cc2ca419517f397811bbe4589cf3cd13fca6dc38bb352ba15ea90739ebb"},
-{file = "pendulum-3.0.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f17c3084a4524ebefd9255513692f7e7360e23c8853dc6f10c64cc184e1217ab"},
-{file = "pendulum-3.0.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:826d6e258052715f64d05ae0fc9040c0151e6a87aae7c109ba9a0ed930ce4000"},
-{file = "pendulum-3.0.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2aae97087872ef152a0c40e06100b3665d8cb86b59bc8471ca7c26132fccd0f"},
-{file = "pendulum-3.0.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:ac65eeec2250d03106b5e81284ad47f0d417ca299a45e89ccc69e36130ca8bc7"},
-{file = "pendulum-3.0.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:a5346d08f3f4a6e9e672187faa179c7bf9227897081d7121866358af369f44f9"},
-{file = "pendulum-3.0.0-cp37-none-win_amd64.whl", hash = "sha256:235d64e87946d8f95c796af34818c76e0f88c94d624c268693c85b723b698aa9"},
-{file = "pendulum-3.0.0-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:6a881d9c2a7f85bc9adafcfe671df5207f51f5715ae61f5d838b77a1356e8b7b"},
-{file = "pendulum-3.0.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d7762d2076b9b1cb718a6631ad6c16c23fc3fac76cbb8c454e81e80be98daa34"},
-{file = "pendulum-3.0.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e8e36a8130819d97a479a0e7bf379b66b3b1b520e5dc46bd7eb14634338df8c"},
-{file = "pendulum-3.0.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7dc843253ac373358ffc0711960e2dd5b94ab67530a3e204d85c6e8cb2c5fa10"},
-{file = "pendulum-3.0.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0a78ad3635d609ceb1e97d6aedef6a6a6f93433ddb2312888e668365908c7120"},
-{file = "pendulum-3.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b30a137e9e0d1f751e60e67d11fc67781a572db76b2296f7b4d44554761049d6"},
-{file = "pendulum-3.0.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:c95984037987f4a457bb760455d9ca80467be792236b69d0084f228a8ada0162"},
-{file = "pendulum-3.0.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d29c6e578fe0f893766c0d286adbf0b3c726a4e2341eba0917ec79c50274ec16"},
-{file = "pendulum-3.0.0-cp38-none-win_amd64.whl", hash = "sha256:deaba8e16dbfcb3d7a6b5fabdd5a38b7c982809567479987b9c89572df62e027"},
-{file = "pendulum-3.0.0-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:b11aceea5b20b4b5382962b321dbc354af0defe35daa84e9ff3aae3c230df694"},
-{file = "pendulum-3.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a90d4d504e82ad236afac9adca4d6a19e4865f717034fc69bafb112c320dcc8f"},
-{file = "pendulum-3.0.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:825799c6b66e3734227756fa746cc34b3549c48693325b8b9f823cb7d21b19ac"},
-{file = "pendulum-3.0.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad769e98dc07972e24afe0cff8d365cb6f0ebc7e65620aa1976fcfbcadc4c6f3"},
-{file = "pendulum-3.0.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a6fc26907eb5fb8cc6188cc620bc2075a6c534d981a2f045daa5f79dfe50d512"},
-{file = "pendulum-3.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0c717eab1b6d898c00a3e0fa7781d615b5c5136bbd40abe82be100bb06df7a56"},
-{file = "pendulum-3.0.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:3ddd1d66d1a714ce43acfe337190be055cdc221d911fc886d5a3aae28e14b76d"},
-{file = "pendulum-3.0.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:822172853d7a9cf6da95d7b66a16c7160cb99ae6df55d44373888181d7a06edc"},
-{file = "pendulum-3.0.0-cp39-none-win_amd64.whl", hash = "sha256:840de1b49cf1ec54c225a2a6f4f0784d50bd47f68e41dc005b7f67c7d5b5f3ae"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3b1f74d1e6ffe5d01d6023870e2ce5c2191486928823196f8575dcc786e107b1"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:729e9f93756a2cdfa77d0fc82068346e9731c7e884097160603872686e570f07"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e586acc0b450cd21cbf0db6bae386237011b75260a3adceddc4be15334689a9a"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22e7944ffc1f0099a79ff468ee9630c73f8c7835cd76fdb57ef7320e6a409df4"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:fa30af36bd8e50686846bdace37cf6707bdd044e5cb6e1109acbad3277232e04"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:440215347b11914ae707981b9a57ab9c7b6983ab0babde07063c6ee75c0dc6e7"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:314c4038dc5e6a52991570f50edb2f08c339debdf8cea68ac355b32c4174e820"},
{file = "pendulum-3.0.0-pp37-pypy37_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5acb1d386337415f74f4d1955c4ce8d0201978c162927d07df8eb0692b2d8533"},
{file = "pendulum-3.0.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a789e12fbdefaffb7b8ac67f9d8f22ba17a3050ceaaa635cd1cc4645773a4b1e"},
{file = "pendulum-3.0.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:860aa9b8a888e5913bd70d819306749e5eb488e6b99cd6c47beb701b22bdecf5"},
{file = "pendulum-3.0.0-pp37-pypy37_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:5ebc65ea033ef0281368217fbf59f5cb05b338ac4dd23d60959c7afcd79a60a0"},
{file = "pendulum-3.0.0-pp37-pypy37_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:d9fef18ab0386ef6a9ac7bad7e43ded42c83ff7ad412f950633854f90d59afa8"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-macosx_10_12_x86_64.whl", hash = "sha256:1c134ba2f0571d0b68b83f6972e2307a55a5a849e7dac8505c715c531d2a8795"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:385680812e7e18af200bb9b4a49777418c32422d05ad5a8eb85144c4a285907b"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9eec91cd87c59fb32ec49eb722f375bd58f4be790cae11c1b70fac3ee4f00da0"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4386bffeca23c4b69ad50a36211f75b35a4deb6210bdca112ac3043deb7e494a"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:dfbcf1661d7146d7698da4b86e7f04814221081e9fe154183e34f4c5f5fa3bf8"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:04a1094a5aa1daa34a6b57c865b25f691848c61583fb22722a4df5699f6bf74c"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:5b0ec85b9045bd49dd3a3493a5e7ddfd31c36a2a60da387c419fa04abcaecb23"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:0a15b90129765b705eb2039062a6daf4d22c4e28d1a54fa260892e8c3ae6e157"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:bb8f6d7acd67a67d6fedd361ad2958ff0539445ef51cbe8cd288db4306503cd0"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fd69b15374bef7e4b4440612915315cc42e8575fcda2a3d7586a0d88192d0c88"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dc00f8110db6898360c53c812872662e077eaf9c75515d53ecc65d886eec209a"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:83a44e8b40655d0ba565a5c3d1365d27e3e6778ae2a05b69124db9e471255c4a"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:1a3604e9fbc06b788041b2a8b78f75c243021e0f512447806a6d37ee5214905d"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:92c307ae7accebd06cbae4729f0ba9fa724df5f7d91a0964b1b972a22baa482b"},
{file = "pendulum-3.0.0.tar.gz", hash = "sha256:5d034998dea404ec31fae27af6b22cff1708f830a1ed7353be4d1019bb9f584e"},
]
[package.dependencies]
python-dateutil = ">=2.6"
tzdata = ">=2020.1"
[package.extras]
test = ["time-machine (>=2.6.0)"]
[[package]]
name = "platformdirs"
version = "4.3.6"
@@ -3462,7 +3358,7 @@ files = [
[[package]]
name = "prowler"
version = "5.2.2"
version = "5.3.0"
description = "Prowler is an Open Source security tool to perform AWS, GCP and Azure security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness. It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, FedRAMP, PCI-DSS, GDPR, HIPAA, FFIEC, SOC2, GXP, AWS Well-Architected Framework Security Pillar, AWS Foundational Technical Review (FTR), ENS (Spanish National Security Scheme) and your custom security frameworks."
optional = false
python-versions = ">=3.9,<3.13"
@@ -3490,8 +3386,8 @@ azure-mgmt-security = "7.0.0"
azure-mgmt-sql = "3.0.1"
azure-mgmt-storage = "21.2.1"
azure-mgmt-subscription = "3.1.1"
azure-mgmt-web = "7.3.1"
azure-storage-blob = "12.24.0"
azure-mgmt-web = "8.0.0"
azure-storage-blob = "12.24.1"
boto3 = "1.35.99"
botocore = "1.35.99"
colorama = "0.4.6"
@@ -3499,18 +3395,18 @@ cryptography = "43.0.1"
dash = "2.18.2"
dash-bootstrap-components = "1.6.0"
detect-secrets = "1.5.0"
google-api-python-client = "2.159.0"
google-api-python-client = "2.160.0"
google-auth-httplib2 = ">=0.1,<0.3"
jsonschema = "4.23.0"
kubernetes = "31.0.0"
microsoft-kiota-abstractions = "1.6.8"
msgraph-sdk = "1.17.0"
kubernetes = "32.0.0"
microsoft-kiota-abstractions = "1.9.2"
msgraph-sdk = "1.18.0"
numpy = "2.0.2"
pandas = "2.2.3"
py-ocsf-models = "0.2.0"
py-ocsf-models = "0.3.0"
pydantic = "1.10.18"
python-dateutil = "^2.9.0.post0"
pytz = "2024.2"
pytz = "2025.1"
schema = "0.7.7"
shodan = "1.31.0"
slack-sdk = "3.34.0"
@@ -3520,8 +3416,8 @@ tzlocal = "5.2"
[package.source]
type = "git"
url = "https://github.com/prowler-cloud/prowler.git"
reference = "v5.2"
resolved_reference = "1a5428445aaea28b4a6425f7aa8c76838f46f171"
reference = "v5.3"
resolved_reference = "dfd377f89e034e6be47acc95c3e6666278f8054e"
[[package]]
name = "psutil"
@@ -3585,7 +3481,6 @@ files = [
{file = "psycopg2_binary-2.9.9-cp311-cp311-win32.whl", hash = "sha256:dc4926288b2a3e9fd7b50dc6a1909a13bbdadfc67d93f3374d984e56f885579d"},
{file = "psycopg2_binary-2.9.9-cp311-cp311-win_amd64.whl", hash = "sha256:b76bedd166805480ab069612119ea636f5ab8f8771e640ae103e05a4aae3e417"},
{file = "psycopg2_binary-2.9.9-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:8532fd6e6e2dc57bcb3bc90b079c60de896d2128c5d9d6f24a63875a95a088cf"},
{file = "psycopg2_binary-2.9.9-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:b0605eaed3eb239e87df0d5e3c6489daae3f7388d455d0c0b4df899519c6a38d"},
{file = "psycopg2_binary-2.9.9-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f8544b092a29a6ddd72f3556a9fcf249ec412e10ad28be6a0c0d948924f2212"},
{file = "psycopg2_binary-2.9.9-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2d423c8d8a3c82d08fe8af900ad5b613ce3632a1249fd6a223941d0735fce493"},
{file = "psycopg2_binary-2.9.9-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2e5afae772c00980525f6d6ecf7cbca55676296b580c0e6abb407f15f3706996"},
@@ -3594,8 +3489,6 @@ files = [
{file = "psycopg2_binary-2.9.9-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:cb16c65dcb648d0a43a2521f2f0a2300f40639f6f8c1ecbc662141e4e3e1ee07"},
{file = "psycopg2_binary-2.9.9-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:911dda9c487075abd54e644ccdf5e5c16773470a6a5d3826fda76699410066fb"},
{file = "psycopg2_binary-2.9.9-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:57fede879f08d23c85140a360c6a77709113efd1c993923c59fde17aa27599fe"},
{file = "psycopg2_binary-2.9.9-cp312-cp312-win32.whl", hash = "sha256:64cf30263844fa208851ebb13b0732ce674d8ec6a0c86a4e160495d299ba3c93"},
{file = "psycopg2_binary-2.9.9-cp312-cp312-win_amd64.whl", hash = "sha256:81ff62668af011f9a48787564ab7eded4e9fb17a4a6a74af5ffa6a457400d2ab"},
{file = "psycopg2_binary-2.9.9-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:2293b001e319ab0d869d660a704942c9e2cce19745262a8aba2115ef41a0a42a"},
{file = "psycopg2_binary-2.9.9-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:03ef7df18daf2c4c07e2695e8cfd5ee7f748a1d54d802330985a78d2a5a6dca9"},
{file = "psycopg2_binary-2.9.9-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a602ea5aff39bb9fac6308e9c9d82b9a35c2bf288e184a816002c9fae930b77"},
@@ -3635,13 +3528,13 @@ files = [
[[package]]
name = "py-ocsf-models"
version = "0.2.0"
version = "0.3.0"
description = "This is a Python implementation of the OCSF models. The models are used to represent the data of the OCSF Schema defined in https://schema.ocsf.io/."
optional = false
python-versions = "<3.13,>=3.9"
files = [
{file = "py_ocsf_models-0.2.0-py3-none-any.whl", hash = "sha256:ac75fd21077694b343ebaad3479194db113c274879b114277560ff287d5cd7b5"},
{file = "py_ocsf_models-0.2.0.tar.gz", hash = "sha256:3e12648d05329e6776a0e6b1ffea87a3eb60aa7d8cb2c4afd69e5724f443ce03"},
{file = "py_ocsf_models-0.3.0-py3-none-any.whl", hash = "sha256:3d31e379be5e4271f7faf62dee9c36798559a1f7f98dff142c0e4cfdb35e291c"},
{file = "py_ocsf_models-0.3.0.tar.gz", hash = "sha256:ad46b7d9761b74010f06a894df2d9541989252b7ff738cd5c7edbf4283df2279"},
]
[package.dependencies]
@@ -4069,13 +3962,13 @@ files = [
[[package]]
name = "pytz"
version = "2024.2"
version = "2025.1"
description = "World timezone definitions, modern and historical"
optional = false
python-versions = "*"
files = [
{file = "pytz-2024.2-py2.py3-none-any.whl", hash = "sha256:31c7c1817eb7fae7ca4b8c7ee50c72f93aa2dd863de768e1ef4245d426aa0725"},
{file = "pytz-2024.2.tar.gz", hash = "sha256:2aa355083c50a0f93fa581709deac0c9ad65cca8a9e9beac660adcbd493c798a"},
{file = "pytz-2025.1-py2.py3-none-any.whl", hash = "sha256:89dd22dca55b46eac6eda23b2d72721bf1bdfef212645d81513ef5d03038de57"},
{file = "pytz-2025.1.tar.gz", hash = "sha256:c2db42be2a2518b28e65f9207c4d05e6ff547d1efa4086469ef855e4ab70178e"},
]
[[package]]
@@ -5152,4 +5045,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.0"
python-versions = ">=3.11,<3.13"
content-hash = "4d015c14bb849873c82fd3804c55013c51860689bba99fa095b9896062903357"
content-hash = "f6be3280891827b3162055248a10561638ee42f406f048d7a886cd3532f714a5"


@@ -8,7 +8,7 @@ description = "Prowler's API (Django/DRF)"
license = "Apache-2.0"
name = "prowler-api"
package-mode = false
version = "1.3.2"
version = "1.4.0"
[tool.poetry.dependencies]
celery = {extras = ["pytest"], version = "^5.4.0"}
@@ -27,7 +27,7 @@ drf-nested-routers = "^0.94.1"
drf-spectacular = "0.27.2"
drf-spectacular-jsonapi = "0.5.1"
gunicorn = "23.0.0"
prowler = {git = "https://github.com/prowler-cloud/prowler.git", branch = "v5.2"}
prowler = {git = "https://github.com/prowler-cloud/prowler.git", branch = "v5.3"}
psycopg2-binary = "2.9.9"
pytest-celery = {extras = ["redis"], version = "^1.0.1"}
# Needed for prowler compatibility


@@ -1,4 +1,4 @@
from datetime import date, datetime, timezone
from datetime import date, datetime, timedelta, timezone
from django.conf import settings
from django.db.models import Q
@@ -346,8 +346,14 @@ class FindingFilter(FilterSet):
inserted_at = DateFilter(method="filter_inserted_at", lookup_expr="date")
inserted_at__date = DateFilter(method="filter_inserted_at", lookup_expr="date")
inserted_at__gte = DateFilter(method="filter_inserted_at_gte")
inserted_at__lte = DateFilter(method="filter_inserted_at_lte")
inserted_at__gte = DateFilter(
method="filter_inserted_at_gte",
help_text=f"Maximum date range is {settings.FINDINGS_MAX_DAYS_IN_RANGE} days.",
)
inserted_at__lte = DateFilter(
method="filter_inserted_at_lte",
help_text=f"Maximum date range is {settings.FINDINGS_MAX_DAYS_IN_RANGE} days.",
)
class Meta:
model = Finding
@@ -375,6 +381,52 @@ class FindingFilter(FilterSet):
},
}
def filter_queryset(self, queryset):
if not (self.data.get("scan") or self.data.get("scan__in")) and not (
self.data.get("inserted_at")
or self.data.get("inserted_at__date")
or self.data.get("inserted_at__gte")
or self.data.get("inserted_at__lte")
):
raise ValidationError(
[
{
"detail": "At least one date filter is required: filter[inserted_at], filter[inserted_at.gte], "
"or filter[inserted_at.lte].",
"status": 400,
"source": {"pointer": "/data/attributes/inserted_at"},
"code": "required",
}
]
)
gte_date = (
datetime.strptime(self.data.get("inserted_at__gte"), "%Y-%m-%d").date()
if self.data.get("inserted_at__gte")
else datetime.now(timezone.utc).date()
)
lte_date = (
datetime.strptime(self.data.get("inserted_at__lte"), "%Y-%m-%d").date()
if self.data.get("inserted_at__lte")
else datetime.now(timezone.utc).date()
)
if abs(lte_date - gte_date) > timedelta(
days=settings.FINDINGS_MAX_DAYS_IN_RANGE
):
raise ValidationError(
[
{
"detail": f"The date range cannot exceed {settings.FINDINGS_MAX_DAYS_IN_RANGE} days.",
"status": 400,
"source": {"pointer": "/data/attributes/inserted_at"},
"code": "invalid",
}
]
)
return super().filter_queryset(queryset)
# Convert filter values to UUIDv7 values for use with partitioning
def filter_scan_id(self, queryset, name, value):
try:


@@ -0,0 +1,64 @@
import json
from datetime import datetime, timedelta, timezone
import django.db.models.deletion
from django.db import migrations, models
from django_celery_beat.models import PeriodicTask
from api.db_utils import rls_transaction
from api.models import Scan, StateChoices
def migrate_daily_scheduled_scan_tasks(apps, schema_editor):
for daily_scheduled_scan_task in PeriodicTask.objects.filter(
task="scan-perform-scheduled"
):
task_kwargs = json.loads(daily_scheduled_scan_task.kwargs)
tenant_id = task_kwargs["tenant_id"]
provider_id = task_kwargs["provider_id"]
current_time = datetime.now(timezone.utc)
scheduled_time_today = datetime.combine(
current_time.date(),
daily_scheduled_scan_task.start_time.time(),
tzinfo=timezone.utc,
)
if current_time < scheduled_time_today:
next_scan_date = scheduled_time_today
else:
next_scan_date = scheduled_time_today + timedelta(days=1)
with rls_transaction(tenant_id):
Scan.objects.create(
tenant_id=tenant_id,
name="Daily scheduled scan",
provider_id=provider_id,
trigger=Scan.TriggerChoices.SCHEDULED,
state=StateChoices.SCHEDULED,
scheduled_at=next_scan_date,
scheduler_task_id=daily_scheduled_scan_task.id,
)
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0007_scan_and_scan_summaries_indexes"),
("django_celery_beat", "0019_alter_periodictasks_options"),
]
operations = [
migrations.AddField(
model_name="scan",
name="scheduler_task",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.CASCADE,
to="django_celery_beat.periodictask",
),
),
migrations.RunPython(migrate_daily_scheduled_scan_tasks),
]

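The `next_scan_date` computation in the migration above is a small pure function: if today's scheduled slot is still in the future, use it; otherwise roll to tomorrow. A sketch (in the real migration the time comes from each `PeriodicTask.start_time`):

```python
from datetime import datetime, time, timedelta, timezone


def next_scan_date(current_time: datetime, scheduled_time: time) -> datetime:
    """Return today's scheduled slot if it has not passed yet, else tomorrow's."""
    scheduled_time_today = datetime.combine(
        current_time.date(), scheduled_time, tzinfo=timezone.utc
    )
    if current_time < scheduled_time_today:
        return scheduled_time_today
    return scheduled_time_today + timedelta(days=1)
```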

@@ -0,0 +1,22 @@
# Generated by Django 5.1.5 on 2025-02-07 09:42
import django.core.validators
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0008_daily_scheduled_tasks_update"),
]
operations = [
migrations.AlterField(
model_name="provider",
name="uid",
field=models.CharField(
max_length=250,
validators=[django.core.validators.MinLengthValidator(3)],
verbose_name="Unique identifier for the provider, set by the provider",
),
),
]


@@ -11,6 +11,7 @@ from django.core.validators import MinLengthValidator
from django.db import models
from django.db.models import Q
from django.utils.translation import gettext_lazy as _
from django_celery_beat.models import PeriodicTask
from django_celery_results.models import TaskResult
from psqlextra.manager import PostgresManager
from psqlextra.models import PostgresPartitionedModel
@@ -226,13 +227,13 @@ class Provider(RowLevelSecurityProtectedModel):
@staticmethod
def validate_kubernetes_uid(value):
if not re.match(
r"(^[a-z0-9]([-a-z0-9]{1,61}[a-z0-9])?$)|(^arn:aws(-cn|-us-gov|-iso|-iso-b)?:[a-zA-Z0-9\-]+:([a-z]{2}-[a-z]+-\d{1})?:(\d{12})?:[a-zA-Z0-9\-_\/:\.\*]+(:\d+)?$)",
r"^[a-z0-9][A-Za-z0-9_.:\/-]{1,250}$",
value,
):
raise ModelValidationError(
detail="The value must either be a valid Kubernetes UID (up to 63 characters, "
"starting and ending with a lowercase letter or number, containing only "
"lowercase alphanumeric characters and hyphens) or a valid EKS ARN.",
"lowercase alphanumeric characters and hyphens) or a valid AWS EKS Cluster ARN, GCP GKE Context Name or Azure AKS Cluster Name.",
code="kubernetes-uid",
pointer="/data/attributes/uid",
)
@@ -246,7 +247,7 @@ class Provider(RowLevelSecurityProtectedModel):
)
uid = models.CharField(
"Unique identifier for the provider, set by the provider",
max_length=63,
max_length=250,
blank=False,
validators=[MinLengthValidator(3)],
)
@@ -410,6 +411,9 @@ class Scan(RowLevelSecurityProtectedModel):
started_at = models.DateTimeField(null=True, blank=True)
completed_at = models.DateTimeField(null=True, blank=True)
next_scan_at = models.DateTimeField(null=True, blank=True)
scheduler_task = models.ForeignKey(
PeriodicTask, on_delete=models.CASCADE, null=True, blank=True
)
# TODO: mutelist foreign key
class Meta(RowLevelSecurityProtectedModel.Meta):

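The relaxed UID validation can be exercised directly against the new pattern. The sample UIDs below are the ones added to the provider test fixtures in this release; the helper name is illustrative:

```python
import re

# New, looser pattern from Provider.validate_kubernetes_uid: a lowercase
# alphanumeric first character followed by 1-250 URL-ish characters, which
# also accepts EKS cluster ARNs and GKE context names.
KUBERNETES_UID_PATTERN = r"^[a-z0-9][A-Za-z0-9_.:\/-]{1,250}$"


def is_valid_kubernetes_uid(value: str) -> bool:
    return re.match(KUBERNETES_UID_PATTERN, value) is not None
```

For example, `kubernetes-test-123456789`, the EKS ARN, and the GKE context name from the fixtures all pass, while values starting with a hyphen or uppercase letter are rejected.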

@@ -1,7 +1,7 @@
openapi: 3.0.3
info:
title: Prowler API
version: 1.3.2
version: 1.4.0
description: |-
Prowler API specification.
@@ -346,6 +346,9 @@ paths:
schema:
type: string
format: date
description: At least one of the variations of the `filter[inserted_at]` filter
must be provided.
required: true
- in: query
name: filter[inserted_at__date]
schema:
@@ -356,11 +359,13 @@ paths:
schema:
type: string
format: date
description: Maximum date range is 7 days.
- in: query
name: filter[inserted_at__lte]
schema:
type: string
format: date
description: Maximum date range is 7 days.
- in: query
name: filter[provider]
schema:
@@ -861,11 +866,13 @@ paths:
schema:
type: string
format: date
description: Maximum date range is 7 days.
- in: query
name: filter[inserted_at__lte]
schema:
type: string
format: date
description: Maximum date range is 7 days.
- in: query
name: filter[provider]
schema:
@@ -1275,6 +1282,9 @@ paths:
schema:
type: string
format: date
description: At least one of the variations of the `filter[inserted_at]` filter
must be provided.
required: true
- in: query
name: filter[inserted_at__date]
schema:
@@ -1285,11 +1295,13 @@ paths:
schema:
type: string
format: date
description: Maximum date range is 7 days.
- in: query
name: filter[inserted_at__lte]
schema:
type: string
format: date
description: Maximum date range is 7 days.
- in: query
name: filter[provider]
schema:
@@ -7665,7 +7677,7 @@ components:
uid:
type: string
title: Unique identifier for the provider, set by the provider
maxLength: 63
maxLength: 250
minLength: 3
alias:
type: string
@@ -7777,7 +7789,7 @@ components:
uid:
type: string
title: Unique identifier for the provider, set by the provider
maxLength: 63
maxLength: 250
minLength: 3
required:
- uid
@@ -7821,7 +7833,7 @@ components:
type: string
minLength: 3
title: Unique identifier for the provider, set by the provider
maxLength: 63
maxLength: 250
required:
- uid
required:


@@ -5,6 +5,7 @@ from unittest.mock import ANY, Mock, patch
import jwt
import pytest
from conftest import API_JSON_CONTENT_TYPE, TEST_PASSWORD, TEST_USER
from django.conf import settings
from django.urls import reverse
from rest_framework import status
@@ -27,6 +28,12 @@ from api.rls import Tenant
TODAY = str(datetime.today().date())
def today_after_n_days(n_days: int) -> str:
return datetime.strftime(
datetime.today().date() + timedelta(days=n_days), "%Y-%m-%d"
)
@pytest.mark.django_db
class TestUserViewSet:
def test_users_list(self, authenticated_client, create_test_user):
@@ -878,6 +885,16 @@ class TestProviderViewSet:
"uid": "kubernetes-test-123456789",
"alias": "test",
},
{
"provider": "kubernetes",
"uid": "arn:aws:eks:us-east-1:111122223333:cluster/test-cluster-long-name-123456789",
"alias": "EKS",
},
{
"provider": "kubernetes",
"uid": "gke_aaaa-dev_europe-test1_dev-aaaa-test-cluster-long-name-123456789",
"alias": "GKE",
},
{
"provider": "azure",
"uid": "8851db6b-42e5-4533-aa9e-30a32d67e875",
@@ -2379,12 +2396,33 @@ class TestResourceViewSet:
@pytest.mark.django_db
class TestFindingViewSet:
def test_findings_list_none(self, authenticated_client):
response = authenticated_client.get(reverse("finding-list"))
response = authenticated_client.get(
reverse("finding-list"), {"filter[inserted_at]": TODAY}
)
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == 0
def test_findings_list(self, authenticated_client, findings_fixture):
def test_findings_list_no_date_filter(self, authenticated_client):
response = authenticated_client.get(reverse("finding-list"))
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert response.json()["errors"][0]["code"] == "required"
def test_findings_date_range_too_large(self, authenticated_client):
response = authenticated_client.get(
reverse("finding-list"),
{
"filter[inserted_at.lte]": today_after_n_days(
-(settings.FINDINGS_MAX_DAYS_IN_RANGE + 1)
),
},
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert response.json()["errors"][0]["code"] == "invalid"
def test_findings_list(self, authenticated_client, findings_fixture):
response = authenticated_client.get(
reverse("finding-list"), {"filter[inserted_at]": TODAY}
)
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == len(findings_fixture)
assert (
@@ -2404,7 +2442,8 @@ class TestFindingViewSet:
self, include_values, expected_resources, authenticated_client, findings_fixture
):
response = authenticated_client.get(
reverse("finding-list"), {"include": include_values}
reverse("finding-list"),
{"include": include_values, "filter[inserted_at]": TODAY},
)
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == len(findings_fixture)
@@ -2439,13 +2478,13 @@ class TestFindingViewSet:
("service.icontains", "ec", 1),
("inserted_at", "2024-01-01", 0),
("inserted_at.date", "2024-01-01", 0),
("inserted_at.gte", "2024-01-01", 2),
("inserted_at.gte", today_after_n_days(-1), 2),
(
"inserted_at.lte",
"2028-12-31",
today_after_n_days(1),
2,
), # TODO: fix the reference time before the fixtures are inserted so this value never needs manual updates and the tests always pass
("updated_at.lte", "2024-01-01", 0),
),
("updated_at.lte", today_after_n_days(-1), 0),
("resource_type.icontains", "prowler", 2),
# full text search on finding
("search", "dev-qa", 1),
@@ -2475,9 +2514,13 @@ class TestFindingViewSet:
filter_value,
expected_count,
):
filters = {f"filter[{filter_name}]": filter_value}
if "inserted_at" not in filter_name:
filters["filter[inserted_at]"] = TODAY
response = authenticated_client.get(
reverse("finding-list"),
{f"filter[{filter_name}]": filter_value},
filters,
)
assert response.status_code == status.HTTP_200_OK
@@ -2486,9 +2529,7 @@ class TestFindingViewSet:
def test_finding_filter_by_scan_id(self, authenticated_client, findings_fixture):
response = authenticated_client.get(
reverse("finding-list"),
{
"filter[scan]": findings_fixture[0].scan.id,
},
{"filter[scan]": findings_fixture[0].scan.id},
)
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == 2
@@ -2511,6 +2552,7 @@ class TestFindingViewSet:
reverse("finding-list"),
{
"filter[provider]": findings_fixture[0].scan.provider.id,
"filter[inserted_at]": TODAY,
},
)
assert response.status_code == status.HTTP_200_OK
@@ -2525,7 +2567,8 @@ class TestFindingViewSet:
"filter[provider.in]": [
findings_fixture[0].scan.provider.id,
findings_fixture[1].scan.provider.id,
]
],
"filter[inserted_at]": TODAY,
},
)
assert response.status_code == status.HTTP_200_OK
@@ -2559,13 +2602,13 @@ class TestFindingViewSet:
)
def test_findings_sort(self, authenticated_client, sort_field):
response = authenticated_client.get(
reverse("finding-list"), {"sort": sort_field}
reverse("finding-list"), {"sort": sort_field, "filter[inserted_at]": TODAY}
)
assert response.status_code == status.HTTP_200_OK
def test_findings_sort_invalid(self, authenticated_client):
response = authenticated_client.get(
reverse("finding-list"), {"sort": "invalid"}
reverse("finding-list"), {"sort": "invalid", "filter[inserted_at]": TODAY}
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert response.json()["errors"][0]["code"] == "invalid"
@@ -2633,7 +2676,7 @@ class TestFindingViewSet:
reverse("finding-metadata"),
{
"filter[severity__in]": ["low", "medium"],
"filter[inserted_at]": finding_1.updated_at.strftime("%Y-%m-%d"),
"filter[inserted_at]": finding_1.inserted_at.strftime("%Y-%m-%d"),
},
)
data = response.json()


@@ -1,4 +1,3 @@
from django.conf import settings
from django.urls import include, path
from drf_spectacular.views import SpectacularRedocView
from rest_framework_nested import routers
@@ -113,6 +112,3 @@ urlpatterns = [
path("schema", SchemaView.as_view(), name="schema"),
path("docs", SpectacularRedocView.as_view(url_name="schema"), name="docs"),
]
if settings.DEBUG:
urlpatterns += [path("silk/", include("silk.urls", namespace="silk"))]


@@ -193,7 +193,7 @@ class SchemaView(SpectacularAPIView):
def get(self, request, *args, **kwargs):
spectacular_settings.TITLE = "Prowler API"
spectacular_settings.VERSION = "1.3.2"
spectacular_settings.VERSION = "1.4.0"
spectacular_settings.DESCRIPTION = (
"Prowler API specification.\n\nThis file is auto-generated."
)
@@ -1271,6 +1271,14 @@ class ResourceViewSet(BaseRLSViewSet):
tags=["Finding"],
summary="List all findings",
description="Retrieve a list of all findings with options for filtering by various criteria.",
parameters=[
OpenApiParameter(
name="filter[inserted_at]",
description="At least one of the variations of the `filter[inserted_at]` filter must be provided.",
required=True,
type=OpenApiTypes.DATE,
)
],
),
retrieve=extend_schema(
tags=["Finding"],
@@ -1288,6 +1296,14 @@ class ResourceViewSet(BaseRLSViewSet):
tags=["Finding"],
summary="Retrieve metadata values from findings",
description="Fetch unique metadata values from a set of findings. This is useful for dynamic filtering.",
parameters=[
OpenApiParameter(
name="filter[inserted_at]",
description="At least one of the variations of the `filter[inserted_at]` filter must be provided.",
required=True,
type=OpenApiTypes.DATE,
)
],
filters=True,
),
)
@@ -1364,6 +1380,12 @@ class FindingViewSet(BaseRLSViewSet):
return queryset
def filter_queryset(self, queryset):
# Do not apply filters when retrieving specific finding
if self.action == "retrieve":
return queryset
return super().filter_queryset(queryset)
def inserted_at_to_uuidv7(self, inserted_at):
if inserted_at is None:
return None
@@ -1396,50 +1418,26 @@ class FindingViewSet(BaseRLSViewSet):
queryset = self.get_queryset()
filtered_queryset = self.filter_queryset(queryset)
relevant_resources = Resource.objects.filter(
tenant_id=tenant_id, findings__in=filtered_queryset
).distinct()
filtered_ids = filtered_queryset.order_by().values("id")
services = (
relevant_resources.values_list("service", flat=True)
.distinct()
.order_by("service")
relevant_resources = Resource.all_objects.filter(
tenant_id=tenant_id, findings__id__in=Subquery(filtered_ids)
).only("service", "region", "type")
aggregation = relevant_resources.aggregate(
services=ArrayAgg("service", flat=True),
regions=ArrayAgg("region", flat=True),
resource_types=ArrayAgg("type", flat=True),
)
regions = (
relevant_resources.exclude(region="")
.values_list("region", flat=True)
.distinct()
.order_by("region")
)
resource_types = (
relevant_resources.values_list("type", flat=True)
.distinct()
.order_by("type")
)
# Temporarily disabled until we implement tag filtering in the UI
# tag_data = (
# relevant_resources
# .filter(tags__key__isnull=False, tags__value__isnull=False)
# .exclude(tags__key="")
# .exclude(tags__value="")
# .values("tags__key", "tags__value")
# .distinct()
# .order_by("tags__key", "tags__value")
# )
#
# tags_dict = {}
# for row in tag_data:
# k, v = row["tags__key"], row["tags__value"]
# tags_dict.setdefault(k, []).append(v)
services = sorted(set(aggregation["services"] or []))
regions = sorted({region for region in aggregation["regions"] or [] if region})
resource_types = sorted(set(aggregation["resource_types"] or []))
result = {
"services": list(services),
"regions": list(regions),
"resource_types": list(resource_types),
# "tags": tags_dict
"services": services,
"regions": regions,
"resource_types": resource_types,
}
serializer = self.get_serializer(data=result)

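After the `ArrayAgg` refactor of the metadata endpoint above, the remaining Python work is just normalizing the aggregated lists: dedupe, sort, and drop empty regions. A standalone sketch with a hypothetical helper name:

```python
def build_metadata(aggregation: dict) -> dict:
    """Normalize ArrayAgg output: dedupe, sort, and drop empty regions."""
    # ArrayAgg yields None for an empty queryset and may contain duplicates,
    # so guard each list with `or []` before deduplicating.
    services = sorted(set(aggregation.get("services") or []))
    regions = sorted({region for region in aggregation.get("regions") or [] if region})
    resource_types = sorted(set(aggregation.get("resource_types") or []))
    return {
        "services": services,
        "regions": regions,
        "resource_types": resource_types,
    }
```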

@@ -207,3 +207,4 @@ CACHE_STALE_WHILE_REVALIDATE = env.int("DJANGO_STALE_WHILE_REVALIDATE", 60)
TESTING = False
FINDINGS_MAX_DAYS_IN_RANGE = env.int("DJANGO_FINDINGS_MAX_DAYS_IN_RANGE", 7)


@@ -37,9 +37,3 @@ REST_FRAMEWORK["DEFAULT_FILTER_BACKENDS"] = tuple( # noqa: F405
) + ("api.filters.CustomDjangoFilterBackend",)
SECRETS_ENCRYPTION_KEY = "ZMiYVo7m4Fbe2eXXPyrwxdJss2WSalXSv3xHBcJkPl0="
MIDDLEWARE += [ # noqa: F405
"silk.middleware.SilkyMiddleware",
]
INSTALLED_APPS += ["silk"] # noqa: F405


@@ -5,10 +5,14 @@ from django_celery_beat.models import IntervalSchedule, PeriodicTask
from rest_framework_json_api.serializers import ValidationError
from tasks.tasks import perform_scheduled_scan_task
from api.models import Provider
from api.db_utils import rls_transaction
from api.models import Provider, Scan, StateChoices
def schedule_provider_scan(provider_instance: Provider):
tenant_id = str(provider_instance.tenant_id)
provider_id = str(provider_instance.id)
schedule, _ = IntervalSchedule.objects.get_or_create(
every=24,
period=IntervalSchedule.HOURS,
@@ -17,23 +21,9 @@ def schedule_provider_scan(provider_instance: Provider):
# Create a unique name for the periodic task
task_name = f"scan-perform-scheduled-{provider_instance.id}"
# Schedule the task
_, created = PeriodicTask.objects.get_or_create(
interval=schedule,
name=task_name,
task="scan-perform-scheduled",
kwargs=json.dumps(
{
"tenant_id": str(provider_instance.tenant_id),
"provider_id": str(provider_instance.id),
}
),
one_off=False,
defaults={
"start_time": datetime.now(timezone.utc) + timedelta(hours=24),
},
)
if not created:
if PeriodicTask.objects.filter(
interval=schedule, name=task_name, task="scan-perform-scheduled"
).exists():
raise ValidationError(
[
{
@@ -45,9 +35,36 @@ def schedule_provider_scan(provider_instance: Provider):
]
)
with rls_transaction(tenant_id):
scheduled_scan = Scan.objects.create(
tenant_id=tenant_id,
name="Daily scheduled scan",
provider_id=provider_id,
trigger=Scan.TriggerChoices.SCHEDULED,
state=StateChoices.AVAILABLE,
scheduled_at=datetime.now(timezone.utc),
)
# Schedule the task
periodic_task_instance = PeriodicTask.objects.create(
interval=schedule,
name=task_name,
task="scan-perform-scheduled",
kwargs=json.dumps(
{
"tenant_id": tenant_id,
"provider_id": provider_id,
}
),
one_off=False,
start_time=datetime.now(timezone.utc) + timedelta(hours=24),
)
scheduled_scan.scheduler_task_id = periodic_task_instance.id
scheduled_scan.save()
return perform_scheduled_scan_task.apply_async(
kwargs={
"tenant_id": str(provider_instance.tenant_id),
"provider_id": str(provider_instance.id),
"provider_id": provider_id,
},
)

View File

@@ -245,8 +245,11 @@ def perform_prowler_scan(
status = FindingStatus[finding.status]
delta = _create_finding_delta(last_status, status)
# For the findings prior to the change, when a first finding is found with delta!="new" it will be assigned a current date as first_seen_at and the successive findings with the same UID will always get the date of the previous finding.
# For new findings, when a finding (delta="new") is found for the first time, the first_seen_at attribute will be assigned the current date, the following findings will get that date.
# For the findings prior to the change, when a first finding is found with delta!="new" it will be
# assigned a current date as first_seen_at and the successive findings with the same UID will
# always get the date of the previous finding.
# For new findings, when a finding (delta="new") is found for the first time, the first_seen_at
# attribute will be assigned the current date, the following findings will get that date.
if not last_first_seen_at:
last_first_seen_at = datetime.now(tz=timezone.utc)

View File

@@ -1,15 +1,14 @@
from datetime import datetime, timedelta, timezone
from celery import shared_task
from config.celery import RLSTask
from django_celery_beat.models import PeriodicTask
from tasks.jobs.connection import check_provider_connection
from tasks.jobs.deletion import delete_provider, delete_tenant
from tasks.jobs.scan import aggregate_findings, perform_prowler_scan
from tasks.utils import get_next_execution_datetime
from api.db_utils import rls_transaction
from api.decorators import set_tenant
from api.models import Provider, Scan
from api.models import Scan, StateChoices
@shared_task(base=RLSTask, name="provider-connection-check")
@@ -100,28 +99,42 @@ def perform_scheduled_scan_task(self, tenant_id: str, provider_id: str):
task_id = self.request.id
with rls_transaction(tenant_id):
provider_instance = Provider.objects.get(pk=provider_id)
periodic_task_instance = PeriodicTask.objects.get(
name=f"scan-perform-scheduled-{provider_id}"
)
next_scan_date = datetime.combine(
datetime.now(timezone.utc), periodic_task_instance.start_time.time()
) + timedelta(hours=24)
scan_instance = Scan.objects.create(
next_scan_datetime = get_next_execution_datetime(task_id, provider_id)
scan_instance, _ = Scan.objects.get_or_create(
tenant_id=tenant_id,
name="Daily scheduled scan",
provider=provider_instance,
provider_id=provider_id,
trigger=Scan.TriggerChoices.SCHEDULED,
next_scan_at=next_scan_date,
task_id=task_id,
state__in=(StateChoices.SCHEDULED, StateChoices.AVAILABLE),
scheduler_task_id=periodic_task_instance.id,
defaults={"state": StateChoices.SCHEDULED},
)
result = perform_prowler_scan(
tenant_id=tenant_id,
scan_id=str(scan_instance.id),
provider_id=provider_id,
)
scan_instance.task_id = task_id
scan_instance.save()
try:
result = perform_prowler_scan(
tenant_id=tenant_id,
scan_id=str(scan_instance.id),
provider_id=provider_id,
)
except Exception as e:
raise e
finally:
with rls_transaction(tenant_id):
Scan.objects.get_or_create(
tenant_id=tenant_id,
name="Daily scheduled scan",
provider_id=provider_id,
trigger=Scan.TriggerChoices.SCHEDULED,
state=StateChoices.SCHEDULED,
scheduled_at=next_scan_datetime,
scheduler_task_id=periodic_task_instance.id,
)
perform_scan_summary_task.apply_async(
kwargs={
"tenant_id": tenant_id,

View File

@@ -6,6 +6,8 @@ from django_celery_beat.models import IntervalSchedule, PeriodicTask
from rest_framework_json_api.serializers import ValidationError
from tasks.beat import schedule_provider_scan
from api.models import Scan
@pytest.mark.django_db
class TestScheduleProviderScan:
@@ -15,9 +17,11 @@ class TestScheduleProviderScan:
with patch(
"tasks.tasks.perform_scheduled_scan_task.apply_async"
) as mock_apply_async:
assert Scan.all_objects.count() == 0
result = schedule_provider_scan(provider_instance)
assert result is not None
assert Scan.all_objects.count() == 1
mock_apply_async.assert_called_once_with(
kwargs={

View File

@@ -0,0 +1,76 @@
from datetime import datetime, timedelta, timezone
from unittest.mock import patch
import pytest
from django_celery_beat.models import IntervalSchedule, PeriodicTask
from django_celery_results.models import TaskResult
from tasks.utils import get_next_execution_datetime
@pytest.mark.django_db
class TestGetNextExecutionDatetime:
@pytest.fixture
def setup_periodic_task(self, db):
# Create a periodic task with an hourly interval
interval = IntervalSchedule.objects.create(
every=1, period=IntervalSchedule.HOURS
)
periodic_task = PeriodicTask.objects.create(
name="scan-perform-scheduled-123",
task="scan-perform-scheduled",
interval=interval,
)
return periodic_task
@pytest.fixture
def setup_task_result(self, db):
# Create a task result record
task_result = TaskResult.objects.create(
task_id="abc123",
task_name="scan-perform-scheduled",
status="SUCCESS",
date_created=datetime.now(timezone.utc) - timedelta(hours=1),
result="Success",
)
return task_result
def test_get_next_execution_datetime_success(
self, setup_task_result, setup_periodic_task
):
task_result = setup_task_result
periodic_task = setup_periodic_task
# Mock periodic_task_name on TaskResult
with patch.object(
TaskResult, "periodic_task_name", return_value=periodic_task.name
):
next_execution = get_next_execution_datetime(
task_id=task_result.task_id, provider_id="123"
)
expected_time = task_result.date_created + timedelta(hours=1)
assert next_execution == expected_time
def test_get_next_execution_datetime_fallback_to_provider_id(
self, setup_task_result, setup_periodic_task
):
task_result = setup_task_result
# Simulate the case where `periodic_task_name` is missing
with patch.object(TaskResult, "periodic_task_name", return_value=None):
next_execution = get_next_execution_datetime(
task_id=task_result.task_id, provider_id="123"
)
expected_time = task_result.date_created + timedelta(hours=1)
assert next_execution == expected_time
def test_get_next_execution_datetime_periodic_task_does_not_exist(
self, setup_task_result
):
task_result = setup_task_result
with pytest.raises(PeriodicTask.DoesNotExist):
get_next_execution_datetime(
task_id=task_result.task_id, provider_id="nonexistent"
)

View File

@@ -0,0 +1,26 @@
from datetime import datetime, timedelta, timezone
from django_celery_beat.models import PeriodicTask
from django_celery_results.models import TaskResult
def get_next_execution_datetime(task_id: int, provider_id: str) -> datetime:
task_instance = TaskResult.objects.get(task_id=task_id)
try:
periodic_task_instance = PeriodicTask.objects.get(
name=task_instance.periodic_task_name
)
except PeriodicTask.DoesNotExist:
periodic_task_instance = PeriodicTask.objects.get(
name=f"scan-perform-scheduled-{provider_id}"
)
interval = periodic_task_instance.interval
current_scheduled_time = datetime.combine(
datetime.now(timezone.utc).date(),
task_instance.date_created.time(),
tzinfo=timezone.utc,
)
return current_scheduled_time + timedelta(**{interval.period: interval.every})
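The `timedelta(**{interval.period: interval.every})` expression works because `IntervalSchedule.period` values such as `"hours"` or `"days"` match `timedelta` keyword arguments. A standalone sketch of that arithmetic, with the schedule values hard-coded as an assumption:

```python
from datetime import datetime, timedelta, timezone

# Assumed values mimicking an IntervalSchedule of "every 24 hours"
period, every = "hours", 24

scheduled_time = datetime(2025, 2, 10, 3, 30, tzinfo=timezone.utc)
# Keyword expansion turns the pair into timedelta(hours=24)
next_execution = scheduled_time + timedelta(**{period: every})

assert next_execution == datetime(2025, 2, 11, 3, 30, tzinfo=timezone.utc)
```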

View File

@@ -532,8 +532,8 @@ def get_bar_graph(df, column_name):
# Cut the text if it is too long
for i in range(len(colums)):
if len(colums[i]) > 15:
colums[i] = colums[i][:15] + "..."
if len(colums[i]) > 43:
colums[i] = colums[i][:43] + "..."
fig = px.bar(
df,

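The widened truncation limit above can be factored into a small helper; a sketch of the same logic (the helper name is ours, not the dashboard's):

```python
def truncate_label(label: str, limit: int = 43) -> str:
    """Shorten long bar-graph labels, appending an ellipsis marker."""
    return label if len(label) <= limit else label[:limit] + "..."

assert truncate_label("short") == "short"
assert truncate_label("x" * 50) == "x" * 43 + "..."
```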
View File


@@ -1720,7 +1720,7 @@ def generate_table(data, index, color_mapping_severity, color_mapping_status):
[
html.P(
html.Strong(
"Recomendation: ",
"Recommendation: ",
style={
"margin-right": "5px"
},
@@ -1744,7 +1744,7 @@ def generate_table(data, index, color_mapping_severity, color_mapping_status):
[
html.P(
html.Strong(
"RecomendationUrl: ",
"RecommendationUrl: ",
style={
"margin-right": "5px"
},

View File

@@ -35,6 +35,9 @@ services:
required: false
ports:
- 3000:3000
volumes:
- "./ui:/app"
- "/app/node_modules"
postgres:
image: postgres:16.3-alpine3.20

View File

@@ -175,6 +175,7 @@ Due to the complexity and differences of each provider use the rest of the provi
- [GCP](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/gcp/gcp_provider.py)
- [Azure](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/azure/azure_provider.py)
- [Kubernetes](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/kubernetes/kubernetes_provider.py)
- [Microsoft365](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/microsoft365/microsoft365_provider.py)
To facilitate understanding, here is pseudocode showing how the most basic provider could look, with examples.

View File

@@ -237,3 +237,4 @@ It is really important to check if the current Prowler's permissions for each pr
- AWS: https://docs.prowler.cloud/en/latest/getting-started/requirements/#aws-authentication
- Azure: https://docs.prowler.cloud/en/latest/getting-started/requirements/#permissions
- GCP: https://docs.prowler.cloud/en/latest/getting-started/requirements/#gcp-authentication
- Microsoft365: https://docs.prowler.cloud/en/latest/getting-started/requirements/#microsoft365-authentication

View File

@@ -102,3 +102,32 @@ Those credentials must be associated to a user or service account with proper pe
???+ note
By default, `prowler` will scan all accessible GCP Projects, use flag `--project-ids` to specify the projects to be scanned.
## Microsoft365
Prowler for Microsoft365 currently supports the following authentication types:
- [Service principal application](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser#service-principal-object) (recommended).
- Currently stored az cli credentials.
- Interactive browser authentication.
???+ warning
For the Prowler App, only the Service Principal application authentication method is supported.
### Service Principal authentication
To allow Prowler to assume the service principal identity and start the scan, the following environment variables must be configured:
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
```
If you try to execute Prowler with the `--sp-env-auth` flag and those variables are empty or not exported, the execution will fail.
Follow the instructions in the [Create Prowler Service Principal](../tutorials/azure/create-prowler-service-principal.md) section to create a service principal.
### Interactive Browser authentication
To use `--browser-auth`, the user needs to authenticate against Azure using the default browser to start the scan; the `--tenant-id` flag is also required.

View File

@@ -164,7 +164,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/),
* `Python >= 3.9`
* `Python pip >= 21.0.0`
* AWS, GCP, Azure and/or Kubernetes credentials
* AWS, GCP, Azure, Microsoft365 and/or Kubernetes credentials
_Commands_:
@@ -377,6 +377,19 @@ Go to [http://localhost:3000](http://localhost:3000) after installing the app (s
<img src="img/sign-up-button.png" alt="Sign Up Button" width="320"/>
<img src="img/sign-up.png" alt="Sign Up" width="285"/>
???+ note "User creation and default tenant behavior"
When creating a new user, the behavior depends on whether an invitation is provided:
- **Without an invitation**:
- A new tenant is automatically created.
- The new user is assigned to this tenant.
- A set of **RBAC admin permissions** is generated and assigned to the user for the newly created tenant.
- **With an invitation**: The user is added to the specified tenant with the permissions defined in the invitation.
This mechanism ensures that the first user in a newly created tenant has administrative permissions within that tenant.
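The signup behavior described above can be summarized in a short sketch; `sign_up` and all names below are illustrative only, not the Prowler API's actual code:

```python
# Illustrative sketch of the documented signup flow (names are hypothetical)
def sign_up(email: str, invitation=None) -> dict:
    if invitation is None:
        # No invitation: create a fresh tenant and make the user its admin
        tenant = {"name": f"{email}'s tenant"}
        role = "admin"  # RBAC admin permissions for the newly created tenant
    else:
        # Invitation: join the inviting tenant with the invited permissions
        tenant = invitation["tenant"]
        role = invitation["role"]
    return {"email": email, "tenant": tenant, "role": role}

user = sign_up("new@example.com")
assert user["role"] == "admin"  # first user of a new tenant is an admin
```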
#### **Log In**
Log in with your email and password to start using the Prowler App.
@@ -404,7 +417,7 @@ While the scan is running, start exploring the findings in these sections:
### Prowler CLI
To run Prowler, you will need to specify the provider (e.g `aws`, `gcp`, `azure` or `kubernetes`):
To run Prowler, you will need to specify the provider (e.g. `aws`, `gcp`, `azure`, `microsoft365` or `kubernetes`):
???+ note
If no provider is specified, AWS will be used for backward compatibility with most v2 options.
@@ -535,6 +548,7 @@ prowler kubernetes --kubeconfig-file path
For in-cluster execution, you can use the supplied yaml to run Prowler as a job within a new Prowler namespace:
```console
kubectl apply -f kubernetes/prowler-sa.yaml
kubectl apply -f kubernetes/job.yaml
kubectl apply -f kubernetes/prowler-role.yaml
kubectl apply -f kubernetes/prowler-rolebinding.yaml
@@ -545,5 +559,23 @@ kubectl logs prowler-XXXXX --namespace prowler-ns
???+ note
By default, `prowler` will scan all namespaces in your active Kubernetes context. Use the flag `--context` to specify the context to be scanned and `--namespaces` to specify the namespaces to be scanned.
#### Microsoft365
With Microsoft365 you need to specify which authentication method to use:
```console
# To use service principal authentication
prowler microsoft365 --sp-env-auth
# To use az cli authentication
prowler microsoft365 --az-cli-auth
# To use browser authentication
prowler microsoft365 --browser-auth --tenant-id "XXXXXXXX"
```
See more details about Microsoft365 authentication in [Requirements](getting-started/requirements.md#microsoft365).
## Prowler v2 Documentation
For **Prowler v2 Documentation**, please check it out [here](https://github.com/prowler-cloud/prowler/blob/8818f47333a0c1c1a457453c87af0ea5b89a385f/README.md).

View File

@@ -4,7 +4,7 @@ To allow Prowler assume an identity to start the scan with the required privileg
To create a Service Principal Application you can use the Azure Portal or the Azure CLI.
## From Azure Portal
## From Azure Portal / Entra Admin Center
1. Access to Microsoft Entra ID
2. In the left menu bar, go to "App registrations"

View File

@@ -9,6 +9,7 @@ For in-cluster execution, you can use the supplied yaml files inside `/kubernete
They can be used to run Prowler as a job within a new Prowler namespace:
```console
kubectl apply -f kubernetes/prowler-sa.yaml
kubectl apply -f kubernetes/job.yaml
kubectl apply -f kubernetes/prowler-role.yaml
kubectl apply -f kubernetes/prowler-rolebinding.yaml

View File

@@ -0,0 +1,23 @@
# Microsoft365 authentication
By default, Prowler uses the MsGraph Python SDK identity package for authentication, through the `ClientSecretCredential` class.
This allows Prowler to authenticate against Microsoft365 using the following methods:
- Service principal authentication by environment variables (Enterprise Application)
- Current CLI credentials stored
- Interactive browser authentication
To launch the tool, first specify which authentication method to use via the following flags:
```console
# To use service principal authentication
prowler microsoft365 --sp-env-auth
# To use cli authentication
prowler microsoft365 --az-cli-auth
# To use browser authentication
prowler microsoft365 --browser-auth --tenant-id "XXXXXXXX"
```
To use Prowler you also need to set up the permissions required to access the resources in your Microsoft365 account; for more details refer to [Requirements](../../getting-started/requirements.md).

View File

@@ -99,6 +99,32 @@ By default, the `kubeconfig` file is located at `~/.kube/config`.
<img src="../../img/kubernetes-credentials.png" alt="Kubernetes Credentials" width="700"/>
???+ note
If you are adding an **Amazon EKS** cluster, follow these additional steps to ensure proper authentication:
1. Apply the necessary Kubernetes resources to your EKS cluster (you can find the files in the [`kubernetes` directory of the Prowler repository](https://github.com/prowler-cloud/prowler/tree/master/kubernetes)):
```console
kubectl apply -f kubernetes/prowler-sa.yaml
kubectl apply -f kubernetes/prowler-role.yaml
kubectl apply -f kubernetes/prowler-rolebinding.yaml
```
2. Generate a long-lived token for authentication:
```console
kubectl create token prowler-sa -n prowler-ns --duration=0
```
- **Security Note:** The `--duration=0` option generates a non-expiring token, which may pose a security risk if not managed properly. Users should decide on an appropriate expiration time based on their security policies. If a limited-time token is preferred, set `--duration=<TIME>` (e.g., `--duration=24h`).
- **Important:** If the token expires, Prowler Cloud will no longer be able to authenticate with the cluster. In this case, you will need to generate a new token and **remove and re-add the provider in Prowler Cloud** with the updated `kubeconfig`.
3. Update your `kubeconfig` to use the ServiceAccount token:
```console
kubectl config set-credentials prowler-sa --token=<SA_TOKEN>
kubectl config set-context <CLUSTER_ARN> --user=prowler-sa
```
Replace `<SA_TOKEN>` with the generated token and `<CLUSTER_ARN>` with your EKS cluster ARN.
4. Now you can add the modified `kubeconfig` as the credentials of the AWS EKS Cluster in Prowler Cloud. Then simply test the connection.
---
## **Step 5: Test Connection**
@@ -133,3 +159,5 @@ While the scan is running, start exploring the findings in these sections:
<img src="../../img/issues.png" alt="Issues" width="300" style="text-align: center;"/>
- **Browse All Findings**: Detailed list of findings detected, where you can filter by severity, service, and more. <img src="../../img/findings.png" alt="Findings" width="700"/>
To view all `new` findings that have not been seen prior to this scan, click the `Delta` filter and select `new`. To view all `changed` findings that have had a status change (from `PASS` to `FAIL` for example), click the `Delta` filter and select `changed`.

View File

@@ -1,16 +1,3 @@
apiVersion: v1
kind: Namespace
metadata:
name: prowler-ns
---
apiVersion: v1
kind: ServiceAccount
metadata:
name: prowler-sa
namespace: prowler-ns
---
apiVersion: batch/v1
kind: Job
metadata:

View File

@@ -0,0 +1,10 @@
apiVersion: v1
kind: Namespace
metadata:
name: prowler-ns
---
apiVersion: v1
kind: ServiceAccount
metadata:
name: prowler-sa
namespace: prowler-ns

View File

@@ -94,6 +94,9 @@ nav:
- In-Cluster Execution: tutorials/kubernetes/in-cluster.md
- Non In-Cluster Execution: tutorials/kubernetes/outside-cluster.md
- Miscellaneous: tutorials/kubernetes/misc.md
- Microsoft 365:
- Authentication: tutorials/microsoft365/authentication.md
- Create Prowler Service Principal: tutorials/microsoft365/create-prowler-service-principal.md
- Developer Guide:
- Introduction: developer-guide/introduction.md
- Provider: developer-guide/provider.md
@@ -124,7 +127,7 @@ extra:
make our documentation better.
analytics:
provider: google
property: G-H5TFH6WJRQ
property: G-KBKV70W5Y2
social:
- icon: fontawesome/brands/github
link: https://github.com/prowler-cloud

poetry.lock generated
View File

@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 1.8.0 and should not be changed by hand.
# This file is automatically @generated by Poetry 1.8.5 and should not be changed by hand.
[[package]]
name = "about-time"
@@ -209,13 +209,13 @@ files = [
[[package]]
name = "attrs"
version = "24.3.0"
version = "25.1.0"
description = "Classes Without Boilerplate"
optional = false
python-versions = ">=3.8"
files = [
{file = "attrs-24.3.0-py3-none-any.whl", hash = "sha256:ac96cd038792094f438ad1f6ff80837353805ac950cd2aa0e0625ef19850c308"},
{file = "attrs-24.3.0.tar.gz", hash = "sha256:8f5c07333d543103541ba7be0e2ce16eeee8130cb0b3f9238ab904ce1e85baff"},
{file = "attrs-25.1.0-py3-none-any.whl", hash = "sha256:c75a69e28a550a7e93789579c22aa26b0f5b83b75dc4e08fe092980051e1090a"},
{file = "attrs-25.1.0.tar.gz", hash = "sha256:1c97078a80c814273a76b2a298a932eb681c87415c11dee0a6921de7f1b02c3e"},
]
[package.extras]
@@ -228,13 +228,13 @@ tests-mypy = ["mypy (>=1.11.1)", "pytest-mypy-plugins"]
[[package]]
name = "authlib"
version = "1.4.0"
version = "1.4.1"
description = "The ultimate Python library in building OAuth and OpenID Connect servers and clients."
optional = false
python-versions = ">=3.9"
files = [
{file = "Authlib-1.4.0-py2.py3-none-any.whl", hash = "sha256:4bb20b978c8b636222b549317c1815e1fe62234fc1c5efe8855d84aebf3a74e3"},
{file = "authlib-1.4.0.tar.gz", hash = "sha256:1c1e6608b5ed3624aeeee136ca7f8c120d6f51f731aa152b153d54741840e1f2"},
{file = "Authlib-1.4.1-py2.py3-none-any.whl", hash = "sha256:edc29c3f6a3e72cd9e9f45fff67fc663a2c364022eb0371c003f22d5405915c1"},
{file = "authlib-1.4.1.tar.gz", hash = "sha256:30ead9ea4993cdbab821dc6e01e818362f92da290c04c7f6a1940f86507a790d"},
]
[package.dependencies]
@@ -629,13 +629,13 @@ msrest = ">=0.7.1"
[[package]]
name = "azure-mgmt-web"
version = "7.3.1"
version = "8.0.0"
description = "Microsoft Azure Web Apps Management Client Library for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "azure-mgmt-web-7.3.1.tar.gz", hash = "sha256:87b771436bc99a7a8df59d0ad185b96879a06dce14764a06b3fc3dafa8fcb56b"},
{file = "azure_mgmt_web-7.3.1-py3-none-any.whl", hash = "sha256:ccf881e3ab31c3fdbf9cbff32773d9c0006b5dcd621ea074d7ec89e51049fb72"},
{file = "azure_mgmt_web-8.0.0-py3-none-any.whl", hash = "sha256:0536aac05bfc673b56ed930f2966b77856e84df675d376e782a7af6bb92449af"},
{file = "azure_mgmt_web-8.0.0.tar.gz", hash = "sha256:c8d9c042c09db7aacb20270a9effed4d4e651e365af32d80897b84dc7bf35098"},
]
[package.dependencies]
@@ -646,13 +646,13 @@ typing-extensions = ">=4.6.0"
[[package]]
name = "azure-storage-blob"
version = "12.24.0"
version = "12.24.1"
description = "Microsoft Azure Blob Storage Client Library for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "azure_storage_blob-12.24.0-py3-none-any.whl", hash = "sha256:4f0bb4592ea79a2d986063696514c781c9e62be240f09f6397986e01755bc071"},
{file = "azure_storage_blob-12.24.0.tar.gz", hash = "sha256:eaaaa1507c8c363d6e1d1342bd549938fdf1adec9b1ada8658c8f5bf3aea844e"},
{file = "azure_storage_blob-12.24.1-py3-none-any.whl", hash = "sha256:77fb823fdbac7f3c11f7d86a5892e2f85e161e8440a7489babe2195bf248f09e"},
{file = "azure_storage_blob-12.24.1.tar.gz", hash = "sha256:052b2a1ea41725ba12e2f4f17be85a54df1129e13ea0321f5a2fcc851cbf47d4"},
]
[package.dependencies]
@@ -666,17 +666,17 @@ aio = ["azure-core[aio] (>=1.30.0)"]
[[package]]
name = "babel"
version = "2.16.0"
version = "2.17.0"
description = "Internationalization utilities"
optional = false
python-versions = ">=3.8"
files = [
{file = "babel-2.16.0-py3-none-any.whl", hash = "sha256:368b5b98b37c06b7daf6696391c3240c938b37767d4584413e8438c5c435fa8b"},
{file = "babel-2.16.0.tar.gz", hash = "sha256:d1f3554ca26605fe173f3de0c65f750f5a42f924499bf134de6423582298e316"},
{file = "babel-2.17.0-py3-none-any.whl", hash = "sha256:4d0b53093fdfb4b21c92b5213dba5a1b23885afa8383709427046b21c366e5f2"},
{file = "babel-2.17.0.tar.gz", hash = "sha256:0c54cffb19f690cdcc52a3b50bcbf71e07a808d1c80d549f2459b9d2cf0afb9d"},
]
[package.extras]
dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
dev = ["backports.zoneinfo", "freezegun (>=1.0,<2.0)", "jinja2 (>=3.0)", "pytest (>=6.0)", "pytest-cov", "pytz", "setuptools", "tzdata"]
[[package]]
name = "bandit"
@@ -802,24 +802,24 @@ crt = ["awscrt (==0.22.0)"]
[[package]]
name = "cachetools"
version = "5.5.0"
version = "5.5.1"
description = "Extensible memoizing collections and decorators"
optional = false
python-versions = ">=3.7"
files = [
{file = "cachetools-5.5.0-py3-none-any.whl", hash = "sha256:02134e8439cdc2ffb62023ce1debca2944c3f289d66bb17ead3ab3dede74b292"},
{file = "cachetools-5.5.0.tar.gz", hash = "sha256:2cc24fb4cbe39633fb7badd9db9ca6295d766d9c2995f245725a46715d050f2a"},
{file = "cachetools-5.5.1-py3-none-any.whl", hash = "sha256:b76651fdc3b24ead3c648bbdeeb940c1b04d365b38b4af66788f9ec4a81d42bb"},
{file = "cachetools-5.5.1.tar.gz", hash = "sha256:70f238fbba50383ef62e55c6aff6d9673175fe59f7c6782c7a0b9e38f4a9df95"},
]
[[package]]
name = "certifi"
version = "2024.12.14"
version = "2025.1.31"
description = "Python package for providing Mozilla's CA Bundle."
optional = false
python-versions = ">=3.6"
files = [
{file = "certifi-2024.12.14-py3-none-any.whl", hash = "sha256:1275f7a45be9464efc1173084eaa30f866fe2e47d389406136d332ed4967ec56"},
{file = "certifi-2024.12.14.tar.gz", hash = "sha256:b650d30f370c2b724812bee08008be0c4163b163ddaec3f2546c1caf65f191db"},
{file = "certifi-2025.1.31-py3-none-any.whl", hash = "sha256:ca78db4565a652026a4db2bcdf68f2fb589ea80d0be70e03929ed730746b84fe"},
{file = "certifi-2025.1.31.tar.gz", hash = "sha256:3d5da6925056f6f18f119200434a4780a94263f10d1c21d032a6f6b2baa20651"},
]
[[package]]
@@ -903,13 +903,13 @@ pycparser = "*"
[[package]]
name = "cfn-lint"
version = "1.22.5"
version = "1.23.1"
description = "Checks CloudFormation templates for practices and behaviour that could potentially be improved"
optional = false
python-versions = ">=3.8"
files = [
{file = "cfn_lint-1.22.5-py3-none-any.whl", hash = "sha256:18309e59cc03ff18b02676688df7eb1a17f5276da3776f31946fc0d9aa9b8fe7"},
{file = "cfn_lint-1.22.5.tar.gz", hash = "sha256:8b4f55e283143e99d8d331627637226c291cecfb936606f7aab2d940e71e566d"},
{file = "cfn_lint-1.23.1-py3-none-any.whl", hash = "sha256:6f89f557dea6484cd5bc1b32cef91e9898dd1d98f12d5b59a7f6baf9cf61b7ee"},
{file = "cfn_lint-1.23.1.tar.gz", hash = "sha256:2ee8722673414a3993921d87cc1893934d313b9b953da7a91442f81958d86644"},
]
[package.dependencies]
@@ -1281,20 +1281,20 @@ files = [
[[package]]
name = "deprecated"
version = "1.2.15"
version = "1.2.18"
description = "Python @deprecated decorator to deprecate old python classes, functions or methods."
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,>=2.7"
files = [
{file = "Deprecated-1.2.15-py2.py3-none-any.whl", hash = "sha256:353bc4a8ac4bfc96800ddab349d89c25dec1079f65fd53acdcc1e0b975b21320"},
{file = "deprecated-1.2.15.tar.gz", hash = "sha256:683e561a90de76239796e6b6feac66b99030d2dd3fcf61ef996330f14bbb9b0d"},
{file = "Deprecated-1.2.18-py2.py3-none-any.whl", hash = "sha256:bd5011788200372a32418f888e326a09ff80d0214bd961147cfed01b5c018eec"},
{file = "deprecated-1.2.18.tar.gz", hash = "sha256:422b6f6d859da6f2ef57857761bfb392480502a64c3028ca9bbe86085d72115d"},
]
[package.dependencies]
wrapt = ">=1.10,<2"
[package.extras]
dev = ["PyTest", "PyTest-Cov", "bump2version (<1)", "jinja2 (>=3.0.3,<3.1.0)", "setuptools", "sphinx (<2)", "tox"]
dev = ["PyTest", "PyTest-Cov", "bump2version (<1)", "setuptools", "tox"]
[[package]]
name = "detect-secrets"
@@ -1668,13 +1668,13 @@ test = ["coverage[toml]", "ddt (>=1.1.1,!=1.4.3)", "mock", "mypy", "pre-commit",
[[package]]
name = "google-api-core"
version = "2.24.0"
version = "2.24.1"
description = "Google API client core library"
optional = false
python-versions = ">=3.7"
files = [
{file = "google_api_core-2.24.0-py3-none-any.whl", hash = "sha256:10d82ac0fca69c82a25b3efdeefccf6f28e02ebb97925a8cce8edbfe379929d9"},
{file = "google_api_core-2.24.0.tar.gz", hash = "sha256:e255640547a597a4da010876d333208ddac417d60add22b6851a0c66a831fcaf"},
{file = "google_api_core-2.24.1-py3-none-any.whl", hash = "sha256:bc78d608f5a5bf853b80bd70a795f703294de656c096c0968320830a4bc280f1"},
{file = "google_api_core-2.24.1.tar.gz", hash = "sha256:f8b36f5456ab0dd99a1b693a40a31d1e7757beea380ad1b38faaf8941eae9d8a"},
]
[package.dependencies]
@@ -1692,13 +1692,13 @@ grpcio-gcp = ["grpcio-gcp (>=0.2.2,<1.0.dev0)"]
[[package]]
name = "google-api-python-client"
version = "2.159.0"
version = "2.160.0"
description = "Google API Client Library for Python"
optional = false
python-versions = ">=3.7"
files = [
{file = "google_api_python_client-2.159.0-py2.py3-none-any.whl", hash = "sha256:baef0bb631a60a0bd7c0bf12a5499e3a40cd4388484de7ee55c1950bf820a0cf"},
{file = "google_api_python_client-2.159.0.tar.gz", hash = "sha256:55197f430f25c907394b44fa078545ffef89d33fd4dca501b7db9f0d8e224bd6"},
{file = "google_api_python_client-2.160.0-py2.py3-none-any.whl", hash = "sha256:63d61fb3e4cf3fb31a70a87f45567c22f6dfe87bbfa27252317e3e2c42900db4"},
{file = "google_api_python_client-2.160.0.tar.gz", hash = "sha256:a8ccafaecfa42d15d5b5c3134ced8de08380019717fc9fb1ed510ca58eca3b7e"},
]
[package.dependencies]
@@ -1710,13 +1710,13 @@ uritemplate = ">=3.0.1,<5"
[[package]]
name = "google-auth"
version = "2.37.0"
version = "2.38.0"
description = "Google Authentication Library"
optional = false
python-versions = ">=3.7"
files = [
{file = "google_auth-2.37.0-py2.py3-none-any.whl", hash = "sha256:42664f18290a6be591be5329a96fe30184be1a1badb7292a7f686a9659de9ca0"},
{file = "google_auth-2.37.0.tar.gz", hash = "sha256:0054623abf1f9c83492c63d3f47e77f0a544caa3d40b2d98e099a611c2dd5d00"},
{file = "google_auth-2.38.0-py2.py3-none-any.whl", hash = "sha256:e7dae6694313f434a2727bf2906f27ad259bae090d7aa896590d86feec3d9d4a"},
{file = "google_auth-2.38.0.tar.gz", hash = "sha256:8285113607d3b80a3f1543b75962447ba8a09fe85783432a784fdeef6ac094c4"},
]
[package.dependencies]
@@ -1779,13 +1779,13 @@ test = ["pytest", "sphinx", "sphinx-autobuild", "twine", "wheel"]
[[package]]
name = "graphql-core"
version = "3.2.5"
version = "3.2.6"
description = "GraphQL implementation for Python, a port of GraphQL.js, the JavaScript reference implementation for GraphQL."
optional = false
python-versions = "<4,>=3.6"
files = [
{file = "graphql_core-3.2.5-py3-none-any.whl", hash = "sha256:2f150d5096448aa4f8ab26268567bbfeef823769893b39c1a2e1409590939c8a"},
{file = "graphql_core-3.2.5.tar.gz", hash = "sha256:e671b90ed653c808715645e3998b7ab67d382d55467b7e2978549111bbabf8d5"},
{file = "graphql_core-3.2.6-py3-none-any.whl", hash = "sha256:78b016718c161a6fb20a7d97bbf107f331cd1afe53e45566c59f776ed7f0b45f"},
{file = "graphql_core-3.2.6.tar.gz", hash = "sha256:c08eec22f9e40f0bd61d805907e3b3b1b9a320bc606e23dc145eebca07c8fbab"},
]
[package.dependencies]
@@ -1804,28 +1804,28 @@ files = [
 [[package]]
 name = "h2"
-version = "4.1.0"
-description = "HTTP/2 State-Machine based protocol implementation"
+version = "4.2.0"
+description = "Pure-Python HTTP/2 protocol implementation"
 optional = false
-python-versions = ">=3.6.1"
+python-versions = ">=3.9"
 files = [
-    {file = "h2-4.1.0-py3-none-any.whl", hash = "sha256:03a46bcf682256c95b5fd9e9a99c1323584c3eec6440d379b9903d709476bc6d"},
-    {file = "h2-4.1.0.tar.gz", hash = "sha256:a83aca08fbe7aacb79fec788c9c0bac936343560ed9ec18b82a13a12c28d2abb"},
+    {file = "h2-4.2.0-py3-none-any.whl", hash = "sha256:479a53ad425bb29af087f3458a61d30780bc818e4ebcf01f0b536ba916462ed0"},
+    {file = "h2-4.2.0.tar.gz", hash = "sha256:c8a52129695e88b1a0578d8d2cc6842bbd79128ac685463b887ee278126ad01f"},
 ]
 [package.dependencies]
-hpack = ">=4.0,<5"
-hyperframe = ">=6.0,<7"
+hpack = ">=4.1,<5"
+hyperframe = ">=6.1,<7"
 [[package]]
 name = "hpack"
-version = "4.0.0"
-description = "Pure-Python HPACK header compression"
+version = "4.1.0"
+description = "Pure-Python HPACK header encoding"
 optional = false
-python-versions = ">=3.6.1"
+python-versions = ">=3.9"
 files = [
-    {file = "hpack-4.0.0-py3-none-any.whl", hash = "sha256:84a076fad3dc9a9f8063ccb8041ef100867b1878b25ef0ee63847a5d53818a6c"},
-    {file = "hpack-4.0.0.tar.gz", hash = "sha256:fc41de0c63e687ebffde81187a948221294896f6bdc0ae2312708df339430095"},
+    {file = "hpack-4.1.0-py3-none-any.whl", hash = "sha256:157ac792668d995c657d93111f46b4535ed114f0c9c8d672271bbec7eae1b496"},
+    {file = "hpack-4.1.0.tar.gz", hash = "sha256:ec5eca154f7056aa06f196a557655c5b009b382873ac8d1e66e79e87535f1dca"},
 ]
 [[package]]
@@ -1890,13 +1890,13 @@ zstd = ["zstandard (>=0.18.0)"]
 [[package]]
 name = "hyperframe"
-version = "6.0.1"
-description = "HTTP/2 framing layer for Python"
+version = "6.1.0"
+description = "Pure-Python HTTP/2 framing"
 optional = false
-python-versions = ">=3.6.1"
+python-versions = ">=3.9"
 files = [
-    {file = "hyperframe-6.0.1-py3-none-any.whl", hash = "sha256:0ec6bafd80d8ad2195c4f03aacba3a8265e57bc4cff261e802bf39970ed02a15"},
-    {file = "hyperframe-6.0.1.tar.gz", hash = "sha256:ae510046231dc8e9ecb1a6586f63d2347bf4c8905914aa84ba585ae85f28a914"},
+    {file = "hyperframe-6.1.0-py3-none-any.whl", hash = "sha256:b03380493a519fce58ea5af42e4a42317bf9bd425596f7a0835ffce80f1a42e5"},
+    {file = "hyperframe-6.1.0.tar.gz", hash = "sha256:f630908a00854a7adeabd6382b43923a4c4cd4b821fcb527e6ab9e15382a3b08"},
 ]
 [[package]]
@@ -2091,19 +2091,19 @@ format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-
 [[package]]
 name = "jsonschema-path"
-version = "0.3.3"
+version = "0.3.4"
 description = "JSONSchema Spec with object-oriented paths"
 optional = false
 python-versions = "<4.0.0,>=3.8.0"
 files = [
-    {file = "jsonschema_path-0.3.3-py3-none-any.whl", hash = "sha256:203aff257f8038cd3c67be614fe6b2001043408cb1b4e36576bc4921e09d83c4"},
-    {file = "jsonschema_path-0.3.3.tar.gz", hash = "sha256:f02e5481a4288ec062f8e68c808569e427d905bedfecb7f2e4c69ef77957c382"},
+    {file = "jsonschema_path-0.3.4-py3-none-any.whl", hash = "sha256:f502191fdc2b22050f9a81c9237be9d27145b9001c55842bece5e94e382e52f8"},
+    {file = "jsonschema_path-0.3.4.tar.gz", hash = "sha256:8365356039f16cc65fddffafda5f58766e34bebab7d6d105616ab52bc4297001"},
 ]
 [package.dependencies]
 pathable = ">=0.4.1,<0.5.0"
 PyYAML = ">=5.1"
-referencing = ">=0.28.0,<0.36.0"
+referencing = "<0.37.0"
 requests = ">=2.31.0,<3.0.0"
 [[package]]
@@ -2122,13 +2122,13 @@ referencing = ">=0.31.0"
 [[package]]
 name = "kubernetes"
-version = "31.0.0"
+version = "32.0.0"
 description = "Kubernetes python client"
 optional = false
 python-versions = ">=3.6"
 files = [
-    {file = "kubernetes-31.0.0-py2.py3-none-any.whl", hash = "sha256:bf141e2d380c8520eada8b351f4e319ffee9636328c137aa432bc486ca1200e1"},
-    {file = "kubernetes-31.0.0.tar.gz", hash = "sha256:28945de906c8c259c1ebe62703b56a03b714049372196f854105afe4e6d014c0"},
+    {file = "kubernetes-32.0.0-py2.py3-none-any.whl", hash = "sha256:60fd8c29e8e43d9c553ca4811895a687426717deba9c0a66fb2dcc3f5ef96692"},
+    {file = "kubernetes-32.0.0.tar.gz", hash = "sha256:319fa840345a482001ac5d6062222daeb66ec4d1bcb3087402aed685adf0aecb"},
 ]
 [package.dependencies]
@@ -2307,13 +2307,13 @@ files = [
 [[package]]
 name = "marshmallow"
-version = "3.25.1"
+version = "3.26.1"
 description = "A lightweight library for converting complex datatypes to and from native Python datatypes."
 optional = false
 python-versions = ">=3.9"
 files = [
-    {file = "marshmallow-3.25.1-py3-none-any.whl", hash = "sha256:ec5d00d873ce473b7f2ffcb7104286a376c354cab0c2fa12f5573dab03e87210"},
-    {file = "marshmallow-3.25.1.tar.gz", hash = "sha256:f4debda3bb11153d81ac34b0d582bf23053055ee11e791b54b4b35493468040a"},
+    {file = "marshmallow-3.26.1-py3-none-any.whl", hash = "sha256:3350409f20a70a7e4e11a27661187b77cdcaeb20abca41c1454fe33636bea09c"},
+    {file = "marshmallow-3.26.1.tar.gz", hash = "sha256:e6d8affb6cb61d39d26402096dc0aee12d5a26d490a121f118d2e81dc0719dc6"},
 ]
 [package.dependencies]
@@ -2359,13 +2359,13 @@ files = [
 [[package]]
 name = "microsoft-kiota-abstractions"
-version = "1.6.8"
+version = "1.9.2"
 description = "Core abstractions for kiota generated libraries in Python"
 optional = false
-python-versions = "<4.0,>=3.8"
+python-versions = "<4.0,>=3.9"
 files = [
-    {file = "microsoft_kiota_abstractions-1.6.8-py3-none-any.whl", hash = "sha256:12819dee24d5aaa31e99683d938f65e50cbc446de087df244cd26c3326ec4e15"},
-    {file = "microsoft_kiota_abstractions-1.6.8.tar.gz", hash = "sha256:7070affabfa7182841646a0c8491cbb240af366aff2b9132f0caa45c4837dd78"},
+    {file = "microsoft_kiota_abstractions-1.9.2-py3-none-any.whl", hash = "sha256:a8853d272a84da59d6a2fe11a76c28e9c55bdab268a345ba48e918cb6822b607"},
+    {file = "microsoft_kiota_abstractions-1.9.2.tar.gz", hash = "sha256:29cdafe8d0672f23099556e0b120dca6231c752cca9393e1e0092fa9ca594572"},
 ]
 [package.dependencies]
@@ -2375,19 +2375,19 @@ std-uritemplate = ">=2.0.0"
 [[package]]
 name = "microsoft-kiota-authentication-azure"
-version = "1.6.8"
+version = "1.9.1"
 description = "Core abstractions for kiota generated libraries in Python"
 optional = false
-python-versions = "<4.0,>=3.8"
+python-versions = "<4.0,>=3.9"
 files = [
-    {file = "microsoft_kiota_authentication_azure-1.6.8-py3-none-any.whl", hash = "sha256:50455789b7133e27fbccec839d93e40d2637d18593a93921ae1338880c5b5b3b"},
-    {file = "microsoft_kiota_authentication_azure-1.6.8.tar.gz", hash = "sha256:fef23f43cd4d3b9ef839c8b3d1f675ec4a1120c150f963d8c4551c5e19ac3b36"},
+    {file = "microsoft_kiota_authentication_azure-1.9.1-py3-none-any.whl", hash = "sha256:3a3030b01e0cbf007736ec6548ac483742e04ad0d20979b49e293a683f839fc6"},
+    {file = "microsoft_kiota_authentication_azure-1.9.1.tar.gz", hash = "sha256:8ba31b1ecf78777128daf3191c7ecd28bfc2d10d0fe80260dd5c07a9ccd5d7f5"},
 ]
 [package.dependencies]
 aiohttp = ">=3.8.0"
 azure-core = ">=1.21.1"
-microsoft-kiota-abstractions = ">=1.6.8,<1.7.0"
+microsoft-kiota-abstractions = ">=1.9.1,<1.10.0"
 opentelemetry-api = ">=1.27.0"
 opentelemetry-sdk = ">=1.27.0"
@@ -2408,83 +2408,61 @@ microsoft-kiota_abstractions = ">=1.0.0,<2.0.0"
 opentelemetry-api = ">=1.20.0"
 opentelemetry-sdk = ">=1.20.0"
-[[package]]
-name = "microsoft-kiota-http"
-version = "1.6.8"
-description = "Core abstractions for kiota generated libraries in Python"
-optional = false
-python-versions = "<4.0,>=3.8"
-files = [
-    {file = "microsoft_kiota_http-1.6.8-py3-none-any.whl", hash = "sha256:7ff76a308351d885453185d6a6538c47a64ebdc7661cce46a904e89e2ceb9a1d"},
-    {file = "microsoft_kiota_http-1.6.8.tar.gz", hash = "sha256:67242690b79a30c0cadf823675249269e4bc020283e3d65b33af7d771df64df8"},
-]
-[package.dependencies]
-httpx = {version = ">=0.28", extras = ["http2"]}
-microsoft-kiota-abstractions = ">=1.6.8,<1.7.0"
-opentelemetry-api = ">=1.27.0"
-opentelemetry-sdk = ">=1.27.0"
-urllib3 = ">=2.2.2,<3.0.0"
 [[package]]
 name = "microsoft-kiota-serialization-form"
-version = "1.6.8"
+version = "1.9.1"
 description = "Core abstractions for kiota generated libraries in Python"
 optional = false
-python-versions = "<4.0,>=3.8"
+python-versions = "<4.0,>=3.9"
 files = [
-    {file = "microsoft_kiota_serialization_form-1.6.8-py3-none-any.whl", hash = "sha256:ca7dd19e173aa87c68b38c5056cc0921570c8c86f3ba5511d1616cf97e2f0f67"},
-    {file = "microsoft_kiota_serialization_form-1.6.8.tar.gz", hash = "sha256:bb9eb98b3abf596b4bfe208014dff948361ff48a757316ac58e19c31ab8d640a"},
+    {file = "microsoft_kiota_serialization_form-1.9.1-py3-none-any.whl", hash = "sha256:d86c0dc08b51288f2851a265dde729b200998bca1dd1681bbebcb27cd274ba34"},
+    {file = "microsoft_kiota_serialization_form-1.9.1.tar.gz", hash = "sha256:58b81eca5e0ad66bcbd6a4b65ba91e6255d3510f4c4f576fb4f5e83ca9a310c3"},
 ]
 [package.dependencies]
-microsoft-kiota-abstractions = ">=1.6.8,<1.7.0"
-pendulum = ">=3.0.0b1"
+microsoft-kiota-abstractions = ">=1.9.1,<1.10.0"
 [[package]]
 name = "microsoft-kiota-serialization-json"
-version = "1.6.8"
+version = "1.9.1"
 description = "Core abstractions for kiota generated libraries in Python"
 optional = false
-python-versions = "<4.0,>=3.8"
+python-versions = "<4.0,>=3.9"
 files = [
-    {file = "microsoft_kiota_serialization_json-1.6.8-py3-none-any.whl", hash = "sha256:2734c2ad64cc089441279e4962f6fedf41af040730f6eab1533890cd5377aff5"},
-    {file = "microsoft_kiota_serialization_json-1.6.8.tar.gz", hash = "sha256:89e2dd0eb4eaaa6ab74fa89ab5d84c5a53464e73b85eb7085f0aa4560a2b8183"},
+    {file = "microsoft_kiota_serialization_json-1.9.1-py3-none-any.whl", hash = "sha256:ab752bf642a77266713bac3942a4c547dde60b917f674a9ab63261490fecf841"},
+    {file = "microsoft_kiota_serialization_json-1.9.1.tar.gz", hash = "sha256:0039458b885875daf246f1e8c0296551695e0ec70f3e4689b00b270ce923c0cc"},
 ]
 [package.dependencies]
-microsoft-kiota-abstractions = ">=1.6.8,<1.7.0"
-pendulum = ">=3.0.0b1"
+microsoft-kiota-abstractions = ">=1.9.1,<1.10.0"
 [[package]]
 name = "microsoft-kiota-serialization-multipart"
-version = "1.6.8"
+version = "1.9.1"
 description = "Core abstractions for kiota generated libraries in Python"
 optional = false
-python-versions = "<4.0,>=3.8"
+python-versions = "<4.0,>=3.9"
 files = [
-    {file = "microsoft_kiota_serialization_multipart-1.6.8-py3-none-any.whl", hash = "sha256:1ecdd15dd1f78aed031d7d1828b6fbc00c633542d863c23f96fdd0a61bfb189a"},
-    {file = "microsoft_kiota_serialization_multipart-1.6.8.tar.gz", hash = "sha256:3d95c6d7186588af7a1d3aa852ce42077f80487b8b3c60e36fe109a8b4918c03"},
+    {file = "microsoft_kiota_serialization_multipart-1.9.1-py3-none-any.whl", hash = "sha256:1ee69d8e3b2c0d24431b85fc1628534f97dcafed45dcb6bb3f0143824ce8a665"},
+    {file = "microsoft_kiota_serialization_multipart-1.9.1.tar.gz", hash = "sha256:288790a486aad33aac0a7329f05b8851e9be82f50cc9a8f19fe6f267a4f2b183"},
 ]
 [package.dependencies]
-microsoft-kiota-abstractions = ">=1.6.8,<1.7.0"
-pendulum = ">=3.0.0b1"
+microsoft-kiota-abstractions = ">=1.9.1,<1.10.0"
 [[package]]
 name = "microsoft-kiota-serialization-text"
-version = "1.6.8"
+version = "1.9.1"
 description = "Core abstractions for kiota generated libraries in Python"
 optional = false
-python-versions = "<4.0,>=3.8"
+python-versions = "<4.0,>=3.9"
 files = [
-    {file = "microsoft_kiota_serialization_text-1.6.8-py3-none-any.whl", hash = "sha256:4e5e287a614d362f864b5061dca0861c3f70b8792ec72967d1bff23944da1e80"},
-    {file = "microsoft_kiota_serialization_text-1.6.8.tar.gz", hash = "sha256:687d4858337eaf4f351b12ed1c6c934d869560f54ee3855bfdde589660e07208"},
+    {file = "microsoft_kiota_serialization_text-1.9.1-py3-none-any.whl", hash = "sha256:021fbfad887f7525f9e1c8bbe225d0aa1b25befe54357ae0f739f47576d946ff"},
+    {file = "microsoft_kiota_serialization_text-1.9.1.tar.gz", hash = "sha256:b4b1f61c2388edd704ab33a1792f92f3507db9648d389197a625e152b52743ab"},
 ]
 [package.dependencies]
-microsoft-kiota-abstractions = ">=1.6.8,<1.7.0"
-python-dateutil = "2.9.0.post0"
+microsoft-kiota-abstractions = ">=1.9.1,<1.10.0"
 [[package]]
 name = "mkdocs"
@@ -2558,13 +2536,13 @@ dev = ["click", "codecov", "mkdocs-gen-files", "mkdocs-git-authors-plugin", "mkd
 [[package]]
 name = "mkdocs-material"
-version = "9.5.50"
+version = "9.6.3"
 description = "Documentation that simply works"
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "mkdocs_material-9.5.50-py3-none-any.whl", hash = "sha256:f24100f234741f4d423a9d672a909d859668a4f404796be3cf035f10d6050385"},
-    {file = "mkdocs_material-9.5.50.tar.gz", hash = "sha256:ae5fe16f3d7c9ccd05bb6916a7da7420cf99a9ce5e33debd9d40403a090d5825"},
+    {file = "mkdocs_material-9.6.3-py3-none-any.whl", hash = "sha256:1125622067e26940806701219303b27c0933e04533560725d97ec26fd16a39cf"},
+    {file = "mkdocs_material-9.6.3.tar.gz", hash = "sha256:c87f7d1c39ce6326da5e10e232aed51bae46252e646755900f4b0fc9192fa832"},
 ]
 [package.dependencies]
@@ -2614,13 +2592,13 @@ test = ["pytest", "pytest-cov"]
 [[package]]
 name = "moto"
-version = "5.0.27"
+version = "5.0.28"
 description = "A library that allows you to easily mock out tests based on AWS infrastructure"
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "moto-5.0.27-py3-none-any.whl", hash = "sha256:27042fd94c8def0166d9f2ae8d39d9488d4b3115542b5fca88566c0424549013"},
-    {file = "moto-5.0.27.tar.gz", hash = "sha256:6c123de7e0e5e6508a10c399ba3ecf2d5143f263f8e804fd4a7091941c3f5207"},
+    {file = "moto-5.0.28-py3-none-any.whl", hash = "sha256:2dfbea1afe3b593e13192059a1a7fc4b3cf7fdf92e432070c22346efa45aa0f0"},
+    {file = "moto-5.0.28.tar.gz", hash = "sha256:4d3437693411ec943c13c77de5b0b520c4b0a9ac850fead4ba2a54709e086e8b"},
 ]
 [package.dependencies]
@@ -2724,13 +2702,13 @@ portalocker = ">=1.4,<3"
 [[package]]
 name = "msgraph-core"
-version = "1.2.0"
+version = "1.2.1"
 description = "Core component of the Microsoft Graph Python SDK"
 optional = false
 python-versions = ">=3.9"
 files = [
-    {file = "msgraph_core-1.2.0-py3-none-any.whl", hash = "sha256:4ce14bbe743c0f2dd8b53c7fcd338fc14081e0df6b14021f0d0e4bb63cd5a840"},
-    {file = "msgraph_core-1.2.0.tar.gz", hash = "sha256:a4e42f692e664c60d63359e610bbf990f57b42d8080417261ff7042bbd59c98b"},
+    {file = "msgraph_core-1.2.1-py3-none-any.whl", hash = "sha256:4591c1dc4359a323a50b2d29e4cb75ecaff32b4b6781f1ae743a655fb4a20ad7"},
+    {file = "msgraph_core-1.2.1.tar.gz", hash = "sha256:87a3cb4d36dad590a3f02aaedf422547cbac10460bd9f0b6c984fab9556150d3"},
 ]
 [package.dependencies]
@@ -2744,13 +2722,13 @@ dev = ["bumpver", "isort", "mypy", "pylint", "pytest", "yapf"]
 [[package]]
 name = "msgraph-sdk"
-version = "1.17.0"
+version = "1.18.0"
 description = "The Microsoft Graph Python SDK"
 optional = false
 python-versions = ">=3.9"
 files = [
-    {file = "msgraph_sdk-1.17.0-py3-none-any.whl", hash = "sha256:5582a258ded19a486ab407a67b5f65d666758a63864da77bd20c2581d1c00fba"},
-    {file = "msgraph_sdk-1.17.0.tar.gz", hash = "sha256:577e41942b0f794b8cf2f54db030bc039a750a81b515dcd0ba1d66fd961fa7bf"},
+    {file = "msgraph_sdk-1.18.0-py3-none-any.whl", hash = "sha256:f09b015bb9d7690bc6f30c9a28f9a414107aaf06be4952c27b3653dcdf33f2a3"},
+    {file = "msgraph_sdk-1.18.0.tar.gz", hash = "sha256:ef49166ada7b459b5a843ceb3d253c1ab99d8987ebf3112147eb6cbcaa101793"},
 ]
 [package.dependencies]
@@ -2918,6 +2896,32 @@ files = [
     {file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
 ]
+[[package]]
+name = "narwhals"
+version = "1.25.0"
+description = "Extremely lightweight compatibility layer between dataframe libraries"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "narwhals-1.25.0-py3-none-any.whl", hash = "sha256:73c595179c95f1ebe87f2f8487b090223cde23409710a0f48a0f987c1f4ff775"},
+    {file = "narwhals-1.25.0.tar.gz", hash = "sha256:bce0f713831106dde3c7f66b610a104c9d0b7fba7739e8056f29f0aa831891ea"},
+]
+[package.extras]
+core = ["duckdb", "pandas", "polars", "pyarrow", "pyarrow-stubs"]
+cudf = ["cudf (>=24.10.0)"]
+dask = ["dask[dataframe] (>=2024.8)"]
+dev = ["covdefaults", "hypothesis", "pre-commit", "pytest", "pytest-cov", "pytest-env", "pytest-randomly", "typing-extensions"]
+docs = ["black", "duckdb", "jinja2", "markdown-exec[ansi]", "mkdocs", "mkdocs-autorefs", "mkdocs-material", "mkdocstrings[python]", "pandas", "polars (>=1.0.0)", "pyarrow"]
+duckdb = ["duckdb (>=1.0)"]
+extra = ["scikit-learn"]
+ibis = ["ibis-framework (>=6.0.0)", "packaging", "pyarrow-hotfix", "rich"]
+modin = ["modin"]
+pandas = ["pandas (>=0.25.3)"]
+polars = ["polars (>=0.20.3)"]
+pyarrow = ["pyarrow (>=11.0.0)"]
+pyspark = ["pyspark (>=3.5.0)"]
 [[package]]
 name = "nest-asyncio"
 version = "1.6.0"
@@ -3232,113 +3236,17 @@ files = [
 [[package]]
 name = "pbr"
-version = "6.1.0"
+version = "6.1.1"
 description = "Python Build Reasonableness"
 optional = false
 python-versions = ">=2.6"
 files = [
-    {file = "pbr-6.1.0-py2.py3-none-any.whl", hash = "sha256:a776ae228892d8013649c0aeccbb3d5f99ee15e005a4cbb7e61d55a067b28a2a"},
-    {file = "pbr-6.1.0.tar.gz", hash = "sha256:788183e382e3d1d7707db08978239965e8b9e4e5ed42669bf4758186734d5f24"},
-]
-[[package]]
-name = "pendulum"
-version = "3.0.0"
-description = "Python datetimes made easy"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "pendulum-3.0.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2cf9e53ef11668e07f73190c805dbdf07a1939c3298b78d5a9203a86775d1bfd"},
-    {file = "pendulum-3.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:fb551b9b5e6059377889d2d878d940fd0bbb80ae4810543db18e6f77b02c5ef6"},
-    {file = "pendulum-3.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c58227ac260d5b01fc1025176d7b31858c9f62595737f350d22124a9a3ad82d"},
-    {file = "pendulum-3.0.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:60fb6f415fea93a11c52578eaa10594568a6716602be8430b167eb0d730f3332"},
-    {file = "pendulum-3.0.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b69f6b4dbcb86f2c2fe696ba991e67347bcf87fe601362a1aba6431454b46bde"},
-    {file = "pendulum-3.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:138afa9c373ee450ede206db5a5e9004fd3011b3c6bbe1e57015395cd076a09f"},
-    {file = "pendulum-3.0.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:83d9031f39c6da9677164241fd0d37fbfc9dc8ade7043b5d6d62f56e81af8ad2"},
-    {file = "pendulum-3.0.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0c2308af4033fa534f089595bcd40a95a39988ce4059ccd3dc6acb9ef14ca44a"},
-    {file = "pendulum-3.0.0-cp310-none-win_amd64.whl", hash = "sha256:9a59637cdb8462bdf2dbcb9d389518c0263799189d773ad5c11db6b13064fa79"},
-    {file = "pendulum-3.0.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:3725245c0352c95d6ca297193192020d1b0c0f83d5ee6bb09964edc2b5a2d508"},
-    {file = "pendulum-3.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6c035f03a3e565ed132927e2c1b691de0dbf4eb53b02a5a3c5a97e1a64e17bec"},
-    {file = "pendulum-3.0.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:597e66e63cbd68dd6d58ac46cb7a92363d2088d37ccde2dae4332ef23e95cd00"},
-    {file = "pendulum-3.0.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99a0f8172e19f3f0c0e4ace0ad1595134d5243cf75985dc2233e8f9e8de263ca"},
-    {file = "pendulum-3.0.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:77d8839e20f54706aed425bec82a83b4aec74db07f26acd039905d1237a5e1d4"},
-    {file = "pendulum-3.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:afde30e8146292b059020fbc8b6f8fd4a60ae7c5e6f0afef937bbb24880bdf01"},
-    {file = "pendulum-3.0.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:660434a6fcf6303c4efd36713ca9212c753140107ee169a3fc6c49c4711c2a05"},
-    {file = "pendulum-3.0.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:dee9e5a48c6999dc1106eb7eea3e3a50e98a50651b72c08a87ee2154e544b33e"},
-    {file = "pendulum-3.0.0-cp311-none-win_amd64.whl", hash = "sha256:d4cdecde90aec2d67cebe4042fd2a87a4441cc02152ed7ed8fb3ebb110b94ec4"},
-    {file = "pendulum-3.0.0-cp311-none-win_arm64.whl", hash = "sha256:773c3bc4ddda2dda9f1b9d51fe06762f9200f3293d75c4660c19b2614b991d83"},
-    {file = "pendulum-3.0.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:409e64e41418c49f973d43a28afe5df1df4f1dd87c41c7c90f1a63f61ae0f1f7"},
-    {file = "pendulum-3.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a38ad2121c5ec7c4c190c7334e789c3b4624798859156b138fcc4d92295835dc"},
-    {file = "pendulum-3.0.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fde4d0b2024b9785f66b7f30ed59281bd60d63d9213cda0eb0910ead777f6d37"},
-    {file = "pendulum-3.0.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4b2c5675769fb6d4c11238132962939b960fcb365436b6d623c5864287faa319"},
-    {file = "pendulum-3.0.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8af95e03e066826f0f4c65811cbee1b3123d4a45a1c3a2b4fc23c4b0dff893b5"},
-    {file = "pendulum-3.0.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2165a8f33cb15e06c67070b8afc87a62b85c5a273e3aaa6bc9d15c93a4920d6f"},
-    {file = "pendulum-3.0.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:ad5e65b874b5e56bd942546ea7ba9dd1d6a25121db1c517700f1c9de91b28518"},
-    {file = "pendulum-3.0.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:17fe4b2c844bbf5f0ece69cfd959fa02957c61317b2161763950d88fed8e13b9"},
-    {file = "pendulum-3.0.0-cp312-none-win_amd64.whl", hash = "sha256:78f8f4e7efe5066aca24a7a57511b9c2119f5c2b5eb81c46ff9222ce11e0a7a5"},
-    {file = "pendulum-3.0.0-cp312-none-win_arm64.whl", hash = "sha256:28f49d8d1e32aae9c284a90b6bb3873eee15ec6e1d9042edd611b22a94ac462f"},
-    {file = "pendulum-3.0.0-cp37-cp37m-macosx_10_12_x86_64.whl", hash = "sha256:d4e2512f4e1a4670284a153b214db9719eb5d14ac55ada5b76cbdb8c5c00399d"},
-    {file = "pendulum-3.0.0-cp37-cp37m-macosx_11_0_arm64.whl", hash = "sha256:3d897eb50883cc58d9b92f6405245f84b9286cd2de6e8694cb9ea5cb15195a32"},
-    {file = "pendulum-3.0.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e169cc2ca419517f397811bbe4589cf3cd13fca6dc38bb352ba15ea90739ebb"},
-    {file = "pendulum-3.0.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f17c3084a4524ebefd9255513692f7e7360e23c8853dc6f10c64cc184e1217ab"},
-    {file = "pendulum-3.0.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:826d6e258052715f64d05ae0fc9040c0151e6a87aae7c109ba9a0ed930ce4000"},
-    {file = "pendulum-3.0.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2aae97087872ef152a0c40e06100b3665d8cb86b59bc8471ca7c26132fccd0f"},
-    {file = "pendulum-3.0.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:ac65eeec2250d03106b5e81284ad47f0d417ca299a45e89ccc69e36130ca8bc7"},
-    {file = "pendulum-3.0.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:a5346d08f3f4a6e9e672187faa179c7bf9227897081d7121866358af369f44f9"},
-    {file = "pendulum-3.0.0-cp37-none-win_amd64.whl", hash = "sha256:235d64e87946d8f95c796af34818c76e0f88c94d624c268693c85b723b698aa9"},
-    {file = "pendulum-3.0.0-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:6a881d9c2a7f85bc9adafcfe671df5207f51f5715ae61f5d838b77a1356e8b7b"},
-    {file = "pendulum-3.0.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d7762d2076b9b1cb718a6631ad6c16c23fc3fac76cbb8c454e81e80be98daa34"},
-    {file = "pendulum-3.0.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e8e36a8130819d97a479a0e7bf379b66b3b1b520e5dc46bd7eb14634338df8c"},
-    {file = "pendulum-3.0.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7dc843253ac373358ffc0711960e2dd5b94ab67530a3e204d85c6e8cb2c5fa10"},
-    {file = "pendulum-3.0.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0a78ad3635d609ceb1e97d6aedef6a6a6f93433ddb2312888e668365908c7120"},
-    {file = "pendulum-3.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b30a137e9e0d1f751e60e67d11fc67781a572db76b2296f7b4d44554761049d6"},
-    {file = "pendulum-3.0.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:c95984037987f4a457bb760455d9ca80467be792236b69d0084f228a8ada0162"},
-    {file = "pendulum-3.0.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d29c6e578fe0f893766c0d286adbf0b3c726a4e2341eba0917ec79c50274ec16"},
-    {file = "pendulum-3.0.0-cp38-none-win_amd64.whl", hash = "sha256:deaba8e16dbfcb3d7a6b5fabdd5a38b7c982809567479987b9c89572df62e027"},
-    {file = "pendulum-3.0.0-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:b11aceea5b20b4b5382962b321dbc354af0defe35daa84e9ff3aae3c230df694"},
-    {file = "pendulum-3.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a90d4d504e82ad236afac9adca4d6a19e4865f717034fc69bafb112c320dcc8f"},
-    {file = "pendulum-3.0.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:825799c6b66e3734227756fa746cc34b3549c48693325b8b9f823cb7d21b19ac"},
-    {file = "pendulum-3.0.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad769e98dc07972e24afe0cff8d365cb6f0ebc7e65620aa1976fcfbcadc4c6f3"},
-    {file = "pendulum-3.0.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a6fc26907eb5fb8cc6188cc620bc2075a6c534d981a2f045daa5f79dfe50d512"},
-    {file = "pendulum-3.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0c717eab1b6d898c00a3e0fa7781d615b5c5136bbd40abe82be100bb06df7a56"},
-    {file = "pendulum-3.0.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:3ddd1d66d1a714ce43acfe337190be055cdc221d911fc886d5a3aae28e14b76d"},
-    {file = "pendulum-3.0.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:822172853d7a9cf6da95d7b66a16c7160cb99ae6df55d44373888181d7a06edc"},
-    {file = "pendulum-3.0.0-cp39-none-win_amd64.whl", hash = "sha256:840de1b49cf1ec54c225a2a6f4f0784d50bd47f68e41dc005b7f67c7d5b5f3ae"},
-    {file = "pendulum-3.0.0-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3b1f74d1e6ffe5d01d6023870e2ce5c2191486928823196f8575dcc786e107b1"},
-    {file = "pendulum-3.0.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:729e9f93756a2cdfa77d0fc82068346e9731c7e884097160603872686e570f07"},
-    {file = "pendulum-3.0.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e586acc0b450cd21cbf0db6bae386237011b75260a3adceddc4be15334689a9a"},
-    {file = "pendulum-3.0.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22e7944ffc1f0099a79ff468ee9630c73f8c7835cd76fdb57ef7320e6a409df4"},
-    {file = "pendulum-3.0.0-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:fa30af36bd8e50686846bdace37cf6707bdd044e5cb6e1109acbad3277232e04"},
-    {file = "pendulum-3.0.0-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:440215347b11914ae707981b9a57ab9c7b6983ab0babde07063c6ee75c0dc6e7"},
-    {file = "pendulum-3.0.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:314c4038dc5e6a52991570f50edb2f08c339debdf8cea68ac355b32c4174e820"},
-    {file = "pendulum-3.0.0-pp37-pypy37_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5acb1d386337415f74f4d1955c4ce8d0201978c162927d07df8eb0692b2d8533"},
-    {file = "pendulum-3.0.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a789e12fbdefaffb7b8ac67f9d8f22ba17a3050ceaaa635cd1cc4645773a4b1e"},
-    {file = "pendulum-3.0.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:860aa9b8a888e5913bd70d819306749e5eb488e6b99cd6c47beb701b22bdecf5"},
-    {file = "pendulum-3.0.0-pp37-pypy37_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:5ebc65ea033ef0281368217fbf59f5cb05b338ac4dd23d60959c7afcd79a60a0"},
-    {file = "pendulum-3.0.0-pp37-pypy37_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:d9fef18ab0386ef6a9ac7bad7e43ded42c83ff7ad412f950633854f90d59afa8"},
-    {file = "pendulum-3.0.0-pp38-pypy38_pp73-macosx_10_12_x86_64.whl", hash = "sha256:1c134ba2f0571d0b68b83f6972e2307a55a5a849e7dac8505c715c531d2a8795"},
-    {file = "pendulum-3.0.0-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:385680812e7e18af200bb9b4a49777418c32422d05ad5a8eb85144c4a285907b"},
-    {file = "pendulum-3.0.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9eec91cd87c59fb32ec49eb722f375bd58f4be790cae11c1b70fac3ee4f00da0"},
-    {file = "pendulum-3.0.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4386bffeca23c4b69ad50a36211f75b35a4deb6210bdca112ac3043deb7e494a"},
-    {file = "pendulum-3.0.0-pp38-pypy38_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:dfbcf1661d7146d7698da4b86e7f04814221081e9fe154183e34f4c5f5fa3bf8"},
-    {file = "pendulum-3.0.0-pp38-pypy38_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:04a1094a5aa1daa34a6b57c865b25f691848c61583fb22722a4df5699f6bf74c"},
-    {file = "pendulum-3.0.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:5b0ec85b9045bd49dd3a3493a5e7ddfd31c36a2a60da387c419fa04abcaecb23"},
-    {file = "pendulum-3.0.0-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:0a15b90129765b705eb2039062a6daf4d22c4e28d1a54fa260892e8c3ae6e157"},
-    {file = "pendulum-3.0.0-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:bb8f6d7acd67a67d6fedd361ad2958ff0539445ef51cbe8cd288db4306503cd0"},
-    {file = "pendulum-3.0.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fd69b15374bef7e4b4440612915315cc42e8575fcda2a3d7586a0d88192d0c88"},
-    {file = "pendulum-3.0.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dc00f8110db6898360c53c812872662e077eaf9c75515d53ecc65d886eec209a"},
-    {file = "pendulum-3.0.0-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:83a44e8b40655d0ba565a5c3d1365d27e3e6778ae2a05b69124db9e471255c4a"},
-    {file = "pendulum-3.0.0-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:1a3604e9fbc06b788041b2a8b78f75c243021e0f512447806a6d37ee5214905d"},
-    {file = "pendulum-3.0.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:92c307ae7accebd06cbae4729f0ba9fa724df5f7d91a0964b1b972a22baa482b"},
-    {file = "pendulum-3.0.0.tar.gz", hash = "sha256:5d034998dea404ec31fae27af6b22cff1708f830a1ed7353be4d1019bb9f584e"},
+    {file = "pbr-6.1.1-py2.py3-none-any.whl", hash = "sha256:38d4daea5d9fa63b3f626131b9d34947fd0c8be9b05a29276870580050a25a76"},
+    {file = "pbr-6.1.1.tar.gz", hash = "sha256:93ea72ce6989eb2eed99d0f75721474f69ad88128afdef5ac377eb797c4bf76b"},
 ]
 [package.dependencies]
-python-dateutil = ">=2.6"
-tzdata = ">=2020.1"
-[package.extras]
-test = ["time-machine (>=2.6.0)"]
+setuptools = "*"
 [[package]]
 name = "platformdirs"
@@ -3358,18 +3266,21 @@ type = ["mypy (>=1.11.2)"]
 [[package]]
 name = "plotly"
-version = "5.24.1"
+version = "6.0.0"
 description = "An open-source, interactive data visualization library for Python"
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "plotly-5.24.1-py3-none-any.whl", hash = "sha256:f67073a1e637eb0dc3e46324d9d51e2fe76e9727c892dde64ddf1e1b51f29089"},
-    {file = "plotly-5.24.1.tar.gz", hash = "sha256:dbc8ac8339d248a4bcc36e08a5659bacfe1b079390b8953533f4eb22169b4bae"},
+    {file = "plotly-6.0.0-py3-none-any.whl", hash = "sha256:f708871c3a9349a68791ff943a5781b1ec04de7769ea69068adcd9202e57653a"},
+    {file = "plotly-6.0.0.tar.gz", hash = "sha256:c4aad38b8c3d65e4a5e7dd308b084143b9025c2cc9d5317fc1f1d30958db87d3"},
 ]
 [package.dependencies]
+narwhals = ">=1.15.1"
 packaging = "*"
-tenacity = ">=6.2.0"
+[package.extras]
+express = ["numpy"]
 [[package]]
 name = "pluggy"
@@ -3509,13 +3420,13 @@ files = [
 [[package]]
 name = "proto-plus"
-version = "1.25.0"
-description = "Beautiful, Pythonic protocol buffers."
+version = "1.26.0"
+description = "Beautiful, Pythonic protocol buffers"
 optional = false
 python-versions = ">=3.7"
 files = [
-    {file = "proto_plus-1.25.0-py3-none-any.whl", hash = "sha256:c91fc4a65074ade8e458e95ef8bac34d4008daa7cce4a12d6707066fca648961"},
-    {file = "proto_plus-1.25.0.tar.gz", hash = "sha256:fbb17f57f7bd05a68b7707e745e26528b0b3c34e378db91eef93912c54982d91"},
+    {file = "proto_plus-1.26.0-py3-none-any.whl", hash = "sha256:bf2dfaa3da281fc3187d12d224c707cb57214fb2c22ba854eb0c105a3fb2d4d7"},
+    {file = "proto_plus-1.26.0.tar.gz", hash = "sha256:6e93d5f5ca267b54300880fff156b6a3386b3fa3f43b1da62e680fc0c586ef22"},
 ]
 [package.dependencies]
@@ -3575,13 +3486,13 @@ test = ["enum34", "ipaddress", "mock", "pywin32", "wmi"]
 [[package]]
 name = "py-ocsf-models"
-version = "0.2.0"
+version = "0.3.0"
 description = "This is a Python implementation of the OCSF models. The models are used to represent the data of the OCSF Schema defined in https://schema.ocsf.io/."
 optional = false
 python-versions = "<3.13,>=3.9"
 files = [
-    {file = "py_ocsf_models-0.2.0-py3-none-any.whl", hash = "sha256:ac75fd21077694b343ebaad3479194db113c274879b114277560ff287d5cd7b5"},
-    {file = "py_ocsf_models-0.2.0.tar.gz", hash = "sha256:3e12648d05329e6776a0e6b1ffea87a3eb60aa7d8cb2c4afd69e5724f443ce03"},
+    {file = "py_ocsf_models-0.3.0-py3-none-any.whl", hash = "sha256:3d31e379be5e4271f7faf62dee9c36798559a1f7f98dff142c0e4cfdb35e291c"},
+    {file = "py_ocsf_models-0.3.0.tar.gz", hash = "sha256:ad46b7d9761b74010f06a894df2d9541989252b7ff738cd5c7edbf4283df2279"},
 ]
 [package.dependencies]
@@ -3756,13 +3667,13 @@ tests = ["coverage[toml] (==5.0.4)", "pytest (>=6.0.0,<7.0.0)"]
 [[package]]
 name = "pylint"
-version = "3.3.3"
+version = "3.3.4"
 description = "python code static checker"
 optional = false
 python-versions = ">=3.9.0"
 files = [
-    {file = "pylint-3.3.3-py3-none-any.whl", hash = "sha256:26e271a2bc8bce0fc23833805a9076dd9b4d5194e2a02164942cb3cdc37b4183"},
-    {file = "pylint-3.3.3.tar.gz", hash = "sha256:07c607523b17e6d16e2ae0d7ef59602e332caa762af64203c24b41c27139f36a"},
+    {file = "pylint-3.3.4-py3-none-any.whl", hash = "sha256:289e6a1eb27b453b08436478391a48cd53bb0efb824873f949e709350f3de018"},
+    {file = "pylint-3.3.4.tar.gz", hash = "sha256:74ae7a38b177e69a9b525d0794bd8183820bfa7eb68cc1bee6e8ed22a42be4ce"},
 ]
 [package.dependencies]
@@ -3773,7 +3684,7 @@ dill = [
     {version = ">=0.3.7", markers = "python_version >= \"3.12\""},
     {version = ">=0.3.6", markers = "python_version >= \"3.11\" and python_version < \"3.12\""},
 ]
-isort = ">=4.2.5,<5.13.0 || >5.13.0,<6"
+isort = ">=4.2.5,<5.13.0 || >5.13.0,<7"
 mccabe = ">=0.6,<0.8"
 platformdirs = ">=2.2.0"
 tomli = {version = ">=1.1.0", markers = "python_version < \"3.11\""}
@@ -3786,13 +3697,13 @@ testutils = ["gitpython (>3)"]
[[package]]
name = "pymdown-extensions"
version = "10.14"
version = "10.14.3"
description = "Extension pack for Python Markdown."
optional = false
python-versions = ">=3.8"
files = [
{file = "pymdown_extensions-10.14-py3-none-any.whl", hash = "sha256:202481f716cc8250e4be8fce997781ebf7917701b59652458ee47f2401f818b5"},
{file = "pymdown_extensions-10.14.tar.gz", hash = "sha256:741bd7c4ff961ba40b7528d32284c53bc436b8b1645e8e37c3e57770b8700a34"},
{file = "pymdown_extensions-10.14.3-py3-none-any.whl", hash = "sha256:05e0bee73d64b9c71a4ae17c72abc2f700e8bc8403755a00580b49a4e9f189e9"},
{file = "pymdown_extensions-10.14.3.tar.gz", hash = "sha256:41e576ce3f5d650be59e900e4ceff231e0aed2a88cf30acaee41e02f063a061b"},
]
[package.dependencies]
@@ -3925,13 +3836,13 @@ six = ">=1.5"
[[package]]
name = "pytz"
version = "2024.2"
version = "2025.1"
description = "World timezone definitions, modern and historical"
optional = false
python-versions = "*"
files = [
{file = "pytz-2024.2-py2.py3-none-any.whl", hash = "sha256:31c7c1817eb7fae7ca4b8c7ee50c72f93aa2dd863de768e1ef4245d426aa0725"},
{file = "pytz-2024.2.tar.gz", hash = "sha256:2aa355083c50a0f93fa581709deac0c9ad65cca8a9e9beac660adcbd493c798a"},
{file = "pytz-2025.1-py2.py3-none-any.whl", hash = "sha256:89dd22dca55b46eac6eda23b2d72721bf1bdfef212645d81513ef5d03038de57"},
{file = "pytz-2025.1.tar.gz", hash = "sha256:c2db42be2a2518b28e65f9207c4d05e6ff547d1efa4086469ef855e4ab70178e"},
]
[[package]]
@@ -4039,18 +3950,19 @@ pyyaml = "*"
[[package]]
name = "referencing"
version = "0.35.1"
version = "0.36.2"
description = "JSON Referencing + Python"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
files = [
{file = "referencing-0.35.1-py3-none-any.whl", hash = "sha256:eda6d3234d62814d1c64e305c1331c9a3a6132da475ab6382eaa997b21ee75de"},
{file = "referencing-0.35.1.tar.gz", hash = "sha256:25b42124a6c8b632a425174f24087783efb348a6f1e0008e63cd4466fedf703c"},
{file = "referencing-0.36.2-py3-none-any.whl", hash = "sha256:e8699adbbf8b5c7de96d8ffa0eb5c158b3beafce084968e2ea8bb08c6794dcd0"},
{file = "referencing-0.36.2.tar.gz", hash = "sha256:df2e89862cd09deabbdba16944cc3f10feb6b3e6f18e902f7cc25609a34775aa"},
]
[package.dependencies]
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
typing-extensions = {version = ">=4.4.0", markers = "python_version < \"3.13\""}
[[package]]
name = "regex"
@@ -4431,6 +4343,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f66efbc1caa63c088dead1c4170d148eabc9b80d95fb75b6c92ac0aad2437d76"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:22353049ba4181685023b25b5b51a574bce33e7f51c759371a7422dcae5402a6"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:932205970b9f9991b34f55136be327501903f7c66830e9760a8ffb15b07f05cd"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:a52d48f4e7bf9005e8f0a89209bf9a73f7190ddf0489eee5eb51377385f59f2a"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-win32.whl", hash = "sha256:3eac5a91891ceb88138c113f9db04f3cebdae277f5d44eaa3651a4f573e6a5da"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-win_amd64.whl", hash = "sha256:ab007f2f5a87bd08ab1499bdf96f3d5c6ad4dcfa364884cb4549aa0154b13a28"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-macosx_13_0_arm64.whl", hash = "sha256:4a6679521a58256a90b0d89e03992c15144c5f3858f40d7c18886023d7943db6"},
@@ -4439,6 +4352,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:811ea1594b8a0fb466172c384267a4e5e367298af6b228931f273b111f17ef52"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:cf12567a7b565cbf65d438dec6cfbe2917d3c1bdddfce84a9930b7d35ea59642"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:7dd5adc8b930b12c8fc5b99e2d535a09889941aa0d0bd06f4749e9a9397c71d2"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1492a6051dab8d912fc2adeef0e8c72216b24d57bd896ea607cb90bb0c4981d3"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-win32.whl", hash = "sha256:bd0a08f0bab19093c54e18a14a10b4322e1eacc5217056f3c063bd2f59853ce4"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-win_amd64.whl", hash = "sha256:a274fb2cb086c7a3dea4322ec27f4cb5cc4b6298adb583ab0e211a4682f241eb"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:20b0f8dc160ba83b6dcc0e256846e1a02d044e13f7ea74a3d1d56ede4e48c632"},
@@ -4447,6 +4361,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:749c16fcc4a2b09f28843cda5a193e0283e47454b63ec4b81eaa2242f50e4ccd"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:bf165fef1f223beae7333275156ab2022cffe255dcc51c27f066b4370da81e31"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:32621c177bbf782ca5a18ba4d7af0f1082a3f6e517ac2a18b3974d4edf349680"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:b82a7c94a498853aa0b272fd5bc67f29008da798d4f93a2f9f289feb8426a58d"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-win32.whl", hash = "sha256:e8c4ebfcfd57177b572e2040777b8abc537cdef58a2120e830124946aa9b42c5"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-win_amd64.whl", hash = "sha256:0467c5965282c62203273b838ae77c0d29d7638c8a4e3a1c8bdd3602c10904e4"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:4c8c5d82f50bb53986a5e02d1b3092b03622c02c2eb78e29bec33fd9593bae1a"},
@@ -4455,6 +4370,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:96777d473c05ee3e5e3c3e999f5d23c6f4ec5b0c38c098b3a5229085f74236c6"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_1_i686.whl", hash = "sha256:3bc2a80e6420ca8b7d3590791e2dfc709c88ab9152c00eeb511c9875ce5778bf"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:e188d2699864c11c36cdfdada94d781fd5d6b0071cd9c427bceb08ad3d7c70e1"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4f6f3eac23941b32afccc23081e1f50612bdbe4e982012ef4f5797986828cd01"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-win32.whl", hash = "sha256:6442cb36270b3afb1b4951f060eccca1ce49f3d087ca1ca4563a6eb479cb3de6"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-win_amd64.whl", hash = "sha256:e5b8daf27af0b90da7bb903a876477a9e6d7270be6146906b276605997c7e9a3"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:fc4b630cd3fa2cf7fce38afa91d7cfe844a9f75d7f0f36393fa98815e911d987"},
@@ -4463,6 +4379,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e2f1c3765db32be59d18ab3953f43ab62a761327aafc1594a2a1fbe038b8b8a7"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:d85252669dc32f98ebcd5d36768f5d4faeaeaa2d655ac0473be490ecdae3c285"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:e143ada795c341b56de9418c58d028989093ee611aa27ffb9b7f609c00d813ed"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:2c59aa6170b990d8d2719323e628aaf36f3bfbc1c26279c0eeeb24d05d2d11c7"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-win32.whl", hash = "sha256:beffaed67936fbbeffd10966a4eb53c402fafd3d6833770516bf7314bc6ffa12"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-win_amd64.whl", hash = "sha256:040ae85536960525ea62868b642bdb0c2cc6021c9f9d507810c0c604e66f5a7b"},
{file = "ruamel.yaml.clib-0.2.12.tar.gz", hash = "sha256:6c8fbb13ec503f99a91901ab46e0b07ae7941cd527393187039aec586fdfd36f"},
@@ -4701,21 +4618,6 @@ files = [
[package.extras]
widechars = ["wcwidth"]
[[package]]
name = "tenacity"
version = "9.0.0"
description = "Retry code until it succeeds"
optional = false
python-versions = ">=3.8"
files = [
{file = "tenacity-9.0.0-py3-none-any.whl", hash = "sha256:93de0c98785b27fcf659856aa9f54bfbd399e29969b0621bc7f762bd441b4539"},
{file = "tenacity-9.0.0.tar.gz", hash = "sha256:807f37ca97d62aa361264d497b0e31e92b8027044942bfa756160d908320d73b"},
]
[package.extras]
doc = ["reno", "sphinx"]
test = ["pytest", "tornado (>=4.5)", "typeguard"]
[[package]]
name = "tldextract"
version = "5.1.3"
@@ -4819,13 +4721,13 @@ files = [
[[package]]
name = "tzdata"
version = "2024.2"
version = "2025.1"
description = "Provider of IANA time zone data"
optional = false
python-versions = ">=2"
files = [
{file = "tzdata-2024.2-py2.py3-none-any.whl", hash = "sha256:a48093786cdcde33cad18c2555e8532f34422074448fbc874186f0abd79565cd"},
{file = "tzdata-2024.2.tar.gz", hash = "sha256:7d85cc416e9382e69095b7bdf4afd9e3880418a2413feec7069d533d6b4e31cc"},
{file = "tzdata-2025.1-py2.py3-none-any.whl", hash = "sha256:7e127113816800496f027041c570f50bcd464a020098a3b6b199517772303639"},
{file = "tzdata-2025.1.tar.gz", hash = "sha256:24894909e88cdb28bd1636c6887801df64cb485bd593f2fd83ef29075a81d694"},
]
[[package]]
@@ -5068,13 +4970,13 @@ files = [
[[package]]
name = "xlsxwriter"
version = "3.2.0"
version = "3.2.2"
description = "A Python module for creating Excel XLSX files."
optional = false
python-versions = ">=3.6"
files = [
{file = "XlsxWriter-3.2.0-py3-none-any.whl", hash = "sha256:ecfd5405b3e0e228219bcaf24c2ca0915e012ca9464a14048021d21a995d490e"},
{file = "XlsxWriter-3.2.0.tar.gz", hash = "sha256:9977d0c661a72866a61f9f7a809e25ebbb0fb7036baa3b9fe74afcfca6b3cb8c"},
{file = "XlsxWriter-3.2.2-py3-none-any.whl", hash = "sha256:272ce861e7fa5e82a4a6ebc24511f2cb952fde3461f6c6e1a1e81d3272db1471"},
{file = "xlsxwriter-3.2.2.tar.gz", hash = "sha256:befc7f92578a85fed261639fb6cde1fd51b79c5e854040847dde59d4317077dc"},
]
[[package]]
@@ -5206,4 +5108,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.0"
python-versions = ">=3.9,<3.13"
content-hash = "53401f345a72f27b8f1be3dc09ed822d1f65f1fbbf1fd5ca1cb99445d5cfae94"
content-hash = "b479cce4f4992dab6f81b8118082daad2fd301cf01cbced02c90bc132dd27285"

View File

@@ -51,6 +51,7 @@ from prowler.lib.outputs.compliance.cis.cis_aws import AWSCIS
from prowler.lib.outputs.compliance.cis.cis_azure import AzureCIS
from prowler.lib.outputs.compliance.cis.cis_gcp import GCPCIS
from prowler.lib.outputs.compliance.cis.cis_kubernetes import KubernetesCIS
from prowler.lib.outputs.compliance.cis.cis_microsoft365 import Microsoft365CIS
from prowler.lib.outputs.compliance.compliance import display_compliance_table
from prowler.lib.outputs.compliance.ens.ens_aws import AWSENS
from prowler.lib.outputs.compliance.ens.ens_azure import AzureENS
@@ -78,6 +79,7 @@ from prowler.providers.common.provider import Provider
from prowler.providers.common.quick_inventory import run_provider_quick_inventory
from prowler.providers.gcp.models import GCPOutputOptions
from prowler.providers.kubernetes.models import KubernetesOutputOptions
from prowler.providers.microsoft365.models import Microsoft365OutputOptions
def prowler():
@@ -259,6 +261,10 @@ def prowler():
output_options = KubernetesOutputOptions(
args, bulk_checks_metadata, global_provider.identity
)
elif provider == "microsoft365":
output_options = Microsoft365OutputOptions(
args, bulk_checks_metadata, global_provider.identity
)
# Run the quick inventory for the provider if available
if hasattr(args, "quick_inventory") and args.quick_inventory:
@@ -307,7 +313,6 @@ def prowler():
if "SLACK_API_TOKEN" in environ and (
"SLACK_CHANNEL_NAME" in environ or "SLACK_CHANNEL_ID" in environ
):
token = environ["SLACK_API_TOKEN"]
channel = (
environ["SLACK_CHANNEL_NAME"]
@@ -632,6 +637,36 @@ def prowler():
generated_outputs["compliance"].append(generic_compliance)
generic_compliance.batch_write_data_to_file()
elif provider == "microsoft365":
for compliance_name in input_compliance_frameworks:
if compliance_name.startswith("cis_"):
# Generate CIS Finding Object
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
cis = Microsoft365CIS(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(cis)
cis.batch_write_data_to_file()
else:
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
generic_compliance = GenericCompliance(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(generic_compliance)
generic_compliance.batch_write_data_to_file()
# AWS Security Hub Integration
if provider == "aws":
# Send output to S3 if needed (-B / -D) for all the output formats

View File

@@ -28,7 +28,9 @@
"Service": "ebs"
}
],
"Checks": []
"Checks": [
"ec2_ebs_volume_snapshots_exists"
]
},
{
"Id": "1.0.3",
@@ -42,7 +44,8 @@
}
],
"Checks": [
"ec2_ebs_default_encryption"
"ec2_ebs_default_encryption",
"ec2_ebs_volume_encryption"
]
},
{
@@ -87,7 +90,9 @@
}
],
"Checks": [
"iam_user_mfa_enabled_console_access"
"iam_user_mfa_enabled_console_access",
"iam_user_hardware_mfa_enabled",
"iam_root_mfa_enabled"
]
},
{
@@ -102,7 +107,9 @@
}
],
"Checks": [
"iam_user_mfa_enabled_console_access"
"iam_user_mfa_enabled_console_access",
"iam_user_hardware_mfa_enabled",
"iam_root_mfa_enabled"
]
},
{
@@ -117,7 +124,9 @@
}
],
"Checks": [
"iam_root_mfa_enabled"
"iam_root_mfa_enabled",
"iam_root_hardware_mfa_enabled",
"iam_user_mfa_enabled_console_access"
]
},
{
@@ -162,7 +171,10 @@
}
],
"Checks": [
"rds_instance_no_public_access"
"rds_instance_no_public_access",
"s3_bucket_public_access",
"s3_bucket_public_list_acl",
"s3_account_level_public_access_blocks"
]
},
{
@@ -192,7 +204,8 @@
}
],
"Checks": [
"rds_instance_storage_encrypted"
"rds_instance_storage_encrypted",
"rds_instance_transport_encrypted"
]
},
{

View File

@@ -584,7 +584,8 @@
"Id": "2.3.1",
"Description": "Ensure that encryption is enabled for RDS Instances",
"Checks": [
"rds_instance_storage_encrypted"
"rds_instance_storage_encrypted",
"rds_instance_transport_encrypted"
],
"Attributes": [
{

View File

@@ -584,7 +584,8 @@
"Id": "2.3.1",
"Description": "Ensure that encryption is enabled for RDS Instances",
"Checks": [
"rds_instance_storage_encrypted"
"rds_instance_storage_encrypted",
"rds_instance_transport_encrypted"
],
"Attributes": [
{

View File

@@ -303,7 +303,9 @@
{
"Id": "1.22",
"Description": "Ensure access to AWSCloudShellFullAccess is restricted",
"Checks": [],
"Checks": [
"iam_policy_cloudshell_admin_not_attached"
],
"Attributes": [
{
"Section": "1. Identity and Access Management",
@@ -492,7 +494,8 @@
"Id": "2.1.2",
"Description": "Ensure MFA Delete is enabled on S3 buckets",
"Checks": [
"s3_bucket_no_mfa_delete"
"s3_bucket_no_mfa_delete",
"cloudtrail_bucket_requires_mfa_delete"
],
"Attributes": [
{
@@ -581,7 +584,8 @@
"Id": "2.3.1",
"Description": "Ensure that encryption is enabled for RDS Instances",
"Checks": [
"rds_instance_storage_encrypted"
"rds_instance_storage_encrypted",
"rds_instance_transport_encrypted"
],
"Attributes": [
{
@@ -1347,7 +1351,8 @@
"Id": "5.6",
"Description": "Ensure that EC2 Metadata Service only allows IMDSv2",
"Checks": [
"ec2_instance_imdsv2_enabled"
"ec2_instance_imdsv2_enabled",
"ec2_instance_account_imdsv2_enabled"
],
"Attributes": [
{

View File

@@ -0,0 +1,134 @@
{
"Framework": "CIS",
"Version": "4.0",
"Provider": "Microsoft365",
"Description": "The CIS Microsoft 365 Foundations Benchmark provides prescriptive guidance for establishing a secure configuration posture for Microsoft 365 Cloud offerings running on any OS.",
"Requirements": [
{
"Id": "1.1.1",
"Description": "Ensure Administrative accounts are cloud-only",
"Checks": [],
"Attributes": [
{
"Section": "1.Microsoft 365 admin center",
"Profile": "Level 1",
"AssessmentStatus": "Manual",
"Description": "Administrative accounts are special privileged accounts that could have varying levels of access to data, users, and settings. Regular user accounts should never be utilized for administrative tasks and care should be taken, in the case of a hybrid environment, to keep Administrative accounts separated from on-prem accounts. Administrative accounts should not have applications assigned so that they have no access to potentially vulnerable services (e.g., email, Teams, SharePoint, etc.) and only access to perform tasks as needed for administrative purposes. Ensure administrative accounts are not On-premises sync enabled.",
"RationaleStatement": "In a hybrid environment, having separate accounts will help ensure that in the event of a breach in the cloud, that the breach does not affect the on-prem environment and vice versa.",
"ImpactStatement": "Administrative users will have to switch accounts and utilize login/logout functionality when performing administrative tasks, as well as not benefiting from SSO. This will require a migration process from the 'daily driver' account to a dedicated admin account. When migrating permissions to the admin account, both M365 and Azure RBAC roles should be migrated as well. Once the new admin accounts are created, both of these permission sets should be moved from the daily driver account to the new admin account. Failure to migrate Azure RBAC roles can cause an admin to not be able to see their subscriptions/resources while using their admin accounts.",
"RemediationProcedure": "Review all administrative accounts and ensure they are configured as cloud-only. Remove any on-premises synchronization for these accounts. Assign necessary roles and permissions exclusively to the dedicated cloud administrative accounts.",
"AuditProcedure": "Log in to the Microsoft 365 Admin Center and review the list of administrative accounts. Verify that none of them are on-premises sync enabled.",
"AdditionalInformation": "This recommendation is particularly relevant for hybrid environments and is aimed at enhancing the security of administrative accounts by isolating them from on-prem infrastructure.",
"DefaultValue": "By default, administrative accounts may be either cloud-only or hybrid. This setting needs to be verified and adjusted according to the recommendation.",
"References": "CIS Microsoft 365 Foundations Benchmark v4.0, Section 1.1.1"
}
]
},
{
"Id": "1.1.2",
"Description": "Ensure two emergency access accounts have been defined",
"Checks": [],
"Attributes": [
{
"Section": "1.Microsoft 365 admin center",
"Profile": "Level 1",
"AssessmentStatus": "Manual",
"Description": "Emergency access or 'break glass' accounts are limited for emergency scenarios where normal administrative accounts are unavailable. They are not assigned to a specific user and will have a combination of physical and technical controls to prevent them from being accessed outside a true emergency. These emergencies could include technical failures of a cellular provider or Microsoft-related service such as MFA, or the last remaining Global Administrator account becoming inaccessible. Ensure two Emergency Access accounts have been defined.",
"RationaleStatement": "In various situations, an organization may require the use of a break glass account to gain emergency access. Losing access to administrative functions could result in a significant loss of support capability, reduced visibility into the security posture, and potential financial losses.",
"ImpactStatement": "Improper implementation of emergency access accounts could weaken the organization's security posture. To mitigate this, at least one account should be excluded from all conditional access rules, and strong authentication mechanisms (e.g., long, high-entropy passwords or FIDO2 security keys) must be used to secure the accounts.",
"RemediationProcedure": "Create two emergency access accounts and configure them according to Microsoft's recommendations. Ensure that these accounts are not assigned to specific users and are excluded from all conditional access rules. Secure the accounts with strong passwords or passwordless authentication methods, such as FIDO2 security keys. Regularly review and test these accounts to confirm their functionality.",
"AuditProcedure": "Log in to the Microsoft 365 Admin Center and verify the existence of at least two emergency access accounts. Check their configurations to ensure they comply with Microsoft's recommendations, including exclusion from conditional access rules and proper security settings.",
"AdditionalInformation": "Emergency access accounts are critical for maintaining administrative control during unexpected events. Regular audits and strict access controls are recommended to prevent misuse.",
"DefaultValue": "By default, emergency access accounts are not configured. Organizations must create and secure these accounts proactively.",
"References": "CIS Microsoft 365 Foundations Benchmark v4.0, Section 1.1.2; Microsoft documentation on emergency access accounts."
}
]
},
{
"Id": "1.1.3",
"Description": "Ensure that between two and four global admins are designated",
"Checks": [
"admincenter_users_between_two_and_four_global_admins"
],
"Attributes": [
{
"Section": "1.Microsoft 365 admin center",
"Profile": "Level 1",
"AssessmentStatus": "Automated",
"Description": "More than one global administrator should be designated so a single admin can be monitored and to provide redundancy should a single admin leave an organization. Additionally, there should be no more than four global admins set for any tenant. Ideally, global administrators will have no licenses assigned to them.",
"RationaleStatement": "If there is only one global tenant administrator, he or she can perform malicious activity without the possibility of being discovered by another admin. If there are numerous global tenant administrators, the more likely it is that one of their accounts will be successfully breached by an external attacker.",
"ImpactStatement": "If there is only one global administrator in a tenant, an additional global administrator will need to be identified and configured. If there are more than four global administrators, a review of role requirements for current global administrators will be required to identify which of the users require global administrator access.",
"RemediationProcedure": "Review the list of global administrators in the tenant and ensure there are at least two but no more than four accounts assigned this role. Remove excess global administrator accounts and create additional ones if necessary. Avoid assigning licenses to these accounts.",
"AuditProcedure": "Log in to the Microsoft 365 Admin Center and review the list of global administrators. Verify that there are at least two but no more than four global administrators configured.",
"AdditionalInformation": "Global administrators play a critical role in tenant management. Ensuring a proper number of global administrators improves redundancy and security.",
"DefaultValue": "By default, there may be a single global administrator configured for the tenant. Organizations need to manually adjust the count as per best practices.",
"References": "CIS Microsoft 365 Foundations Benchmark v4.0, Section 1.1.3"
}
]
},
{
"Id": "1.1.4",
"Description": "Ensure administrative accounts use licenses with a reduced application footprint",
"Checks": [
"admincenter_users_admins_reduced_license_footprint"
],
"Attributes": [
{
"Section": "1.Microsoft 365 admin center",
"Profile": "Level 1",
"AssessmentStatus": "Automated",
"Description": "Administrative accounts are special privileged accounts with varying levels of access to data, users, and settings. It is recommended that privileged accounts either not be licensed or use Microsoft Entra ID P1 or Microsoft Entra ID P2 licenses to minimize application exposure.",
"RationaleStatement": "Ensuring administrative accounts do not use licenses with applications assigned to them reduces the attack surface of high-privileged identities. This minimizes the likelihood of these accounts being targeted by social engineering attacks or exposed to malicious content via licensed applications. Administrative activities should be restricted to dedicated accounts without access to collaborative tools like mailboxes.",
"ImpactStatement": "Administrative users will need to switch accounts to perform privileged actions, requiring login/logout functionality and potentially losing the convenience of SSO. Alerts sent to Global Administrators or TenantAdmins by default might not be received if these accounts lack application-based licenses. Proper alert routing must be configured to avoid missed notifications.",
"RemediationProcedure": "Review the licenses assigned to administrative accounts. Remove licenses granting access to collaborative applications and assign Microsoft Entra ID P1 or P2 licenses if participation in Microsoft 365 security services is required. Configure alerts to be sent to valid email addresses for monitoring.",
"AuditProcedure": "Log in to the Microsoft 365 Admin Center and review the licenses assigned to administrative accounts. Confirm that administrative accounts either have no licenses or are limited to Microsoft Entra ID P1 or P2 licenses without collaborative applications enabled.",
"AdditionalInformation": "Reducing the application footprint of administrative accounts improves security by minimizing potential attack vectors. Special care should be taken to configure alert routing properly to ensure critical notifications are not missed.",
"DefaultValue": "By default, administrative accounts may have licenses assigned based on organizational setup. Manual review and adjustment are necessary to comply with this recommendation.",
"References": "CIS Microsoft 365 Foundations Benchmark v4.0, Section 1.1.4; Microsoft documentation on Entra ID licenses and privileged account security."
}
]
},
{
"Id": "1.2.1",
"Description": "Ensure that only organizationally managed/approved public groups exist",
"Checks": [
"admincenter_groups_not_public_visibility"
],
"Attributes": [
{
"Section": "1.Microsoft 365 admin center",
"Profile": "Level 2",
"AssessmentStatus": "Automated",
"Description": "Microsoft 365 Groups enable shared resource access across Microsoft 365 services. The default privacy setting for groups is 'Public,' which allows users within the organization to access the group's resources. Ensure that only organizationally managed and approved public groups exist to prevent unauthorized access to sensitive information.",
"RationaleStatement": "Public groups can be accessed by any user within the organization via several methods, such as self-adding through the Azure portal, sending an access request, or directly using the SharePoint URL. Without control over group privacy, sensitive organizational data might be exposed to unintended users.",
"ImpactStatement": "Implementing this recommendation may result in an increased volume of access requests for group owners, particularly for groups previously intended to be public.",
"RemediationProcedure": "Audit all Microsoft 365 public groups and ensure they are organizationally managed and approved. Convert unnecessary public groups to private groups and enforce strict policies for creating and approving public groups. Group owners should be instructed to monitor and review access requests.",
"AuditProcedure": "Log in to the Microsoft 365 Admin Center and review the list of public groups. Verify that all public groups have been approved and are necessary for organizational purposes.",
"AdditionalInformation": "Public groups expose data to all users within the organization, increasing the risk of accidental or unauthorized access. Periodic reviews of group privacy settings are recommended.",
"DefaultValue": "By default, groups created in Microsoft 365 are set to 'Public' privacy.",
"References": "CIS Microsoft 365 Foundations Benchmark v4.0, Section 1.2.1; Microsoft documentation on managing group privacy."
}
]
},
{
"Id": "1.2.2",
"Description": "Ensure sign-in to shared mailboxes is blocked",
"Checks": [],
"Attributes": [
{
"Section": "1.Microsoft 365 admin center",
"Profile": "Level 1",
"AssessmentStatus": "Manuel",
"Description": "Shared mailboxes are used when multiple people need access to the same mailbox for functions such as support or reception. These mailboxes are created with a corresponding user account that includes a system-generated password. To enhance security, sign-in should be blocked for these shared mailbox accounts, ensuring access is granted only through delegation.",
"RationaleStatement": "Blocking sign-in for shared mailbox accounts prevents unauthorized access or direct sign-in, ensuring that all interactions with the shared mailbox are through authorized delegation. This reduces the risk of attackers exploiting shared mailboxes for malicious purposes such as sending emails with spoofed identities.",
"ImpactStatement": "Blocking sign-in to shared mailboxes requires users to access these mailboxes only through delegation. Administrators will need to monitor and ensure proper access permissions are assigned.",
"RemediationProcedure": "Log in to the Microsoft 365 Admin Center and locate the shared mailboxes. For each shared mailbox, verify that sign-in is blocked by reviewing the associated user account settings. If sign-in is not blocked, adjust the account settings to enforce this configuration.",
"AuditProcedure": "Review all shared mailboxes in the Microsoft 365 Admin Center. Confirm that the user accounts associated with these mailboxes have sign-in blocked.",
"AdditionalInformation": "Shared mailboxes are often a target for exploitation due to their broad access and functional role. Blocking sign-in significantly reduces the attack surface.",
"DefaultValue": "By default, shared mailboxes may have sign-in enabled unless explicitly configured otherwise.",
"References": "CIS Microsoft 365 Foundations Benchmark v4.0, Section 1.2.2; Microsoft documentation on managing shared mailboxes."
}
]
}
]
}

View File

@@ -12,7 +12,7 @@ from prowler.lib.logger import logger
timestamp = datetime.today()
timestamp_utc = datetime.now(timezone.utc).replace(tzinfo=timezone.utc)
prowler_version = "5.2.3"
prowler_version = "5.3.0"
html_logo_url = "https://github.com/prowler-cloud/prowler/"
square_logo_img = "https://prowler.com/wp-content/uploads/logo-html.png"
aws_logo = "https://user-images.githubusercontent.com/38561120/235953920-3e3fba08-0795-41dc-b480-9bea57db9f2e.png"
@@ -28,6 +28,7 @@ class Provider(str, Enum):
GCP = "gcp"
AZURE = "azure"
KUBERNETES = "kubernetes"
MICROSOFT365 = "microsoft365"
# Compliance
@@ -122,7 +123,10 @@ def load_and_validate_config_file(provider: str, config_file_path: str) -> dict:
# Not to introduce a breaking change, allow the old format config file without any provider keys
# and a new format with a key for each provider to include their configuration values within.
if any(key in config_file for key in ["aws", "gcp", "azure", "kubernetes"]):
if any(
key in config_file
for key in ["aws", "gcp", "azure", "kubernetes", "microsoft365"]
):
config = config_file.get(provider, {})
else:
config = config_file if config_file else {}
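The provider-key detection above can be sketched as a standalone function (the config key names here are hypothetical, for illustration only):

```python
# Sketch of the two accepted config layouts, mirroring the detection in
# load_and_validate_config_file: if any top-level key is a provider name,
# the file is treated as the new per-provider format.
PROVIDER_KEYS = ["aws", "gcp", "azure", "kubernetes", "microsoft365"]

def select_config(config_file: dict, provider: str) -> dict:
    """Return the provider section (new format) or the whole file (old flat format)."""
    if any(key in config_file for key in PROVIDER_KEYS):
        return config_file.get(provider, {})
    return config_file if config_file else {}

old_style = {"max_retries": 3}                      # flat: applies to every provider
new_style = {"aws": {"max_retries": 3}, "gcp": {}}  # nested: one section per provider
```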

View File

@@ -0,0 +1,44 @@
### Account, Check and/or Region can be "*" to apply to all cases.
### Account == Microsoft365 Tenant and Region == Microsoft365 Location
### Resources and Tags are lists that can contain either a Regex or keywords.
### Tags is an optional list matching tuples of 'key=value' that are "ANDed" together.
### Use an alternation Regex to match one of multiple tags with "ORed" logic.
### For each check you can add exceptions for Accounts, Regions, Resources and/or Tags.
########################### MUTELIST EXAMPLE ###########################
Mutelist:
  Accounts:
    "*":
      Checks:
        "admincenter_groups_not_public_visibility":
          Regions:
            - "westeurope"
          Resources:
            - "sqlserver1" # Will ignore sqlserver1 in check admincenter_groups_not_public_visibility located in westeurope
          Description: "Findings related with the check admincenter_groups_not_public_visibility are muted for the westeurope region and the sqlserver1 resource"
        "defender_*":
          Regions:
            - "*"
          Resources:
            - "*" # Will ignore every Defender check in every location
        "*":
          Regions:
            - "*"
          Resources:
            - "test"
          Tags:
            - "test=test" # Will ignore every resource containing the string "test" and the tags 'test=test' and
            - "project=test|project=stage" # either of ('project=test' OR 'project=stage') in every tenant and every location
    "*":
      Checks:
        "admincenter_*":
          Regions:
            - "*"
          Resources:
            - "*"
          Exceptions:
            Accounts:
              - "Tenant1"
            Regions:
              - "eastus"
              - "eastus2" # Will ignore every resource in admincenter checks except the ones in Tenant1 located in eastus or eastus2

View File

@@ -535,6 +535,29 @@ class Check_Report_Kubernetes(Check_Report):
self.namespace = "cluster-wide"
@dataclass
class Check_Report_Microsoft365(Check_Report):
    """Contains the Microsoft365 Check's finding information."""

    resource_name: str
    resource_id: str
    location: str

    def __init__(self, metadata: Dict, resource: Any) -> None:
        """Initialize the Microsoft365 Check's finding information.

        Args:
            metadata: The metadata of the check.
            resource: Basic information about the resource.
        """
        super().__init__(metadata, resource)
        self.resource_name = getattr(
            resource, "name", getattr(resource, "resource_name", "")
        )
        self.resource_id = getattr(resource, "id", getattr(resource, "resource_id", ""))
        self.location = getattr(resource, "location", "global")
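The getattr fallbacks can be exercised with a minimal sketch (the resource object and its attribute values here are hypothetical):

```python
from types import SimpleNamespace

# Hypothetical resource that exposes 'name' and 'id' but no 'location',
# to show how each lookup falls back.
resource = SimpleNamespace(name="mailbox1", id="mb-1")

resource_name = getattr(resource, "name", getattr(resource, "resource_name", ""))
resource_id = getattr(resource, "id", getattr(resource, "resource_id", ""))
location = getattr(resource, "location", "global")  # missing attribute -> "global"
```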
# Testing Pending
def load_check_metadata(metadata_file: str) -> CheckMetadata:
"""

View File

@@ -26,7 +26,7 @@ class ProwlerArgumentParser:
self.parser = argparse.ArgumentParser(
prog="prowler",
formatter_class=RawTextHelpFormatter,
usage="prowler [-h] [--version] {aws,azure,gcp,kubernetes,dashboard} ...",
usage="prowler [-h] [--version] {aws,azure,gcp,kubernetes,microsoft365,dashboard} ...",
epilog="""
Available Cloud Providers:
{aws,azure,gcp,kubernetes,microsoft365}
@@ -34,6 +34,7 @@ Available Cloud Providers:
azure Azure Provider
gcp GCP Provider
kubernetes Kubernetes Provider
microsoft365 Microsoft 365 Provider
Available components:
dashboard Local dashboard
@@ -72,7 +73,7 @@ Detailed documentation at https://docs.prowler.com
# Init Providers Arguments
init_providers_parser(self)
# Dahboard Parser
# Dashboard Parser
init_dashboard_parser(self)
def parse(self, args=None) -> argparse.Namespace:

View File

@@ -0,0 +1,99 @@
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.outputs.compliance.cis.models import Microsoft365CISModel
from prowler.lib.outputs.compliance.compliance_output import ComplianceOutput
from prowler.lib.outputs.finding import Finding
class Microsoft365CIS(ComplianceOutput):
"""
This class represents the Microsoft365 CIS compliance output.
Attributes:
- _data (list): A list to store transformed data from findings.
- _file_descriptor (TextIOWrapper): A file descriptor to write data to a file.
Methods:
- transform: Transforms findings into Microsoft365 CIS compliance format.
"""
def transform(
self,
findings: list[Finding],
compliance: Compliance,
compliance_name: str,
) -> None:
"""
Transforms a list of findings into Microsoft365 CIS compliance format.
Parameters:
- findings (list): A list of findings.
- compliance (Compliance): A compliance model.
- compliance_name (str): The name of the compliance model.
Returns:
- None
"""
for finding in findings:
# Get the compliance requirements for the finding
finding_requirements = finding.compliance.get(compliance_name, [])
for requirement in compliance.Requirements:
if requirement.Id in finding_requirements:
for attribute in requirement.Attributes:
compliance_row = Microsoft365CISModel(
Provider=finding.provider,
Description=compliance.Description,
TenantId=finding.account_uid,
Location=finding.region,
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_DefaultValue=attribute.DefaultValue,
Requirements_Attributes_References=attribute.References,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_uid,
ResourceName=finding.resource_name,
CheckId=finding.check_id,
Muted=finding.muted,
)
self._data.append(compliance_row)
# Add manual requirements to the compliance output
for requirement in compliance.Requirements:
if not requirement.Checks:
for attribute in requirement.Attributes:
compliance_row = Microsoft365CISModel(
Provider=compliance.Provider.lower(),
Description=compliance.Description,
TenantId=finding.account_uid,
Location=finding.region,
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_DefaultValue=attribute.DefaultValue,
Requirements_Attributes_References=attribute.References,
Status="MANUAL",
StatusExtended="Manual check",
ResourceId="manual_check",
ResourceName="Manual check",
CheckId="manual",
Muted=False,
)
self._data.append(compliance_row)

View File

@@ -66,6 +66,37 @@ class AzureCISModel(BaseModel):
Muted: bool
class Microsoft365CISModel(BaseModel):
"""
Microsoft365CISModel generates a finding's output in Microsoft365 CIS Compliance format.
"""
Provider: str
Description: str
TenantId: str
Location: str
AssessmentDate: str
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Section: str
Requirements_Attributes_Profile: str
Requirements_Attributes_AssessmentStatus: str
Requirements_Attributes_Description: str
Requirements_Attributes_RationaleStatement: str
Requirements_Attributes_ImpactStatement: str
Requirements_Attributes_RemediationProcedure: str
Requirements_Attributes_AuditProcedure: str
Requirements_Attributes_AdditionalInformation: str
Requirements_Attributes_DefaultValue: str
Requirements_Attributes_References: str
Status: str
StatusExtended: str
ResourceId: str
ResourceName: str
CheckId: str
Muted: bool
class GCPCISModel(BaseModel):
"""
GCPCISModel generates a finding's output in GCP CIS Compliance format.

View File

@@ -234,6 +234,20 @@ class Finding(BaseModel):
)
output_data["region"] = f"namespace: {check_output.namespace}"
elif provider.type == "microsoft365":
output_data["auth_method"] = (
f"{provider.identity.identity_type}: {provider.identity.identity_id}"
)
output_data["account_uid"] = get_nested_attribute(
provider, "identity.tenant_id"
)
output_data["account_name"] = get_nested_attribute(
provider, "identity.tenant_domain"
)
output_data["resource_name"] = check_output.resource_name
output_data["resource_uid"] = check_output.resource_id
output_data["region"] = check_output.location
# check_output Unique ID
# TODO: move this to a function
# TODO: in Azure, GCP and K8s there are findings without resource_name

View File

@@ -205,7 +205,7 @@ class HTML(Output):
<th scope="col">Resource Tags</th>
<th scope="col">Status Extended</th>
<th scope="col">Risk</th>
<th scope="col">Recomendation</th>
<th scope="col">Recommendation</th>
<th scope="col">Compliance</th>
</tr>
</thead>
@@ -538,6 +538,52 @@ class HTML(Output):
)
return ""
@staticmethod
def get_microsoft365_assessment_summary(provider: Provider) -> str:
"""
get_microsoft365_assessment_summary gets the HTML assessment summary for the provider
Args:
provider (Provider): the provider object
Returns:
str: the HTML assessment summary
"""
try:
return f"""
<div class="col-md-2">
<div class="card">
<div class="card-header">
Microsoft365 Assessment Summary
</div>
<ul class="list-group list-group-flush">
<li class="list-group-item">
<b>Microsoft365 Tenant Domain:</b> {provider.identity.tenant_domain}
</li>
</ul>
</div>
</div>
<div class="col-md-4">
<div class="card">
<div class="card-header">
Microsoft365 Credentials
</div>
<ul class="list-group list-group-flush">
<li class="list-group-item">
<b>Microsoft365 Identity Type:</b> {provider.identity.identity_type}
</li>
<li class="list-group-item">
<b>Microsoft365 Identity ID:</b> {provider.identity.identity_id}
</li>
</ul>
</div>
</div>"""
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
return ""
@staticmethod
def get_assessment_summary(provider: Provider) -> str:
"""

View File

@@ -13,6 +13,8 @@ def stdout_report(finding, color, verbose, status, fix):
details = finding.location.lower()
if finding.check_metadata.Provider == "kubernetes":
details = finding.namespace.lower()
if finding.check_metadata.Provider == "microsoft365":
details = finding.location
if (verbose or fix) and (not status or finding.status in status):
if finding.muted:

View File

@@ -40,6 +40,9 @@ def display_summary_table(
elif provider.type == "kubernetes":
entity_type = "Context"
audited_entities = provider.identity.context
elif provider.type == "microsoft365":
entity_type = "Tenant Domain"
audited_entities = provider.identity.tenant_domain
# Check if there are findings and that they are not all MANUAL
if findings and not all(finding.status == "MANUAL" for finding in findings):

View File

@@ -145,6 +145,7 @@
"ap-southeast-3",
"ap-southeast-4",
"ap-southeast-5",
"ap-southeast-7",
"ca-central-1",
"ca-west-1",
"eu-central-1",
@@ -158,6 +159,7 @@
"il-central-1",
"me-central-1",
"me-south-1",
"mx-central-1",
"sa-east-1",
"us-east-1",
"us-east-2",
@@ -6782,6 +6784,7 @@
"mcs": {
"regions": {
"aws": [
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-2",
@@ -7947,6 +7950,7 @@
"ap-southeast-3",
"ap-southeast-4",
"ca-central-1",
"ca-west-1",
"eu-central-1",
"eu-central-2",
"eu-north-1",
@@ -7955,6 +7959,7 @@
"eu-west-1",
"eu-west-2",
"eu-west-3",
"il-central-1",
"me-central-1",
"me-south-1",
"sa-east-1",
@@ -9358,6 +9363,7 @@
"ap-southeast-2",
"ap-southeast-3",
"ap-southeast-4",
"ap-southeast-5",
"ca-central-1",
"ca-west-1",
"eu-central-1",

View File

@@ -1,4 +1,3 @@
import hashlib
import json
from prowler.lib.check.models import Check, Check_Report_AWS
@@ -28,19 +27,13 @@ class awslambda_function_no_secrets_in_variables(Check):
"detect_secrets_plugins",
),
)
original_env_vars = {}
original_env_vars = []
for name, value in function.environment.items():
original_env_vars.update(
{
hashlib.sha1( # nosec B324 SHA1 is used here for non-security-critical unique identifiers
value.encode("utf-8")
).hexdigest(): name
}
)
original_env_vars.append(name)
if detect_secrets_output:
secrets_string = ", ".join(
[
f"{secret['type']} in variable {original_env_vars[secret['hashed_secret']]}"
f"{secret['type']} in variable {original_env_vars[secret['line_number'] - 2]}"
for secret in detect_secrets_output
]
)

View File

@@ -10,14 +10,14 @@ class awslambda_function_not_publicly_accessible(Check):
report = Check_Report_AWS(metadata=self.metadata(), resource=function)
report.status = "PASS"
report.status_extended = f"Lambda function {function.name} has a policy resource-based policy not public."
report.status_extended = f"Lambda function {function.name} has a resource-based policy without public access."
if is_policy_public(
function.policy,
awslambda_client.audited_account,
is_cross_account_allowed=True,
):
report.status = "FAIL"
report.status_extended = f"Lambda function {function.name} has a policy resource-based policy with public access."
report.status_extended = f"Lambda function {function.name} has a resource-based policy with public access."
findings.append(report)

View File

@@ -14,14 +14,20 @@ class cloudfront_distributions_origin_traffic_encrypted(Check):
unencrypted_origins = []
for origin in distribution.origins:
if (
origin.origin_protocol_policy == ""
or origin.origin_protocol_policy == "http-only"
) or (
origin.origin_protocol_policy == "match-viewer"
and distribution.viewer_protocol_policy == "allow-all"
):
unencrypted_origins.append(origin.id)
if origin.s3_origin_config:
# For S3, only check the viewer protocol policy
if distribution.viewer_protocol_policy == "allow-all":
unencrypted_origins.append(origin.id)
else:
# Regular check for custom origins (ALB, EC2, API Gateway, etc.)
if (
origin.origin_protocol_policy == ""
or origin.origin_protocol_policy == "http-only"
) or (
origin.origin_protocol_policy == "match-viewer"
and distribution.viewer_protocol_policy == "allow-all"
):
unencrypted_origins.append(origin.id)
if unencrypted_origins:
report.status = "FAIL"

View File

@@ -107,21 +107,45 @@ class DirectoryService(AWSService):
if directory.region == regional_client.region:
# Operation is not supported for Shared MicrosoftAD directories.
if directory.type != DirectoryType.SharedMicrosoftAD:
describe_event_topics_parameters = {"DirectoryId": directory.id}
event_topics = []
describe_event_topics = regional_client.describe_event_topics(
**describe_event_topics_parameters
)
for event_topic in describe_event_topics["EventTopics"]:
event_topics.append(
EventTopics(
topic_arn=event_topic["TopicArn"],
topic_name=event_topic["TopicName"],
status=event_topic["Status"],
created_date_time=event_topic["CreatedDateTime"],
try:
describe_event_topics_parameters = {
"DirectoryId": directory.id
}
event_topics = []
describe_event_topics = (
regional_client.describe_event_topics(
**describe_event_topics_parameters
)
)
self.directories[directory.id].event_topics = event_topics
for event_topic in describe_event_topics["EventTopics"]:
event_topics.append(
EventTopics(
topic_arn=event_topic["TopicArn"],
topic_name=event_topic["TopicName"],
status=event_topic["Status"],
created_date_time=event_topic[
"CreatedDateTime"
],
)
)
self.directories[directory.id].event_topics = event_topics
except ClientError as error:
if (
"is in Deleting state"
in error.response["Error"]["Message"]
):
logger.warning(
f"{directory.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
else:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
@@ -202,6 +226,15 @@ class DirectoryService(AWSService):
"SnapshotLimits"
]["ManualSnapshotsLimitReached"],
)
except ClientError as error:
if "is in Deleting state" in error.response["Error"]["Message"]:
logger.warning(
f"{directory.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
else:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"

View File

@@ -1,4 +1,3 @@
import hashlib
from json import dumps
from prowler.lib.check.models import Check, Check_Report_AWS
@@ -25,16 +24,10 @@ class ecs_task_definitions_no_environment_secrets(Check):
if container.environment:
dump_env_vars = {}
original_env_vars = {}
original_env_vars = []
for env_var in container.environment:
dump_env_vars.update({env_var.name: env_var.value})
original_env_vars.update(
{
hashlib.sha1( # nosec B324 SHA1 is used here for non-security-critical unique identifiers
env_var.value.encode("utf-8")
).hexdigest(): env_var.name
}
)
original_env_vars.append(env_var.name)
env_data = dumps(dump_env_vars, indent=2)
detect_secrets_output = detect_secrets_scan(
@@ -47,7 +40,7 @@ class ecs_task_definitions_no_environment_secrets(Check):
if detect_secrets_output:
secrets_string = ", ".join(
[
f"{secret['type']} on the environment variable {original_env_vars[secret['hashed_secret']]}"
f"{secret['type']} on the environment variable {original_env_vars[secret['line_number'] - 2]}"
for secret in detect_secrets_output
]
)
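The `original_env_vars[secret['line_number'] - 2]` lookup works because the variables are serialized with `dumps(dump_env_vars, indent=2)`: line 1 of the dump is the opening brace, so the first variable lands on line 2 of the scanned text and corresponds to index 0 in insertion order. A minimal sketch with hypothetical variable values:

```python
import json

# Hypothetical environment variables; dicts preserve insertion order (Python 3.7+).
env_vars = {"DB_PASSWORD": "hunter2", "API_KEY": "abc123"}
names = list(env_vars)

# Line 1 of the dump is "{", so detect-secrets would report line_number=2
# for the first variable, and names[line_number - 2] recovers its name.
dump = json.dumps(env_vars, indent=2)
first_var_line = dump.splitlines()[1]
```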

View File

@@ -147,6 +147,12 @@ class ElastiCache(AWSService):
logger.warning(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except (
regional_client.exceptions.InvalidReplicationGroupStateFault
) as error:
logger.warning(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
@@ -163,6 +169,12 @@ class ElastiCache(AWSService):
logger.warning(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except (
regional_client.exceptions.InvalidReplicationGroupStateFault
) as error:
logger.warning(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"

View File

@@ -0,0 +1,32 @@
{
"Provider": "aws",
"CheckID": "kms_cmk_not_multi_region",
"CheckTitle": "AWS KMS customer managed keys should not be multi-Region",
"CheckType": [
"Data Protection"
],
"ServiceName": "kms",
"SubServiceName": "",
"ResourceIdTemplate": "arn:partition:kms:region:account-id:key/resource-id",
"Severity": "high",
"ResourceType": "AwsKmsKey",
"Description": "Ensure that AWS KMS customer managed keys (CMKs) are not multi-region to maintain strict data control and compliance with security best practices.",
"Risk": "Multi-region KMS keys can increase the risk of unauthorized access and data exposure, as managing access controls and auditing across multiple regions becomes more complex. This expanded attack surface may lead to compliance violations and data breaches.",
"RelatedUrl": "https://docs.aws.amazon.com/kms/latest/developerguide/multi-region-keys-overview.html#multi-region-concepts",
"Remediation": {
"Code": {
"CLI": "aws kms create-key --no-multi-region",
"NativeIaC": "",
"Other": "",
"Terraform": "resource \"aws_kms_key\" \"example\" { description = \"Single-region key\" multi_region = false }"
},
"Recommendation": {
"Text": "Identify and replace multi-region keys with single-region KMS keys to enhance security and access control.",
"Url": "https://docs.aws.amazon.com/kms/latest/developerguide/mrk-when-to-use.html"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "Multi-region keys should be used only when absolutely necessary, such as for cross-region disaster recovery, and should be carefully managed with strict access controls."
}

View File

@@ -0,0 +1,25 @@
from typing import List
from prowler.lib.check.models import Check, Check_Report_AWS
from prowler.providers.aws.services.kms.kms_client import kms_client
class kms_cmk_not_multi_region(Check):
    """kms_cmk_not_multi_region verifies if a KMS key is multi-regional"""

    def execute(self) -> List[Check_Report_AWS]:
        findings = []
        for key in kms_client.keys:
            if key.manager == "CUSTOMER" and key.state == "Enabled":
                report = Check_Report_AWS(metadata=self.metadata(), resource=key)
                report.status = "PASS"
                report.status_extended = f"KMS CMK {key.id} is a single-region key."
                if key.multi_region:
                    report.status = "FAIL"
                    report.status_extended = f"KMS CMK {key.id} is a multi-region key."
                findings.append(report)

        return findings

View File

@@ -45,12 +45,18 @@ class KMS(AWSService):
logger.info("KMS - Describing Key...")
try:
for key in self.keys:
regional_client = self.regional_clients[key.region]
response = regional_client.describe_key(KeyId=key.id)
key.state = response["KeyMetadata"]["KeyState"]
key.origin = response["KeyMetadata"]["Origin"]
key.manager = response["KeyMetadata"]["KeyManager"]
key.spec = response["KeyMetadata"]["CustomerMasterKeySpec"]
try:
regional_client = self.regional_clients[key.region]
response = regional_client.describe_key(KeyId=key.id)
key.state = response["KeyMetadata"]["KeyState"]
key.origin = response["KeyMetadata"]["Origin"]
key.manager = response["KeyMetadata"]["KeyManager"]
key.spec = response["KeyMetadata"]["CustomerMasterKeySpec"]
key.multi_region = response["KeyMetadata"]["MultiRegion"]
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}:{error.__traceback__.tb_lineno} -- {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}:{error.__traceback__.tb_lineno} -- {error}"
@@ -121,4 +127,5 @@ class Key(BaseModel):
policy: Optional[dict]
spec: Optional[str]
region: str
multi_region: Optional[bool]
tags: Optional[list] = []

View File

@@ -15,7 +15,7 @@ class entra_security_defaults_enabled(Check):
)
report.subscription = f"Tenant: {tenant}"
report.status = "FAIL"
report.status_extended = "Entra security defaults is diabled."
report.status_extended = "Entra security defaults is disabled."
if getattr(security_default, "is_enabled", False):
report.status = "PASS"

View File

@@ -211,6 +211,17 @@ class Provider(ABC):
mutelist_path=arguments.mutelist_file,
fixer_config=fixer_config,
)
elif "microsoft365" in provider_class_name.lower():
provider_class(
region=arguments.region,
config_path=arguments.config_file,
mutelist_path=arguments.mutelist_file,
sp_env_auth=arguments.sp_env_auth,
az_cli_auth=arguments.az_cli_auth,
browser_auth=arguments.browser_auth,
tenant_id=arguments.tenant_id,
fixer_config=fixer_config,
)
except TypeError as error:
logger.critical(

View File

@@ -1,5 +1,5 @@
{
"Provider": "compute",
"Provider": "gcp",
"CheckID": "compute_public_address_shodan",
"CheckTitle": "Check if any of the Public Addresses are in Shodan (requires Shodan API KEY).",
"CheckType": [

View File

@@ -95,7 +95,7 @@ class Rbac(KubernetesService):
"resources": rule.resources,
"verbs": rule.verbs,
}
for rule in role.rules
for rule in (role.rules or [])
],
}
roles[role.metadata.uid] = Role(**formatted_role)

View File

@@ -0,0 +1,279 @@
from prowler.exceptions.exceptions import ProwlerException
# Exception codes from 6000 to 6999 are reserved for Microsoft365 exceptions
class Microsoft365BaseException(ProwlerException):
"""Base class for Microsoft365 Errors."""
MICROSOFT365_ERROR_CODES = {
(6000, "Microsoft365EnvironmentVariableError"): {
"message": "Microsoft365 environment variable error",
"remediation": "Check the Microsoft365 environment variables and ensure they are properly set.",
},
(6001, "Microsoft365ArgumentTypeValidationError"): {
"message": "Microsoft365 argument type validation error",
"remediation": "Check the provided argument types specific to Microsoft365 and ensure they meet the required format.",
},
(6002, "Microsoft365SetUpRegionConfigError"): {
"message": "Microsoft365 region configuration setup error",
"remediation": "Check the Microsoft365 region configuration and ensure it is properly set up.",
},
(6003, "Microsoft365HTTPResponseError"): {
"message": "Error in HTTP response from Microsoft365",
"remediation": "",
},
(6004, "Microsoft365CredentialsUnavailableError"): {
"message": "Error trying to configure Microsoft365 credentials because they are unavailable",
"remediation": "Check the dictionary and ensure it is properly set up for Microsoft365 credentials. TENANT_ID, CLIENT_ID and CLIENT_SECRET are required.",
},
(6005, "Microsoft365GetTokenIdentityError"): {
"message": "Error trying to get token from Microsoft365 Identity",
"remediation": "Check the Microsoft365 Identity and ensure it is properly set up.",
},
(6006, "Microsoft365ClientAuthenticationError"): {
"message": "Error in client authentication",
"remediation": "Check the client authentication and ensure it is properly set up.",
},
(6007, "Microsoft365NotValidTenantIdError"): {
"message": "The provided tenant ID is not valid",
"remediation": "Check the tenant ID and ensure it is a valid ID.",
},
(6008, "Microsoft365NotValidClientIdError"): {
"message": "The provided client ID is not valid",
"remediation": "Check the client ID and ensure it is a valid ID.",
},
(6009, "Microsoft365NotValidClientSecretError"): {
"message": "The provided client secret is not valid",
"remediation": "Check the client secret and ensure it is a valid secret.",
},
(6010, "Microsoft365ConfigCredentialsError"): {
"message": "Error in configuration of Microsoft365 credentials",
"remediation": "Check the configuration of Microsoft365 credentials and ensure it is properly set up.",
},
(6011, "Microsoft365ClientIdAndClientSecretNotBelongingToTenantIdError"): {
"message": "The provided client ID and client secret do not belong to the provided tenant ID",
"remediation": "Check the client ID and client secret and ensure they belong to the provided tenant ID.",
},
(6012, "Microsoft365TenantIdAndClientSecretNotBelongingToClientIdError"): {
"message": "The provided tenant ID and client secret do not belong to the provided client ID",
"remediation": "Check the tenant ID and client secret and ensure they belong to the provided client ID.",
},
(6013, "Microsoft365TenantIdAndClientIdNotBelongingToClientSecretError"): {
"message": "The provided tenant ID and client ID do not belong to the provided client secret",
"remediation": "Check the tenant ID and client ID and ensure they belong to the provided client secret.",
},
(6014, "Microsoft365InvalidProviderIdError"): {
"message": "The provided provider_id does not match with the available subscriptions",
"remediation": "Check the provider_id and ensure it is a valid subscription for the given credentials.",
},
(6015, "Microsoft365NoAuthenticationMethodError"): {
"message": "No Microsoft365 authentication method found",
"remediation": "Check that any authentication method is properly set up for Microsoft365.",
},
(6016, "Microsoft365SetUpSessionError"): {
"message": "Error setting up session",
"remediation": "Check the session setup and ensure it is properly set up.",
},
(6017, "Microsoft365DefaultAzureCredentialError"): {
"message": "Error with DefaultAzureCredential",
"remediation": "Ensure DefaultAzureCredential is correctly configured.",
},
(6018, "Microsoft365InteractiveBrowserCredentialError"): {
"message": "Error with InteractiveBrowserCredential",
"remediation": "Ensure InteractiveBrowserCredential is correctly configured.",
},
(6019, "Microsoft365BrowserAuthNoTenantIDError"): {
"message": "Microsoft365 Tenant ID (--tenant-id) is required for browser authentication mode",
"remediation": "Check the Microsoft365 Tenant ID and ensure it is properly set up.",
},
(6020, "Microsoft365BrowserAuthNoFlagError"): {
"message": "Microsoft365 tenant ID error: browser authentication flag (--browser-auth) not found",
"remediation": "To use browser authentication, ensure the tenant ID is properly set.",
},
(6021, "Microsoft365NotTenantIdButClientIdAndClienSecretError"): {
"message": "Tenant Id is required for Microsoft365 static credentials. Make sure you are using the correct credentials.",
"remediation": "Check the Microsoft365 Tenant ID and ensure it is properly set up.",
},
}
def __init__(self, code, file=None, original_exception=None, message=None):
provider = "Microsoft365"
error_info = self.MICROSOFT365_ERROR_CODES.get((code, self.__class__.__name__))
if message:
error_info["message"] = message
super().__init__(
code=code,
source=provider,
file=file,
original_exception=original_exception,
error_info=error_info,
)
class Microsoft365CredentialsError(Microsoft365BaseException):
"""Base class for Microsoft365 credentials errors."""
def __init__(self, code, file=None, original_exception=None, message=None):
super().__init__(code, file, original_exception, message)
class Microsoft365EnvironmentVariableError(Microsoft365CredentialsError):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6000, file=file, original_exception=original_exception, message=message
)
class Microsoft365ArgumentTypeValidationError(Microsoft365BaseException):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6001, file=file, original_exception=original_exception, message=message
)
class Microsoft365SetUpRegionConfigError(Microsoft365BaseException):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6002, file=file, original_exception=original_exception, message=message
)
class Microsoft365HTTPResponseError(Microsoft365BaseException):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6003, file=file, original_exception=original_exception, message=message
)
class Microsoft365CredentialsUnavailableError(Microsoft365CredentialsError):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6004, file=file, original_exception=original_exception, message=message
)
class Microsoft365GetTokenIdentityError(Microsoft365BaseException):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6005, file=file, original_exception=original_exception, message=message
)
class Microsoft365ClientAuthenticationError(Microsoft365CredentialsError):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6006, file=file, original_exception=original_exception, message=message
)
class Microsoft365NotValidTenantIdError(Microsoft365CredentialsError):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6007, file=file, original_exception=original_exception, message=message
)
class Microsoft365NotValidClientIdError(Microsoft365CredentialsError):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6008, file=file, original_exception=original_exception, message=message
)
class Microsoft365NotValidClientSecretError(Microsoft365CredentialsError):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6009, file=file, original_exception=original_exception, message=message
)
class Microsoft365ConfigCredentialsError(Microsoft365CredentialsError):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6010, file=file, original_exception=original_exception, message=message
)
class Microsoft365ClientIdAndClientSecretNotBelongingToTenantIdError(
Microsoft365CredentialsError
):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6011, file=file, original_exception=original_exception, message=message
)
class Microsoft365TenantIdAndClientSecretNotBelongingToClientIdError(
Microsoft365CredentialsError
):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6012, file=file, original_exception=original_exception, message=message
)
class Microsoft365TenantIdAndClientIdNotBelongingToClientSecretError(
Microsoft365CredentialsError
):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6013, file=file, original_exception=original_exception, message=message
)
class Microsoft365InvalidProviderIdError(Microsoft365BaseException):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6014, file=file, original_exception=original_exception, message=message
)
class Microsoft365NoAuthenticationMethodError(Microsoft365CredentialsError):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6015, file=file, original_exception=original_exception, message=message
)
class Microsoft365SetUpSessionError(Microsoft365CredentialsError):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6016, file=file, original_exception=original_exception, message=message
)
class Microsoft365DefaultAzureCredentialError(Microsoft365CredentialsError):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6017, file=file, original_exception=original_exception, message=message
)
class Microsoft365InteractiveBrowserCredentialError(Microsoft365CredentialsError):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6018, file=file, original_exception=original_exception, message=message
)
class Microsoft365BrowserAuthNoTenantIDError(Microsoft365CredentialsError):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6019, file=file, original_exception=original_exception, message=message
)
class Microsoft365BrowserAuthNoFlagError(Microsoft365CredentialsError):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6020, file=file, original_exception=original_exception, message=message
)
class Microsoft365NotTenantIdButClientIdAndClienSecretError(
Microsoft365CredentialsError
):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6021, file=file, original_exception=original_exception, message=message
)
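All of the exception classes above follow the same pattern: each subclass binds a unique numeric error code (6002–6021) and forwards the remaining arguments to a shared base exception. A minimal standalone sketch of that pattern (the base-class internals here are an assumption for illustration, not Prowler's actual `Microsoft365BaseException` implementation):

```python
# Minimal sketch of the error-code pattern used above.
# The base-class internals are assumed; Prowler's actual
# Microsoft365BaseException may store/format these differently.
class ProviderBaseException(Exception):
    def __init__(self, code, file=None, original_exception=None, message=None):
        self.code = code
        self.file = file
        self.original_exception = original_exception
        self.message = message
        super().__init__(f"[{code}] {message or original_exception}")

class CredentialsError(ProviderBaseException):
    """Intermediate grouping class, mirroring Microsoft365CredentialsError."""

class NotValidTenantIdError(CredentialsError):
    def __init__(self, file=None, original_exception=None, message=None):
        super().__init__(
            6007, file=file, original_exception=original_exception, message=message
        )

error = NotValidTenantIdError(file="provider.py", message="Tenant ID is not a valid UUID")
```

Because every subclass only fixes the code, callers can catch the intermediate class (here `CredentialsError`) to handle a whole family of credential failures at once.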


@@ -0,0 +1,48 @@
def init_parser(self):
"""Init the Microsoft365 Provider CLI parser"""
microsoft365_parser = self.subparsers.add_parser(
"microsoft365",
parents=[self.common_providers_parser],
help="Microsoft365 Provider",
)
# Authentication Modes
microsoft365_auth_subparser = microsoft365_parser.add_argument_group(
"Authentication Modes"
)
microsoft365_auth_modes_group = (
microsoft365_auth_subparser.add_mutually_exclusive_group()
)
microsoft365_auth_modes_group.add_argument(
"--az-cli-auth",
action="store_true",
help="Use Azure CLI authentication to log in against Microsoft365",
)
microsoft365_auth_modes_group.add_argument(
"--sp-env-auth",
action="store_true",
help="Use Azure Service Principal environment variables authentication to log in against Microsoft365",
)
microsoft365_auth_modes_group.add_argument(
"--browser-auth",
action="store_true",
help="Use Azure interactive browser authentication to log in against Microsoft365",
)
microsoft365_parser.add_argument(
"--tenant-id",
nargs="?",
default=None,
help="Microsoft365 Tenant ID to be used with --browser-auth option",
)
# Regions
microsoft365_regions_subparser = microsoft365_parser.add_argument_group("Regions")
microsoft365_regions_subparser.add_argument(
"--region",
nargs="?",
default="Microsoft365Global",
choices=[
"Microsoft365Global",
"Microsoft365GlobalChina",
"Microsoft365USGovernment",
],
help="Microsoft365 region to be used, default is Microsoft365Global",
)
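The three authentication flags above live in a mutually exclusive group, so argparse rejects any combination of them. A reduced, self-contained sketch of that wiring (a standalone parser, not Prowler's actual subparser object):

```python
import argparse

# Reduced sketch of the mutually exclusive auth-mode group defined above.
parser = argparse.ArgumentParser(prog="prowler microsoft365")
auth_group = parser.add_argument_group("Authentication Modes")
auth_modes = auth_group.add_mutually_exclusive_group()
auth_modes.add_argument("--az-cli-auth", action="store_true")
auth_modes.add_argument("--sp-env-auth", action="store_true")
auth_modes.add_argument("--browser-auth", action="store_true")
parser.add_argument("--tenant-id", nargs="?", default=None)

# Browser auth together with the tenant ID it requires:
args = parser.parse_args(
    ["--browser-auth", "--tenant-id", "00000000-0000-0000-0000-000000000000"]
)
```

Passing two auth flags at once (e.g. `--az-cli-auth --browser-auth`) makes `parse_args` exit with a "not allowed with argument" error, which is what enforces one authentication mode per run.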


@@ -0,0 +1,17 @@
from prowler.lib.check.models import Check_Report_Microsoft365
from prowler.lib.mutelist.mutelist import Mutelist
from prowler.lib.outputs.utils import unroll_dict, unroll_tags
class Microsoft365Mutelist(Mutelist):
def is_finding_muted(
self,
finding: Check_Report_Microsoft365,
) -> bool:
return self.is_muted(
finding.tenant_id,
finding.check_metadata.CheckID,
finding.location,
finding.resource_name,
unroll_dict(unroll_tags(finding.resource_tags)),
)


@@ -0,0 +1,27 @@
from azure.identity import AzureAuthorityHosts
MICROSOFT365_CHINA_CLOUD = "https://microsoftgraph.chinacloudapi.cn"
MICROSOFT365_US_GOV_CLOUD = "https://graph.microsoft.us"
MICROSOFT365_US_DOD_CLOUD = "https://dod-graph.microsoft.us"
MICROSOFT365_GENERIC_CLOUD = "https://graph.microsoft.com"
def get_regions_config(region):
allowed_regions = {
"Microsoft365Global": {
"authority": None,
"base_url": MICROSOFT365_GENERIC_CLOUD,
"credential_scopes": [MICROSOFT365_GENERIC_CLOUD + "/.default"],
},
"Microsoft365GlobalChina": {
"authority": AzureAuthorityHosts.AZURE_CHINA,
"base_url": MICROSOFT365_CHINA_CLOUD,
"credential_scopes": [MICROSOFT365_CHINA_CLOUD + "/.default"],
},
"Microsoft365USGovernment": {
"authority": AzureAuthorityHosts.AZURE_GOVERNMENT,
"base_url": MICROSOFT365_US_GOV_CLOUD,
"credential_scopes": [MICROSOFT365_US_GOV_CLOUD + "/.default"],
},
}
return allowed_regions[region]
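`get_regions_config` simply indexes a static mapping, so each region's credential scope is always its cloud base URL plus `/.default`. A standalone re-creation of that lookup for the default region:

```python
# Standalone re-creation of the region lookup above for the default region.
MICROSOFT365_GENERIC_CLOUD = "https://graph.microsoft.com"

allowed_regions = {
    "Microsoft365Global": {
        "authority": None,  # None means the default Microsoft Entra authority
        "base_url": MICROSOFT365_GENERIC_CLOUD,
        "credential_scopes": [MICROSOFT365_GENERIC_CLOUD + "/.default"],
    },
}

config = allowed_regions["Microsoft365Global"]
```

Note that an unrecognized region name raises a plain `KeyError` here; in the provider that exception surfaces through `setup_region_config` as `Microsoft365SetUpRegionConfigError`.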


@@ -0,0 +1,14 @@
from msgraph import GraphServiceClient
from prowler.providers.microsoft365.microsoft365_provider import Microsoft365Provider
class Microsoft365Service:
def __init__(
self,
provider: Microsoft365Provider,
):
self.client = GraphServiceClient(credentials=provider.session)
self.audit_config = provider.audit_config
self.fixer_config = provider.fixer_config
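`Microsoft365Service` is a thin base class: each concrete service inherits from it and receives a Graph client built from the provider's session, plus the audit and fixer configuration. A dependency-free sketch of that inheritance pattern (the stub classes below stand in for the real `GraphServiceClient` and provider, so this runs without the msgraph SDK):

```python
# Dependency-free sketch of the service base-class pattern above.
# StubClient and StubProvider are illustrative stand-ins, not real
# msgraph or Prowler types.
class StubClient:
    def __init__(self, credentials):
        self.credentials = credentials

class StubProvider:
    session = "fake-credential"
    audit_config = {"check_x_threshold": 5}
    fixer_config = {}

class BaseService:
    def __init__(self, provider):
        # Mirrors Microsoft365Service.__init__: build a client from the
        # provider session and keep the shared configuration around.
        self.client = StubClient(credentials=provider.session)
        self.audit_config = provider.audit_config
        self.fixer_config = provider.fixer_config

class AdminCenter(BaseService):
    # A concrete service would add Graph API calls here.
    pass

service = AdminCenter(StubProvider())
```

The payoff of the pattern is that every service gets a consistently configured client and configuration without repeating the setup code.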


@@ -0,0 +1,923 @@
import asyncio
import os
import re
from argparse import ArgumentTypeError
from os import getenv
from uuid import UUID
from azure.core.exceptions import ClientAuthenticationError, HttpResponseError
from azure.identity import (
ClientSecretCredential,
CredentialUnavailableError,
DefaultAzureCredential,
InteractiveBrowserCredential,
)
from colorama import Fore, Style
from msal import ConfidentialClientApplication
from msgraph import GraphServiceClient
from prowler.config.config import (
default_config_file_path,
get_default_mute_file_path,
load_and_validate_config_file,
)
from prowler.lib.logger import logger
from prowler.lib.utils.utils import print_boxes
from prowler.providers.common.models import Audit_Metadata, Connection
from prowler.providers.common.provider import Provider
from prowler.providers.microsoft365.exceptions.exceptions import (
Microsoft365ArgumentTypeValidationError,
Microsoft365BrowserAuthNoFlagError,
Microsoft365BrowserAuthNoTenantIDError,
Microsoft365ClientAuthenticationError,
Microsoft365ClientIdAndClientSecretNotBelongingToTenantIdError,
Microsoft365ConfigCredentialsError,
Microsoft365CredentialsUnavailableError,
Microsoft365DefaultAzureCredentialError,
Microsoft365EnvironmentVariableError,
Microsoft365GetTokenIdentityError,
Microsoft365HTTPResponseError,
Microsoft365InteractiveBrowserCredentialError,
Microsoft365InvalidProviderIdError,
Microsoft365NoAuthenticationMethodError,
Microsoft365NotTenantIdButClientIdAndClienSecretError,
Microsoft365NotValidClientIdError,
Microsoft365NotValidClientSecretError,
Microsoft365NotValidTenantIdError,
Microsoft365SetUpRegionConfigError,
Microsoft365SetUpSessionError,
Microsoft365TenantIdAndClientIdNotBelongingToClientSecretError,
Microsoft365TenantIdAndClientSecretNotBelongingToClientIdError,
)
from prowler.providers.microsoft365.lib.mutelist.mutelist import Microsoft365Mutelist
from prowler.providers.microsoft365.lib.regions.regions import get_regions_config
from prowler.providers.microsoft365.models import (
Microsoft365IdentityInfo,
Microsoft365RegionConfig,
)
class Microsoft365Provider(Provider):
"""
Represents a Microsoft365 provider.
This class provides functionality to interact with Microsoft365 resources.
It handles authentication, region configuration, and provides access to various properties and methods
related to the Microsoft365 provider.
Attributes:
_type (str): The type of the provider, which is set to "microsoft365".
_session (DefaultAzureCredential): The session object associated with the Microsoft365 provider.
_identity (Microsoft365IdentityInfo): The identity information for the Microsoft365 provider.
_audit_config (dict): The audit configuration for the Microsoft365 provider.
_region_config (Microsoft365RegionConfig): The region configuration for the Microsoft365 provider.
_mutelist (Microsoft365Mutelist): The mutelist object associated with the Microsoft365 provider.
audit_metadata (Audit_Metadata): The audit metadata for the Microsoft365 provider.
Methods:
__init__ -> Initializes the Microsoft365 provider.
identity(self): Returns the identity of the Microsoft365 provider.
type(self): Returns the type of the Microsoft365 provider.
session(self): Returns the session object associated with the Microsoft365 provider.
region_config(self): Returns the region configuration for the Microsoft365 provider.
audit_config(self): Returns the audit configuration for the Microsoft365 provider.
fixer_config(self): Returns the fixer configuration.
output_options(self, options: tuple): Sets the output options for the Microsoft365 provider.
mutelist(self) -> Microsoft365Mutelist: Returns the mutelist object associated with the Microsoft365 provider.
setup_region_config(cls, region): Sets up the region configuration for the Microsoft365 provider.
print_credentials(self): Prints the Microsoft365 credentials information.
setup_session(az_cli_auth, sp_env_auth, browser_auth, tenant_id, microsoft365_credentials, region_config): Set up the Microsoft365 session with the specified authentication method.
"""
_type: str = "microsoft365"
_session: DefaultAzureCredential  # The Azure credential type is reused here, despite being named for Azure
_identity: Microsoft365IdentityInfo
_audit_config: dict
_region_config: Microsoft365RegionConfig
_mutelist: Microsoft365Mutelist
# TODO: this is not optional, enforce for all providers
audit_metadata: Audit_Metadata
def __init__(
self,
sp_env_auth: bool,
az_cli_auth: bool,
browser_auth: bool,
tenant_id: str = None,
client_id: str = None,
client_secret: str = None,
region: str = "Microsoft365Global",
config_content: dict = None,
config_path: str = None,
mutelist_path: str = None,
mutelist_content: dict = None,
fixer_config: dict = {},
):
"""
Initializes the Microsoft365 provider.
Args:
tenant_id (str): The Microsoft365 Active Directory tenant ID.
region (str): The Microsoft365 region.
client_id (str): The Microsoft365 client ID.
client_secret (str): The Microsoft365 client secret.
config_path (str): The path to the configuration file.
config_content (dict): The configuration content.
fixer_config (dict): The fixer configuration.
mutelist_path (str): The path to the mutelist file.
mutelist_content (dict): The mutelist content.
Returns:
None
Raises:
Microsoft365ArgumentTypeValidationError: If there is an error in the argument type validation.
Microsoft365SetUpRegionConfigError: If there is an error in setting up the region configuration.
Microsoft365ConfigCredentialsError: If there is an error in configuring the Microsoft365 credentials from a dictionary.
Microsoft365GetTokenIdentityError: If there is an error in getting the token from the Microsoft365 identity.
Microsoft365HTTPResponseError: If there is an HTTP response error.
"""
logger.info("Setting Microsoft365 provider ...")
logger.info("Checking if any credentials mode is set ...")
# Validate the authentication arguments
self.validate_arguments(
az_cli_auth,
sp_env_auth,
browser_auth,
tenant_id,
client_id,
client_secret,
)
logger.info("Checking if region is different than default one")
self._region_config = self.setup_region_config(region)
# Get the dict from the static credentials
microsoft365_credentials = None
if tenant_id and client_id and client_secret:
microsoft365_credentials = self.validate_static_credentials(
tenant_id=tenant_id, client_id=client_id, client_secret=client_secret
)
# Set up the Microsoft365 session
self._session = self.setup_session(
az_cli_auth,
sp_env_auth,
browser_auth,
tenant_id,
microsoft365_credentials,
self._region_config,
)
# Set up the identity
self._identity = self.setup_identity(
az_cli_auth,
sp_env_auth,
browser_auth,
client_id,
)
# Audit Config
if config_content:
self._audit_config = config_content
else:
if not config_path:
config_path = default_config_file_path
self._audit_config = load_and_validate_config_file(self._type, config_path)
# Fixer Config
self._fixer_config = fixer_config
# Mutelist
if mutelist_content:
self._mutelist = Microsoft365Mutelist(
mutelist_content=mutelist_content,
)
else:
if not mutelist_path:
mutelist_path = get_default_mute_file_path(self.type)
self._mutelist = Microsoft365Mutelist(
mutelist_path=mutelist_path,
)
Provider.set_global_provider(self)
@property
def identity(self):
"""Returns the identity of the Microsoft365 provider."""
return self._identity
@property
def type(self):
"""Returns the type of the Microsoft365 provider."""
return self._type
@property
def session(self):
"""Returns the session object associated with the Microsoft365 provider."""
return self._session
@property
def region_config(self):
"""Returns the region configuration for the Microsoft365 provider."""
return self._region_config
@property
def audit_config(self):
"""Returns the audit configuration for the Microsoft365 provider."""
return self._audit_config
@property
def fixer_config(self):
"""Returns the fixer configuration."""
return self._fixer_config
@property
def mutelist(self) -> Microsoft365Mutelist:
"""Mutelist object associated with this Microsoft365 provider."""
return self._mutelist
@staticmethod
def validate_arguments(
az_cli_auth: bool,
sp_env_auth: bool,
browser_auth: bool,
tenant_id: str,
client_id: str,
client_secret: str,
):
"""
Validates the authentication arguments for the Microsoft365 provider.
Args:
az_cli_auth (bool): Flag indicating whether Azure CLI authentication is enabled.
sp_env_auth (bool): Flag indicating whether application authentication with environment variables is enabled.
browser_auth (bool): Flag indicating whether browser authentication is enabled.
tenant_id (str): The Microsoft365 Tenant ID.
client_id (str): The Microsoft365 Client ID.
client_secret (str): The Microsoft365 Client Secret.
Raises:
Microsoft365BrowserAuthNoTenantIDError: If browser authentication is enabled but the tenant ID is not found.
"""
if not client_id and not client_secret:
if not browser_auth and tenant_id:
raise Microsoft365BrowserAuthNoFlagError(
file=os.path.basename(__file__),
message="Microsoft365 tenant ID error: browser authentication flag (--browser-auth) not found",
)
elif not az_cli_auth and not sp_env_auth and not browser_auth:
raise Microsoft365NoAuthenticationMethodError(
file=os.path.basename(__file__),
message="Microsoft365 provider requires at least one authentication method set: [--az-cli-auth | --sp-env-auth | --browser-auth]",
)
elif browser_auth and not tenant_id:
raise Microsoft365BrowserAuthNoTenantIDError(
file=os.path.basename(__file__),
message="Microsoft365 Tenant ID (--tenant-id) is required for browser authentication mode",
)
else:
if not tenant_id:
raise Microsoft365NotTenantIdButClientIdAndClienSecretError(
file=os.path.basename(__file__),
message="Tenant Id is required for Microsoft365 static credentials. Make sure you are using the correct credentials.",
)
@staticmethod
def setup_region_config(region):
"""
Sets up the region configuration for the Microsoft365 provider.
Args:
region (str): The name of the region.
Returns:
Microsoft365RegionConfig: The region configuration object.
"""
try:
config = get_regions_config(region)
return Microsoft365RegionConfig(
name=region,
authority=config["authority"],
base_url=config["base_url"],
credential_scopes=config["credential_scopes"],
)
except ArgumentTypeError as validation_error:
logger.error(
f"{validation_error.__class__.__name__}[{validation_error.__traceback__.tb_lineno}]: {validation_error}"
)
raise Microsoft365ArgumentTypeValidationError(
file=os.path.basename(__file__),
original_exception=validation_error,
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
raise Microsoft365SetUpRegionConfigError(
file=os.path.basename(__file__),
original_exception=error,
)
def print_credentials(self):
"""Microsoft365 credentials information.
This method prints the Microsoft365 Region, Microsoft365 Tenant Domain, Microsoft365 Tenant ID,
Microsoft365 Identity Type, and Microsoft365 Identity ID.
Args:
None
Returns:
None
"""
report_lines = [
f"Microsoft365 Region: {Fore.YELLOW}{self.region_config.name}{Style.RESET_ALL}",
f"Microsoft365 Tenant Domain: {Fore.YELLOW}{self._identity.tenant_domain}{Style.RESET_ALL}",
f"Microsoft365 Tenant ID: {Fore.YELLOW}{self._identity.tenant_id}{Style.RESET_ALL}",
f"Microsoft365 Identity Type: {Fore.YELLOW}{self._identity.identity_type}{Style.RESET_ALL}",
f"Microsoft365 Identity ID: {Fore.YELLOW}{self._identity.identity_id}{Style.RESET_ALL}",
]
report_title = (
f"{Style.BRIGHT}Using the Microsoft365 credentials below:{Style.RESET_ALL}"
)
print_boxes(report_lines, report_title)
# TODO: setup_session or setup_credentials?
# This should be setup_credentials, since it is setting up the credentials for the provider
@staticmethod
def setup_session(
az_cli_auth: bool,
sp_env_auth: bool,
browser_auth: bool,
tenant_id: str,
microsoft365_credentials: dict,
region_config: Microsoft365RegionConfig,
):
"""Returns the Microsoft365 credentials object.
Set up the Microsoft365 session with the specified authentication method.
Args:
az_cli_auth (bool): Flag indicating whether to use Azure CLI authentication.
sp_env_auth (bool): Flag indicating whether to use application authentication with environment variables.
browser_auth (bool): Flag indicating whether to use interactive browser authentication.
tenant_id (str): The Microsoft365 Active Directory tenant ID.
microsoft365_credentials (dict): The Microsoft365 configuration object. It contains the following keys:
- tenant_id: The Microsoft365 Active Directory tenant ID.
- client_id: The Microsoft365 client ID.
- client_secret: The Microsoft365 client secret
region_config (Microsoft365RegionConfig): The region configuration object.
Returns:
credentials: The Microsoft365 credentials object.
Raises:
Exception: If failed to retrieve Microsoft365 credentials.
"""
if not browser_auth:
if sp_env_auth:
try:
Microsoft365Provider.check_service_principal_creds_env_vars()
except (
Microsoft365EnvironmentVariableError
) as environment_credentials_error:
logger.critical(
f"{environment_credentials_error.__class__.__name__}[{environment_credentials_error.__traceback__.tb_lineno}] -- {environment_credentials_error}"
)
raise environment_credentials_error
try:
if microsoft365_credentials:
try:
credentials = ClientSecretCredential(
tenant_id=microsoft365_credentials["tenant_id"],
client_id=microsoft365_credentials["client_id"],
client_secret=microsoft365_credentials["client_secret"],
)
return credentials
except ClientAuthenticationError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
raise Microsoft365ClientAuthenticationError(
file=os.path.basename(__file__), original_exception=error
)
except CredentialUnavailableError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
raise Microsoft365CredentialsUnavailableError(
file=os.path.basename(__file__), original_exception=error
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
raise Microsoft365ConfigCredentialsError(
file=os.path.basename(__file__), original_exception=error
)
else:
# The selected authentication method arrives as True, so we negate it:
# DefaultAzureCredential enables a single method by excluding all the others
try:
credentials = DefaultAzureCredential(
exclude_environment_credential=not sp_env_auth,
exclude_cli_credential=not az_cli_auth,
# Microsoft365 Auth using Managed Identity is not supported
exclude_managed_identity_credential=True,
# Microsoft365 Auth using Visual Studio is not supported
exclude_visual_studio_code_credential=True,
# Microsoft365 Auth using Shared Token Cache is not supported
exclude_shared_token_cache_credential=True,
# Microsoft365 Auth using PowerShell is not supported
exclude_powershell_credential=True,
# set Authority of a Microsoft Entra endpoint
authority=region_config.authority,
)
except ClientAuthenticationError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
raise Microsoft365ClientAuthenticationError(
file=os.path.basename(__file__), original_exception=error
)
except CredentialUnavailableError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
raise Microsoft365CredentialsUnavailableError(
file=os.path.basename(__file__), original_exception=error
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
raise Microsoft365DefaultAzureCredentialError(
file=os.path.basename(__file__), original_exception=error
)
except Exception as error:
logger.critical("Failed to retrieve Microsoft365 credentials")
logger.critical(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
raise Microsoft365SetUpSessionError(
file=os.path.basename(__file__), original_exception=error
)
else:
try:
credentials = InteractiveBrowserCredential(tenant_id=tenant_id)
except Exception as error:
logger.critical(
"Failed to retrieve Microsoft365 credentials using browser authentication"
)
logger.critical(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
raise Microsoft365InteractiveBrowserCredentialError(
file=os.path.basename(__file__), original_exception=error
)
return credentials
@staticmethod
def test_connection(
az_cli_auth: bool = False,
sp_env_auth: bool = False,
browser_auth: bool = False,
tenant_id: str = None,
region: str = "Microsoft365Global",
raise_on_exception=True,
client_id=None,
client_secret=None,
) -> Connection:
"""Test connection to a Microsoft365 tenant.
Test the connection to a Microsoft365 tenant using the provided credentials.
Args:
az_cli_auth (bool): Flag indicating whether to use Azure CLI authentication.
sp_env_auth (bool): Flag indicating whether to use application authentication with environment variables.
browser_auth (bool): Flag indicating whether to use interactive browser authentication.
tenant_id (str): The Microsoft365 Active Directory tenant ID.
region (str): The Microsoft365 region.
raise_on_exception (bool): Flag indicating whether to raise an exception if the connection fails.
client_id (str): The Microsoft365 client ID.
client_secret (str): The Microsoft365 client secret.
Returns:
Connection: A Connection object with is_connected=True if the connection succeeds; otherwise it carries the error.
Raises:
Exception: If the connection test against the Microsoft365 tenant fails.
Microsoft365ArgumentTypeValidationError: If there is an error in the argument type validation.
Microsoft365SetUpRegionConfigError: If there is an error in setting up the region configuration.
Microsoft365InteractiveBrowserCredentialError: If there is an error in retrieving the Microsoft365 credentials using browser authentication.
Microsoft365HTTPResponseError: If there is an HTTP response error.
Microsoft365ConfigCredentialsError: If there is an error in configuring the Microsoft365 credentials from a dictionary.
Examples:
>>> Microsoft365Provider.test_connection(az_cli_auth=True)
True
>>> Microsoft365Provider.test_connection(sp_env_auth=False, browser_auth=True, tenant_id=None)
Microsoft365BrowserAuthNoTenantIDError: Microsoft365 Tenant ID (--tenant-id) is required for browser authentication mode
>>> Microsoft365Provider.test_connection(tenant_id="XXXXXXXXXX", client_id="XXXXXXXXXX", client_secret="XXXXXXXXXX")
True
"""
try:
Microsoft365Provider.validate_arguments(
az_cli_auth,
sp_env_auth,
browser_auth,
tenant_id,
client_id,
client_secret,
)
region_config = Microsoft365Provider.setup_region_config(region)
# Get the dict from the static credentials
microsoft365_credentials = None
if tenant_id and client_id and client_secret:
microsoft365_credentials = (
Microsoft365Provider.validate_static_credentials(
tenant_id=tenant_id,
client_id=client_id,
client_secret=client_secret,
)
)
# Set up the Microsoft365 session
credentials = Microsoft365Provider.setup_session(
az_cli_auth,
sp_env_auth,
browser_auth,
tenant_id,
microsoft365_credentials,
region_config,
)
GraphServiceClient(credentials=credentials)
logger.info("Microsoft365 provider: Connection to Microsoft365 successful")
return Connection(is_connected=True)
# Exceptions from setup_region_config
except Microsoft365ArgumentTypeValidationError as type_validation_error:
logger.error(
f"{type_validation_error.__class__.__name__}[{type_validation_error.__traceback__.tb_lineno}]: {type_validation_error}"
)
if raise_on_exception:
raise type_validation_error
return Connection(error=type_validation_error)
except Microsoft365SetUpRegionConfigError as region_config_error:
logger.error(
f"{region_config_error.__class__.__name__}[{region_config_error.__traceback__.tb_lineno}]: {region_config_error}"
)
if raise_on_exception:
raise region_config_error
return Connection(error=region_config_error)
# Exceptions from setup_session
except Microsoft365EnvironmentVariableError as environment_credentials_error:
logger.error(
f"{environment_credentials_error.__class__.__name__}[{environment_credentials_error.__traceback__.tb_lineno}]: {environment_credentials_error}"
)
if raise_on_exception:
raise environment_credentials_error
return Connection(error=environment_credentials_error)
except Microsoft365ConfigCredentialsError as config_credentials_error:
logger.error(
f"{config_credentials_error.__class__.__name__}[{config_credentials_error.__traceback__.tb_lineno}]: {config_credentials_error}"
)
if raise_on_exception:
raise config_credentials_error
return Connection(error=config_credentials_error)
except Microsoft365ClientAuthenticationError as client_auth_error:
logger.error(
f"{client_auth_error.__class__.__name__}[{client_auth_error.__traceback__.tb_lineno}]: {client_auth_error}"
)
if raise_on_exception:
raise client_auth_error
return Connection(error=client_auth_error)
except Microsoft365CredentialsUnavailableError as credential_unavailable_error:
logger.error(
f"{credential_unavailable_error.__class__.__name__}[{credential_unavailable_error.__traceback__.tb_lineno}]: {credential_unavailable_error}"
)
if raise_on_exception:
raise credential_unavailable_error
return Connection(error=credential_unavailable_error)
except (
Microsoft365ClientIdAndClientSecretNotBelongingToTenantIdError
) as tenant_id_error:
logger.error(
f"{tenant_id_error.__class__.__name__}[{tenant_id_error.__traceback__.tb_lineno}]: {tenant_id_error}"
)
if raise_on_exception:
raise tenant_id_error
return Connection(error=tenant_id_error)
except (
Microsoft365TenantIdAndClientSecretNotBelongingToClientIdError
) as client_id_error:
logger.error(
f"{client_id_error.__class__.__name__}[{client_id_error.__traceback__.tb_lineno}]: {client_id_error}"
)
if raise_on_exception:
raise client_id_error
return Connection(error=client_id_error)
except (
Microsoft365TenantIdAndClientIdNotBelongingToClientSecretError
) as client_secret_error:
logger.error(
f"{client_secret_error.__class__.__name__}[{client_secret_error.__traceback__.tb_lineno}]: {client_secret_error}"
)
if raise_on_exception:
raise client_secret_error
return Connection(error=client_secret_error)
# Exceptions from provider_id validation
except Microsoft365InvalidProviderIdError as invalid_credentials_error:
logger.error(
f"{invalid_credentials_error.__class__.__name__}[{invalid_credentials_error.__traceback__.tb_lineno}]: {invalid_credentials_error}"
)
if raise_on_exception:
raise invalid_credentials_error
return Connection(error=invalid_credentials_error)
# Exceptions from GraphServiceClient
except HttpResponseError as http_response_error:
logger.error(
f"{http_response_error.__class__.__name__}[{http_response_error.__traceback__.tb_lineno}]: {http_response_error}"
)
if raise_on_exception:
raise Microsoft365HTTPResponseError(
file=os.path.basename(__file__),
original_exception=http_response_error,
)
return Connection(error=http_response_error)
except Exception as error:
logger.critical(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
if raise_on_exception:
# Raise directly the exception
raise error
return Connection(error=error)
@staticmethod
def check_service_principal_creds_env_vars():
"""
Checks the presence of required environment variables for service principal authentication against Azure.
This method checks for the presence of the following environment variables:
- AZURE_CLIENT_ID: Azure client ID
- AZURE_TENANT_ID: Azure tenant ID
- AZURE_CLIENT_SECRET: Azure client secret
If any of the environment variables is missing, it logs a critical error and raises Microsoft365EnvironmentVariableError.
"""
logger.info(
"Microsoft365 provider: checking service principal environment variables ..."
)
for env_var in ["AZURE_CLIENT_ID", "AZURE_TENANT_ID", "AZURE_CLIENT_SECRET"]:
if not getenv(env_var):
logger.critical(
f"Microsoft365 provider: Missing environment variable {env_var} needed to authenticate against Microsoft365"
)
raise Microsoft365EnvironmentVariableError(
file=os.path.basename(__file__),
message=f"Missing environment variable {env_var} required to authenticate.",
)
def setup_identity(
self,
az_cli_auth,
sp_env_auth,
browser_auth,
client_id,
):
"""
Sets up the identity for the Microsoft365 provider.
Args:
az_cli_auth (bool): Flag indicating if Azure CLI authentication is used.
sp_env_auth (bool): Flag indicating if application authentication with environment variables is used.
browser_auth (bool): Flag indicating if interactive browser authentication is used.
client_id (str): The Microsoft365 client ID.
Returns:
Microsoft365IdentityInfo: An instance of Microsoft365IdentityInfo containing the identity information.
"""
credentials = self.session
# TODO: fill this object with real values not default and set to none
identity = Microsoft365IdentityInfo()
# If credentials come from a service principal or the browser and the required permissions
# are assigned, the identity can access AAD and retrieve the tenant domain name.
# This should also work with Azure CLI credentials, but at the time of writing it does not,
# due to a pending issue in the Microsoft Graph Python package.
if az_cli_auth or sp_env_auth or browser_auth or client_id:
async def get_microsoft365_identity():
# Trying to recover tenant domain info
try:
logger.info(
"Trying to retrieve tenant domain from AAD to populate identity structure ..."
)
client = GraphServiceClient(credentials=credentials)
domain_result = await client.domains.get()
if getattr(domain_result, "value", None):
if getattr(domain_result.value[0], "id", None):
identity.tenant_domain = domain_result.value[0].id
except HttpResponseError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
raise Microsoft365HTTPResponseError(
file=os.path.basename(__file__),
original_exception=error,
)
except ClientAuthenticationError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
raise Microsoft365GetTokenIdentityError(
file=os.path.basename(__file__),
original_exception=error,
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
# This exception is not considered critical, so we keep filling the remaining identity fields
if sp_env_auth or client_id:
# The service principal ID can be retrieved from environment variables
identity.identity_id = getenv("AZURE_CLIENT_ID")
identity.identity_type = "Service Principal"
# As above, if the user can access AAD these fields are retrieved; otherwise the
# default values are kept. This should also work with Azure CLI credentials,
# but it does not yet (pending issue)
else:
identity.identity_id = "Unknown user id (Missing AAD permissions)"
identity.identity_type = "User"
try:
logger.info(
"Trying to retrieve user information from AAD to populate identity structure ..."
)
client = GraphServiceClient(credentials=credentials)
me = await client.me.get()
if me:
if getattr(me, "user_principal_name"):
identity.identity_id = me.user_principal_name
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
# Retrieve tenant id from the client
client = GraphServiceClient(credentials=credentials)
organization_info = await client.organization.get()
identity.tenant_id = organization_info.value[0].id
asyncio.get_event_loop().run_until_complete(get_microsoft365_identity())
return identity
@staticmethod
def validate_static_credentials(
tenant_id: str = None, client_id: str = None, client_secret: str = None
) -> dict:
"""
Validates the static credentials for the Microsoft365 provider.
Args:
tenant_id (str): The Microsoft365 Active Directory tenant ID.
client_id (str): The Microsoft365 client ID.
client_secret (str): The Microsoft365 client secret.
Raises:
Microsoft365NotValidTenantIdError: If the provided Microsoft365 Tenant ID is not valid.
Microsoft365NotValidClientIdError: If the provided Microsoft365 Client ID is not valid.
Microsoft365NotValidClientSecretError: If the provided Microsoft365 Client Secret is not valid.
Microsoft365ClientIdAndClientSecretNotBelongingToTenantIdError: If the provided Microsoft365 Client ID and Client Secret do not belong to the specified Tenant ID.
Microsoft365TenantIdAndClientSecretNotBelongingToClientIdError: If the provided Microsoft365 Tenant ID and Client Secret do not belong to the specified Client ID.
Microsoft365TenantIdAndClientIdNotBelongingToClientSecretError: If the provided Microsoft365 Tenant ID and Client ID do not belong to the specified Client Secret.
Returns:
dict: A dictionary containing the validated static credentials.
"""
# Validate the Tenant ID
try:
UUID(tenant_id)
except ValueError:
raise Microsoft365NotValidTenantIdError(
file=os.path.basename(__file__),
message="The provided Microsoft365 Tenant ID is not valid.",
)
# Validate the Client ID
try:
UUID(client_id)
except ValueError:
raise Microsoft365NotValidClientIdError(
file=os.path.basename(__file__),
message="The provided Microsoft365 Client ID is not valid.",
)
# Validate the Client Secret
if not re.match("^[a-zA-Z0-9._~-]+$", client_secret):
raise Microsoft365NotValidClientSecretError(
file=os.path.basename(__file__),
message="The provided Microsoft365 Client Secret is not valid.",
)
try:
Microsoft365Provider.verify_client(tenant_id, client_id, client_secret)
return {
"tenant_id": tenant_id,
"client_id": client_id,
"client_secret": client_secret,
}
except Microsoft365NotValidTenantIdError as tenant_id_error:
logger.error(
f"{tenant_id_error.__class__.__name__}[{tenant_id_error.__traceback__.tb_lineno}]: {tenant_id_error}"
)
raise Microsoft365ClientIdAndClientSecretNotBelongingToTenantIdError(
file=os.path.basename(__file__),
message="The provided Microsoft365 Client ID and Client Secret do not belong to the specified Tenant ID.",
)
except Microsoft365NotValidClientIdError as client_id_error:
logger.error(
f"{client_id_error.__class__.__name__}[{client_id_error.__traceback__.tb_lineno}]: {client_id_error}"
)
raise Microsoft365TenantIdAndClientSecretNotBelongingToClientIdError(
file=os.path.basename(__file__),
message="The provided Microsoft365 Tenant ID and Client Secret do not belong to the specified Client ID.",
)
except Microsoft365NotValidClientSecretError as client_secret_error:
logger.error(
f"{client_secret_error.__class__.__name__}[{client_secret_error.__traceback__.tb_lineno}]: {client_secret_error}"
)
raise Microsoft365TenantIdAndClientIdNotBelongingToClientSecretError(
file=os.path.basename(__file__),
message="The provided Microsoft365 Tenant ID and Client ID do not belong to the specified Client Secret.",
)
@staticmethod
def verify_client(tenant_id, client_id, client_secret) -> None:
"""
Verifies the Microsoft365 client credentials using the specified tenant ID, client ID, and client secret.
Args:
tenant_id (str): The Microsoft365 Active Directory tenant ID.
client_id (str): The Microsoft365 client ID.
client_secret (str): The Microsoft365 client secret.
Raises:
Microsoft365NotValidTenantIdError: If the provided Microsoft365 Tenant ID is not valid.
Microsoft365NotValidClientIdError: If the provided Microsoft365 Client ID is not valid.
Microsoft365NotValidClientSecretError: If the provided Microsoft365 Client Secret is not valid.
Returns:
None
"""
authority = f"https://login.microsoftonline.com/{tenant_id}"
try:
# Create a ConfidentialClientApplication instance
app = ConfidentialClientApplication(
client_id=client_id,
client_credential=client_secret,
authority=authority,
)
# Attempt to acquire a token
result = app.acquire_token_for_client(
scopes=["https://graph.microsoft.com/.default"]
)
# Check if token acquisition was successful
if "access_token" not in result:
# Handle specific errors based on the MSAL response
error_description = result.get("error_description", "")
if f"Tenant '{tenant_id}'" in error_description:
raise Microsoft365NotValidTenantIdError(
file=os.path.basename(__file__),
message="The provided Microsoft 365 Tenant ID is not valid for the specified Client ID and Client Secret.",
)
if f"Application with identifier '{client_id}'" in error_description:
raise Microsoft365NotValidClientIdError(
file=os.path.basename(__file__),
message="The provided Microsoft 365 Client ID is not valid for the specified Tenant ID and Client Secret.",
)
if "Invalid client secret provided" in error_description:
raise Microsoft365NotValidClientSecretError(
file=os.path.basename(__file__),
message="The provided Microsoft 365 Client Secret is not valid for the specified Tenant ID and Client ID.",
)
except (
Microsoft365NotValidTenantIdError,
Microsoft365NotValidClientIdError,
Microsoft365NotValidClientSecretError,
):
# Re-raise the specific credential errors so callers can handle them
raise
except Exception as e:
# Generic handling for anything unexpected
raise RuntimeError(f"An unexpected error occurred: {str(e)}")
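The format checks in `validate_static_credentials` can be sketched standalone, using plain `ValueError` in place of Prowler's provider-specific exception classes (the function name here is illustrative, not part of the real API):

```python
import re
from uuid import UUID

def check_credential_formats(tenant_id: str, client_id: str, client_secret: str) -> dict:
    # Both the tenant ID and the client ID must be valid UUIDs
    for label, value in (("Tenant ID", tenant_id), ("Client ID", client_id)):
        try:
            UUID(value)
        except ValueError:
            raise ValueError(f"The provided {label} is not a valid UUID.")
    # Client secrets are restricted to URL-safe characters, mirroring the
    # regex used by validate_static_credentials
    if not re.match(r"^[a-zA-Z0-9._~-]+$", client_secret):
        raise ValueError("The provided Client Secret is not valid.")
    return {
        "tenant_id": tenant_id,
        "client_id": client_id,
        "client_secret": client_secret,
    }
```

Note that these checks only validate the shape of the credentials; whether they actually belong together is only known after the MSAL token request in `verify_client`.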


@@ -0,0 +1,44 @@
from pydantic import BaseModel
from prowler.config.config import output_file_timestamp
from prowler.providers.common.models import ProviderOutputOptions
class Microsoft365IdentityInfo(BaseModel):
identity_id: str = ""
identity_type: str = ""
tenant_id: str = ""
tenant_domain: str = "Unknown tenant domain (missing AAD permissions)"
location: str = ""
class Microsoft365RegionConfig(BaseModel):
name: str = ""
authority: str = None
base_url: str = ""
credential_scopes: list = []
class Microsoft365OutputOptions(ProviderOutputOptions):
def __init__(self, arguments, bulk_checks_metadata, identity):
# First call the parent ProviderOutputOptions init
super().__init__(arguments, bulk_checks_metadata)
# If no custom output filename was provided, set the default
if (
not hasattr(arguments, "output_filename")
or arguments.output_filename is None
):
if (
identity.tenant_domain
!= "Unknown tenant domain (missing AAD permissions)"
):
self.output_filename = (
f"prowler-output-{identity.tenant_domain}-{output_file_timestamp}"
)
else:
self.output_filename = (
f"prowler-output-{identity.tenant_id}-{output_file_timestamp}"
)
else:
self.output_filename = arguments.output_filename
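The default-filename logic above can be reduced to a small pure function (the timestamp argument stands in for `prowler.config.config.output_file_timestamp`; the function name is illustrative):

```python
# Sentinel value from Microsoft365IdentityInfo: set when AAD permissions
# were insufficient to resolve the tenant domain
UNKNOWN_DOMAIN = "Unknown tenant domain (missing AAD permissions)"

def default_output_filename(tenant_domain: str, tenant_id: str, timestamp: str) -> str:
    # Prefer the human-readable tenant domain; fall back to the tenant ID
    identifier = tenant_domain if tenant_domain != UNKNOWN_DOMAIN else tenant_id
    return f"prowler-output-{identifier}-{timestamp}"
```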


@@ -0,0 +1,6 @@
from prowler.providers.common.provider import Provider
from prowler.providers.microsoft365.services.admincenter.admincenter_service import (
AdminCenter,
)
admincenter_client = AdminCenter(Provider.get_global_provider())


@@ -0,0 +1,30 @@
{
"Provider": "microsoft365",
"CheckID": "admincenter_groups_not_public_visibility",
"CheckTitle": "Ensure that only organizationally managed/approved public groups exist",
"CheckType": [],
"ServiceName": "admincenter",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "Microsoft365Group",
"Description": "Ensure that only organizationally managed and approved public groups exist to prevent unauthorized access to sensitive group resources like SharePoint, Teams, or other shared assets.",
"Risk": "Unmanaged public groups can allow unauthorized access to organizational resources, posing a risk of data leakage or misuse through easily guessable SharePoint URLs or self-adding to groups via the Azure portal.",
"RelatedUrl": "https://learn.microsoft.com/en-us/microsoft-365/admin/create-groups/manage-groups?view=o365-worldwide",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "Review and adjust the privacy settings of Microsoft 365 Groups to ensure only organizationally managed and approved public groups exist.",
"Url": "https://learn.microsoft.com/en-us/microsoft-365/security/office-365-security/microsoft-365-groups-governance"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""
}


@@ -0,0 +1,25 @@
from prowler.lib.check.models import Check, Check_Report_Microsoft365
from prowler.providers.microsoft365.services.admincenter.admincenter_client import (
admincenter_client,
)
class admincenter_groups_not_public_visibility(Check):
def execute(self) -> Check_Report_Microsoft365:
findings = []
for group in admincenter_client.groups.values():
report = Check_Report_Microsoft365(metadata=self.metadata(), resource=group)
report.resource_id = group.id
report.resource_name = group.name
report.status = "FAIL"
report.status_extended = f"Group {group.name} has {group.visibility} visibility and should be Private."
if group.visibility != "Public":
report.status = "PASS"
report.status_extended = (
f"Group {group.name} has {group.visibility} visibility."
)
findings.append(report)
return findings
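The PASS/FAIL rule above can be isolated for illustration (the dataclass is a stand-in for the service's `Group` model; names are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class FakeGroup:
    id: str
    name: str
    visibility: str

def group_visibility_status(group: FakeGroup) -> str:
    # Mirrors admincenter_groups_not_public_visibility: any visibility
    # other than "Public" (e.g. "Private") passes
    return "PASS" if group.visibility != "Public" else "FAIL"
```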


@@ -0,0 +1,180 @@
from asyncio import gather, get_event_loop
from typing import List, Optional
from msgraph.generated.models.o_data_errors.o_data_error import ODataError
from pydantic import BaseModel
from prowler.lib.logger import logger
from prowler.providers.microsoft365.lib.service.service import Microsoft365Service
from prowler.providers.microsoft365.microsoft365_provider import Microsoft365Provider
class AdminCenter(Microsoft365Service):
def __init__(self, provider: Microsoft365Provider):
super().__init__(provider)
loop = get_event_loop()
# Get users first, on their own, because other attributes depend on them
self.users = loop.run_until_complete(self._get_users())
attributes = loop.run_until_complete(
gather(
self._get_directory_roles(),
self._get_groups(),
self._get_domains(),
)
)
self.directory_roles = attributes[0]
self.groups = attributes[1]
self.domains = attributes[2]
async def _get_users(self):
logger.info("Microsoft365 - Getting users...")
users = {}
try:
users_list = await self.client.users.get()
for user in users_list.value:
license_details = await self.client.users.by_user_id(
user.id
).license_details.get()
try:
mailbox_settings = await self.client.users.by_user_id(
user.id
).mailbox_settings.get()
mailbox_settings.user_purpose
except ODataError as error:
if error.error.code == "MailboxNotEnabledForRESTAPI":
logger.warning(
f"MailboxNotEnabledForRESTAPI for user {user.id}"
)
else:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
users.update(
{
user.id: User(
id=user.id,
name=user.display_name,
license=(
license_details.value[0].sku_part_number
if license_details.value
else None
),
)
}
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return users
async def _get_directory_roles(self):
logger.info("Microsoft365 - Getting directory roles...")
directory_roles_with_members = {}
try:
directory_roles = await self.client.directory_roles.get()
for directory_role in directory_roles.value:
directory_role_members = (
await self.client.directory_roles.by_directory_role_id(
directory_role.id
).members.get()
)
members_with_roles = []
for member in directory_role_members.value:
user = self.users.get(member.id, None)
if user:
user.directory_roles.append(directory_role.display_name)
members_with_roles.append(user)
directory_roles_with_members.update(
{
directory_role.display_name: DirectoryRole(
id=directory_role.id,
name=directory_role.display_name,
members=members_with_roles,
)
}
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return directory_roles_with_members
async def _get_groups(self):
logger.info("Microsoft365 - Getting groups...")
groups = {}
try:
groups_list = await self.client.groups.get()
for group in groups_list.value:
groups.update(
{
group.id: Group(
id=group.id,
name=group.display_name,
visibility=group.visibility,
)
}
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return groups
async def _get_domains(self):
logger.info("Microsoft365 - Getting domains...")
domains = {}
try:
domains_list = await self.client.domains.get()
for domain in domains_list.value:
domains.update(
{
domain.id: Domain(
id=domain.id,
password_validity_period=domain.password_validity_period_in_days,
)
}
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return domains
class User(BaseModel):
id: str
name: str
directory_roles: List[str] = []
license: Optional[str] = None
user_type: Optional[str] = None
class DirectoryRole(BaseModel):
id: str
name: str
members: List[User]
class Group(BaseModel):
id: str
name: str
visibility: str
class Domain(BaseModel):
id: str
password_validity_period: int


@@ -0,0 +1,30 @@
{
"Provider": "microsoft365",
"CheckID": "admincenter_settings_password_never_expire",
"CheckTitle": "Ensure the 'Password expiration policy' is set to 'Set passwords to never expire (recommended)'",
"CheckType": [],
"ServiceName": "admincenter",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "Microsoft365Domain",
"Description": "This control ensures that the password expiration policy is set to 'Set passwords to never expire (recommended)'. This aligns with modern recommendations to enhance security by avoiding arbitrary password changes and focusing on supplementary controls like MFA.",
"Risk": "Arbitrary password expiration policies can lead to weaker passwords due to frequent changes. Users may adopt insecure habits such as using simple, memorable passwords.",
"RelatedUrl": "https://www.cisecurity.org/insights/white-papers/cis-password-policy-guide",
"Remediation": {
"Code": {
"CLI": "Set-MsolUser -UserPrincipalName <user> -PasswordNeverExpires $true",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "Enable the 'Never Expire Passwords' option in Microsoft 365 Admin Center.",
"Url": "https://learn.microsoft.com/en-us/microsoft-365/admin/misc/password-policy-recommendations?view=o365-worldwide"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""
}


@@ -0,0 +1,26 @@
from prowler.lib.check.models import Check, Check_Report_Microsoft365
from prowler.providers.microsoft365.services.admincenter.admincenter_client import (
admincenter_client,
)
class admincenter_settings_password_never_expire(Check):
def execute(self) -> Check_Report_Microsoft365:
findings = []
for domain in admincenter_client.domains.values():
report = Check_Report_Microsoft365(metadata=self.metadata(), resource=domain)
report.resource_name = domain.id
report.status = "FAIL"
report.status_extended = (
f"Domain {domain.id} does not have a Password never expires policy."
)
if domain.password_validity_period == 2147483647:  # max int32 sentinel: passwords never expire
report.status = "PASS"
report.status_extended = (
f"Domain {domain.id} Password policy is set to never expire."
)
findings.append(report)
return findings
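The magic number in the check above is the Graph API's convention: a `passwordValidityPeriodInDays` of 2147483647 (the maximum 32-bit signed integer) means passwords are set to never expire. A minimal sketch of the rule:

```python
# 2**31 - 1: the Graph sentinel for "passwords never expire"
PASSWORD_NEVER_EXPIRES = 2147483647

def password_policy_status(password_validity_period: int) -> str:
    # Mirrors admincenter_settings_password_never_expire: only the
    # never-expire sentinel passes; any finite period fails
    return "PASS" if password_validity_period == PASSWORD_NEVER_EXPIRES else "FAIL"
```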


@@ -0,0 +1,30 @@
{
"Provider": "microsoft365",
"CheckID": "admincenter_users_admins_reduced_license_footprint",
"CheckTitle": "Ensure administrative accounts use licenses with a reduced application footprint",
"CheckType": [],
"ServiceName": "admincenter",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "AdministrativeAccount",
"Description": "Administrative accounts must use licenses with a reduced application footprint, such as Microsoft Entra ID P1 or P2, or avoid licenses entirely when possible. This minimizes the attack surface associated with privileged identities.",
"Risk": "Licensing administrative accounts with applications like email or collaborative tools increases their exposure to social engineering attacks and malicious content, putting privileged accounts at risk.",
"RelatedUrl": "https://learn.microsoft.com/en-us/microsoft-365/enterprise/protect-your-global-administrator-accounts?view=o365-worldwide",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "Assign Microsoft Entra ID P1 or P2 licenses to administrative accounts to participate in essential security services without enabling access to vulnerable applications.",
"Url": "https://learn.microsoft.com/en-us/microsoft-365/admin/add-users/add-users?view=o365-worldwide"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""
}


@@ -0,0 +1,35 @@
from prowler.lib.check.models import Check, Check_Report_Microsoft365
from prowler.providers.microsoft365.services.admincenter.admincenter_client import (
admincenter_client,
)
class admincenter_users_admins_reduced_license_footprint(Check):
def execute(self) -> Check_Report_Microsoft365:
findings = []
allowed_licenses = ["AAD_PREMIUM", "AAD_PREMIUM_P2"]
for user in admincenter_client.users.values():
admin_roles = ", ".join(
[
role
for role in user.directory_roles
if "Administrator" in role or "Globar Reader" in role
]
)
if admin_roles:
report = Check_Report_Microsoft365(
metadata=self.metadata(), resource=user
)
report.resource_id = user.id
report.resource_name = user.name
report.status = "FAIL"
report.status_extended = f"User {user.name} has administrative roles {admin_roles} and an invalid license: {user.license if user.license else 'no license'}."
if user.license in allowed_licenses:
report.status = "PASS"
report.status_extended = f"User {user.name} has administrative roles {admin_roles} and a valid license: {user.license}."
findings.append(report)
return findings
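The role filter and license allow-list above can be sketched as a standalone function (the function name is illustrative; the role substrings and SKUs mirror the check's own values):

```python
from typing import List, Optional

# SKUs the check treats as reduced-footprint: Microsoft Entra ID P1 / P2
ALLOWED_LICENSES = ["AAD_PREMIUM", "AAD_PREMIUM_P2"]

def admin_license_status(directory_roles: List[str], sku: Optional[str]) -> Optional[str]:
    # A user is privileged if any role contains "Administrator" or is Global Reader
    admin_roles = [
        role
        for role in directory_roles
        if "Administrator" in role or "Global Reader" in role
    ]
    if not admin_roles:
        return None  # non-admin users produce no finding
    return "PASS" if sku in ALLOWED_LICENSES else "FAIL"
```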


@@ -0,0 +1,30 @@
{
"Provider": "microsoft365",
"CheckID": "admincenter_users_between_two_and_four_global_admins",
"CheckTitle": "Ensure that between two and four global admins are designated",
"CheckType": [],
"ServiceName": "admincenter",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "AdministrativeRole",
"Description": "Ensure that there are between two and four global administrators designated in your tenant. This ensures monitoring, redundancy, and reduces the risk associated with having too many privileged accounts.",
"Risk": "Having only one global administrator increases the risk of unmonitored actions and operational disruptions if that administrator is unavailable. Having more than four increases the likelihood of a breach through one of these highly privileged accounts.",
"RelatedUrl": "https://learn.microsoft.com/en-us/entra/identity/role-based-access-control/best-practices#5-limit-the-number-of-global-administrators-to-less-than-5",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "Review the number of global administrators in your tenant. Add or remove global admins as necessary to ensure compliance with the recommended range of two to four.",
"Url": "https://learn.microsoft.com/en-us/entra/identity/role-based-access-control/manage-roles-portal"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""
}


@@ -0,0 +1,37 @@
from prowler.lib.check.models import Check, Check_Report_Microsoft365
from prowler.providers.microsoft365.services.admincenter.admincenter_client import (
admincenter_client,
)
class admincenter_users_between_two_and_four_global_admins(Check):
def execute(self) -> Check_Report_Microsoft365:
findings = []
directory_roles = admincenter_client.directory_roles
report = Check_Report_Microsoft365(metadata=self.metadata(), resource={})
report.status = "FAIL"
report.resource_name = "Global Administrator"
if "Global Administrator" in directory_roles:
report.resource_id = getattr(
directory_roles["Global Administrator"],
"id",
"Global Administrator",
)
num_global_admins = len(
getattr(directory_roles["Global Administrator"], "members", [])
)
if 2 <= num_global_admins < 5:
report.status = "PASS"
report.status_extended = (
f"There are {num_global_admins} global administrators."
)
else:
report.status_extended = f"There are {num_global_admins} global administrators. It should be more than one and less than five."
findings.append(report)
return findings
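The acceptable range encoded by the check above (at least two global admins for redundancy, fewer than five to limit the number of highly privileged accounts, per the CIS guidance it references) reduces to a one-line predicate:

```python
def global_admins_status(num_global_admins: int) -> str:
    # Between two and four global administrators (inclusive) passes
    return "PASS" if 2 <= num_global_admins < 5 else "FAIL"
```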


@@ -0,0 +1,4 @@
from prowler.providers.common.provider import Provider
from prowler.providers.microsoft365.services.entra.entra_service import Entra
entra_client = Entra(Provider.get_global_provider())


@@ -0,0 +1,101 @@
from asyncio import gather, get_event_loop
from typing import List, Optional
from pydantic import BaseModel
from prowler.lib.logger import logger
from prowler.providers.microsoft365.lib.service.service import Microsoft365Service
from prowler.providers.microsoft365.microsoft365_provider import Microsoft365Provider
class Entra(Microsoft365Service):
def __init__(self, provider: Microsoft365Provider):
super().__init__(provider)
loop = get_event_loop()
attributes = loop.run_until_complete(
gather(
self._get_authorization_policy(),
)
)
self.authorization_policy = attributes[0]
async def _get_authorization_policy(self):
logger.info("Entra - Getting authorization policy...")
authorization_policy = None
try:
auth_policy = await self.client.policies.authorization_policy.get()
default_user_role_permissions = getattr(
auth_policy, "default_user_role_permissions", None
)
authorization_policy = AuthorizationPolicy(
id=auth_policy.id,
name=auth_policy.display_name,
description=auth_policy.description,
default_user_role_permissions=DefaultUserRolePermissions(
allowed_to_create_apps=getattr(
default_user_role_permissions,
"allowed_to_create_apps",
None,
),
allowed_to_create_security_groups=getattr(
default_user_role_permissions,
"allowed_to_create_security_groups",
None,
),
allowed_to_create_tenants=getattr(
default_user_role_permissions,
"allowed_to_create_tenants",
None,
),
allowed_to_read_bitlocker_keys_for_owned_device=getattr(
default_user_role_permissions,
"allowed_to_read_bitlocker_keys_for_owned_device",
None,
),
allowed_to_read_other_users=getattr(
default_user_role_permissions,
"allowed_to_read_other_users",
None,
),
odata_type=getattr(
default_user_role_permissions, "odata_type", None
),
permission_grant_policies_assigned=[
policy_assigned
for policy_assigned in getattr(
default_user_role_permissions,
"permission_grant_policies_assigned",
[],
)
],
),
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return authorization_policy
class DefaultUserRolePermissions(BaseModel):
allowed_to_create_apps: Optional[bool]
allowed_to_create_security_groups: Optional[bool]
allowed_to_create_tenants: Optional[bool]
allowed_to_read_bitlocker_keys_for_owned_device: Optional[bool]
allowed_to_read_other_users: Optional[bool]
odata_type: Optional[str]
permission_grant_policies_assigned: Optional[List[str]] = None
class AuthorizationPolicy(BaseModel):
id: str
name: str
description: str
default_user_role_permissions: Optional[DefaultUserRolePermissions]
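The repeated `getattr(..., None)` calls in `_get_authorization_policy` implement a defensive-extraction pattern: any attribute the Graph SDK response did not populate degrades to `None` instead of raising `AttributeError`. A reduced sketch (the `SimpleNamespace` stands in for an SDK response object; field names are a subset of the real ones):

```python
from types import SimpleNamespace

PERMISSION_FIELDS = (
    "allowed_to_create_apps",
    "allowed_to_create_security_groups",
    "allowed_to_create_tenants",
)

def extract_permissions(default_user_role_permissions) -> dict:
    # Missing attributes become None rather than raising, so a partial
    # API response still yields a usable permissions dict
    return {
        field: getattr(default_user_role_permissions, field, None)
        for field in PERMISSION_FIELDS
    }
```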

Some files were not shown because too many files have changed in this diff Show More