Mirror of https://github.com/prowler-cloud/prowler.git, synced 2026-04-03 22:17:03 +00:00.

Compare commits: 3 commits, docs/sso-g... → poc-gha-ia

Commits:

- 5bf816ee42
- 42ab40d079
- 2ce706e474
@@ -137,7 +137,6 @@
        "group": "Tutorials",
        "pages": [
          "user-guide/tutorials/prowler-app-sso-entra",
          "user-guide/tutorials/prowler-app-sso-google-workspace",
          "user-guide/tutorials/bulk-provider-provisioning",
          "user-guide/tutorials/aws-organizations-bulk-provisioning"
        ]
@@ -1,197 +0,0 @@
---
title: 'SAML SSO: Google Workspace'
---

This page explains how to configure SAML-based Single Sign-On (SSO) in Prowler App using **Google Workspace** as the Identity Provider (IdP). The setup is divided into two parts: create a custom SAML app in Google Admin Console, then complete the configuration in Prowler App.

<Info>
**Parallel Setup Required**

Google Admin Console requires the ACS URL and Entity ID from Prowler App, while Prowler displays these values only after enabling SAML. To work around this, open Prowler App in a separate browser tab, navigate to Profile, enable SAML, and copy the ACS URL and Entity ID before proceeding with the Google configuration.
</Info>

## Prerequisites

- **Google Workspace**: Super Admin access (or delegated admin with app management permissions).
- **Prowler App**: Administrator access to the organization (role with "Manage Account" permission).
- Prowler App version **5.9.0** or later.

---

## Part A — Google Admin Console

### Step 1: Create the Custom SAML App

1. Go to [admin.google.com](https://admin.google.com).
2. Navigate to **Apps > Web and mobile apps**.
3. Click "Add app", then select "Add custom SAML app".
4. Enter the app name (e.g., `Prowler Cloud`) and optionally upload a logo.
5. Click "Continue".

### Step 2: Download the IdP Metadata

On the **Google Identity Provider details** screen:

1. Google displays the **SSO URL**, **Entity ID**, and **Certificate**.
2. Click "Download Metadata" to save the XML file. This file is required to complete the Prowler App configuration in Part B.
3. Click "Continue".

<Warning>
**Save the Metadata File**

Download and save the IdP metadata XML file before proceeding. This file cannot be easily retrieved later and is required to complete the SAML configuration in Prowler App.
</Warning>

### Step 3: Configure the Service Provider Details

Enter the following values obtained from the SAML SSO integration setup in Prowler App. For detailed instructions on where to find these values, refer to the [SAML SSO Configuration](/user-guide/tutorials/prowler-app-sso) page.

| Google Workspace Field | Prowler Value |
|------------------------|---------------|
| **ACS URL** | The Assertion Consumer Service (ACS) URL shown in Prowler App. Self-hosted deployments will have a different base URL. |
| **Entity ID** | The Audience URI (Entity ID) shown in Prowler App |
| **Name ID format** | `EMAIL` |
| **Name ID** | `Basic Information > Primary email` |

Additionally:

- Enable "Sign SAML response and assertion" (both must be enabled).
- Click "Continue".

### Step 4: Configure Attribute Mapping

<Info>
**Dynamic Updates**

Prowler App updates user attributes each time a user logs in. Any changes made in the IdP will be reflected on the next login.
</Info>

To correctly provision users, configure the IdP to send the following attributes in the SAML assertion by clicking "Add mapping" for each entry:

| Google Directory Attribute | App Attribute (SAML) | Required | Notes |
|----------------------------|----------------------|----------|-------|
| `First name` | `firstName` | Yes | |
| `Last name` | `lastName` | Yes | |
| `Department` (or any custom attribute) | `userType` | No | Determines the Prowler role. **Case-sensitive.** |
| `Organization name` | `companyName` | No | Company name in Prowler profile. |

Click "Finish".

<Warning>
**Role Assignment via `userType`**

The `userType` attribute controls which Prowler role is assigned to the user:

- If `userType` matches an existing Prowler role name, the user receives that role automatically.
- If `userType` does not match any existing role, Prowler App creates a new role with that name **without permissions**.
- If `userType` is not set, the user receives the `no_permissions` role.

In all cases where the resulting role has no permissions, a Prowler administrator must configure the appropriate permissions through the [RBAC Management](/user-guide/tutorials/prowler-app-rbac) tab. The `userType` value is **case-sensitive** — for example, `Admin` and `admin` are treated as different roles.
</Warning>
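The role-resolution rules above can be sketched in Python. This is a hypothetical illustration of the documented behavior, not Prowler's actual implementation; `existing_roles` is an assumed input holding the role names already defined in the tenant.

```python
# Hypothetical sketch of the userType role-resolution rules; not Prowler code.
def resolve_role(user_type, existing_roles):
    """Return (role_name, newly_created) using a case-sensitive match."""
    if not user_type:
        # No userType attribute in the assertion: fall back to no_permissions.
        return "no_permissions", False
    if user_type in existing_roles:
        # Exact, case-sensitive match: the user receives the existing role.
        return user_type, False
    # No match: a new role with that name is created without permissions.
    return user_type, True
```

Note that `resolve_role("admin", {"Admin"})` creates a new `admin` role rather than matching `Admin`, which is exactly the case-sensitivity pitfall the warning describes.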

### Step 5: Enable the App for Users

1. Return to **Apps > Web and mobile apps** and select the newly created SAML app.
2. Click "User access".
3. Set the service status to **ON for everyone**, or enable it for specific organizational units or groups.
4. Click "Save".

<Info>
**Propagation Delay**

Changes to the app status can take up to 24 hours to propagate across Google Workspace, although they typically take effect within a few minutes.
</Info>

---

## Part B — Prowler App Configuration

### Step 1: Access Profile Settings

Navigate to the profile settings page:

- **Prowler Cloud**: `https://cloud.prowler.com/profile`
- **Self-hosted**: `http://{your-domain}/profile`

### Step 2: Enable SAML SSO

1. Find the "SAML SSO Integration" card.
2. Click "Enable". This reveals the **ACS URL** and **Audience URI (Entity ID)** that Google Admin Console needs for Part A, Step 3. Copy these values before proceeding.

### Step 3: Upload Metadata and Configure Email Domain

1. Enter the **email domain** for the organization (e.g., `acme.com`). Prowler App uses this to identify users who should authenticate via SAML.
2. Upload the **metadata XML file** downloaded in Part A, Step 2.
3. Click "Save".

### Step 4: Verify Active Status

The "SAML Integration" card should now display an **"Active"** status, indicating that the configuration is complete.

---
## Testing the Integration

### SP-Initiated SSO (from Prowler)

1. Navigate to the Prowler login page.
2. Click "Continue with SAML SSO".
3. Enter an email from the configured domain (e.g., `user@acme.com`).
4. The browser redirects to Google for authentication and returns to Prowler upon success.

### IdP-Initiated SSO (from Google)

1. Open the Google Workspace app launcher (the grid icon in the top-right corner of any Google page).
2. Click the Prowler Cloud app tile.
3. The browser redirects directly to Prowler App, authenticated.

For more information on the SSO login flows, refer to the [SAML SSO Configuration](/user-guide/tutorials/prowler-app-sso#idp-initiated-sso) page.

---

## Troubleshooting

<Warning>
**User Lockout After Misconfiguration**

If SAML is configured with incorrect metadata or an incorrect domain, users who authenticated via SAML cannot fall back to password login. A Prowler administrator must remove the SAML configuration via the API:

```bash
curl -X DELETE 'https://api.prowler.com/api/v1/saml-config' \
  -H 'Authorization: Bearer <ADMIN_TOKEN>' \
  -H 'Accept: application/vnd.api+json'
```

After removal, affected users must reset their password to regain access using standard email and password login. This also applies when SAML is intentionally removed — all SAML-authenticated users will need to reset their password. For more details, refer to the [SAML API Reference](/user-guide/tutorials/prowler-app-sso#saml-api-reference). For additional support, contact [Prowler Support](https://docs.prowler.com/user-guide/contact-support).
</Warning>
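The same removal call can be scripted. A minimal Python sketch using only the standard library, with the same endpoint and headers as the `curl` example above (`<ADMIN_TOKEN>` remains a placeholder; the request is built here but not sent):

```python
import urllib.request

# Same endpoint and headers as the curl command above; <ADMIN_TOKEN> is a
# placeholder for a real administrator API token.
req = urllib.request.Request(
    url="https://api.prowler.com/api/v1/saml-config",
    method="DELETE",
    headers={
        "Authorization": "Bearer <ADMIN_TOKEN>",
        "Accept": "application/vnd.api+json",
    },
)

# To actually send it: urllib.request.urlopen(req)
```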

<Info>
**Email Domain Uniqueness**

Prowler does not allow two tenants to share the same email domain. If the domain is already associated with another tenant, the configuration will fail. This is by design to prevent authentication ambiguity.
</Info>

<Info>
**Just-in-Time Provisioning**

Users who authenticate via SAML for the first time are automatically created in Prowler App. No prior invitation is needed. User attributes (`firstName`, `lastName`, `userType`) are updated on every login from the Google directory.
</Info>

---

## Quick Summary

1. In **Google Admin Console**, create a custom SAML app using the ACS URL and Entity ID from Prowler App.
2. Configure **attribute mapping**: `firstName`, `lastName`, and optionally `userType`.
3. **Download the metadata XML** from Google.
4. In **Prowler App**, upload the metadata XML and set the **email domain**.
5. **Activate the app** in Google Workspace for the relevant users or groups.
6. Test login via "Continue with SAML SSO" on the Prowler login page.
@@ -75,7 +75,7 @@ Choose a Method:
<Info>
**IdP Configuration**

The exact steps for configuring an IdP vary depending on the provider (Okta, Azure AD, Google Workspace, etc.). Please refer to the IdP's documentation for instructions on creating a SAML application. For SSO integration with Azure AD / Entra ID, see our [Entra ID configuration instructions](/user-guide/tutorials/prowler-app-sso-entra). For Google Workspace, see our [Google Workspace configuration instructions](/user-guide/tutorials/prowler-app-sso-google-workspace).
The exact steps for configuring an IdP vary depending on the provider (Okta, Azure AD, etc.). Please refer to the IdP's documentation for instructions on creating a SAML application. For SSO integration with Azure AD / Entra ID, see our [Entra ID configuration instructions](/user-guide/tutorials/prowler-app-sso-entra).
</Info>
@@ -18,6 +18,7 @@ from prowler.config.config import (
    json_asff_file_suffix,
    json_ocsf_file_suffix,
    orange_color,
    sarif_file_suffix,
)
from prowler.lib.banner import print_banner
from prowler.lib.check.check import (

@@ -69,11 +70,11 @@ from prowler.lib.outputs.compliance.cis.cis_gcp import GCPCIS
from prowler.lib.outputs.compliance.cis.cis_github import GithubCIS
from prowler.lib.outputs.compliance.cis.cis_googleworkspace import GoogleWorkspaceCIS
from prowler.lib.outputs.compliance.cis.cis_kubernetes import KubernetesCIS
from prowler.lib.outputs.compliance.cis.cis_m365 import M365CIS
from prowler.lib.outputs.compliance.cis.cis_oraclecloud import OracleCloudCIS
from prowler.lib.outputs.compliance.cisa_scuba.cisa_scuba_googleworkspace import (
    GoogleWorkspaceCISASCuBA,
)
from prowler.lib.outputs.compliance.compliance import display_compliance_table
from prowler.lib.outputs.compliance.csa.csa_alibabacloud import AlibabaCloudCSA
from prowler.lib.outputs.compliance.csa.csa_aws import AWSCSA

@@ -122,6 +123,7 @@ from prowler.lib.outputs.html.html import HTML
from prowler.lib.outputs.ocsf.ingestion import send_ocsf_to_api
from prowler.lib.outputs.ocsf.ocsf import OCSF
from prowler.lib.outputs.outputs import extract_findings_statistics, report
from prowler.lib.outputs.sarif.sarif import SARIF
from prowler.lib.outputs.slack.slack import Slack
from prowler.lib.outputs.summary_table import display_summary_table
from prowler.providers.alibabacloud.models import AlibabaCloudOutputOptions
@@ -546,6 +548,13 @@ def prowler():
            html_output.batch_write_data_to_file(
                provider=global_provider, stats=stats
            )
        if mode == "sarif":
            sarif_output = SARIF(
                findings=finding_outputs,
                file_path=f"(unknown){sarif_file_suffix}",
            )
            generated_outputs["regular"].append(sarif_output)
            sarif_output.batch_write_data_to_file()

    if getattr(args, "push_to_cloud", False):
        if not ocsf_output or not getattr(ocsf_output, "file_path", None):
@@ -110,6 +110,7 @@ json_file_suffix = ".json"
json_asff_file_suffix = ".asff.json"
json_ocsf_file_suffix = ".ocsf.json"
html_file_suffix = ".html"
sarif_file_suffix = ".sarif"
default_config_file_path = (
    f"{pathlib.Path(os.path.dirname(os.path.realpath(__file__)))}/config.yaml"
)

@@ -120,7 +121,7 @@ default_redteam_config_file_path = (
    f"{pathlib.Path(os.path.dirname(os.path.realpath(__file__)))}/llm_config.yaml"
)
encoding_format_utf_8 = "utf-8"
available_output_formats = ["csv", "json-asff", "json-ocsf", "html"]
available_output_formats = ["csv", "json-asff", "json-ocsf", "html", "sarif"]

# Prowler Cloud API settings
cloud_api_base_url = os.getenv("PROWLER_CLOUD_API_BASE_URL", "https://api.prowler.com")
@@ -354,6 +354,9 @@ class Finding(BaseModel):
                check_output, "resource_line_range", ""
            )
            output_data["framework"] = check_output.check_metadata.ServiceName
            output_data["raw"] = {
                "resource_line_range": output_data.get("resource_line_range", ""),
            }

        elif provider.type == "llm":
            output_data["auth_method"] = provider.auth_method
prowler/lib/outputs/sarif/__init__.py (new file, 0 lines)
prowler/lib/outputs/sarif/sarif.py (new file, 155 lines)
@@ -0,0 +1,155 @@
from json import dump
from typing import List

from prowler.config.config import prowler_version
from prowler.lib.logger import logger
from prowler.lib.outputs.finding import Finding
from prowler.lib.outputs.output import Output

SARIF_SCHEMA_URL = "https://json.schemastore.org/sarif-2.1.0.json"
SARIF_VERSION = "2.1.0"

SEVERITY_TO_SARIF_LEVEL = {
    "critical": "error",
    "high": "error",
    "medium": "warning",
    "low": "note",
    "informational": "note",
}

SEVERITY_TO_SECURITY_SEVERITY = {
    "critical": "9.0",
    "high": "7.0",
    "medium": "4.0",
    "low": "2.0",
    "informational": "0.0",
}


class SARIF(Output):
    """Generates SARIF 2.1.0 output compatible with GitHub Code Scanning."""

    def transform(self, findings: List[Finding]) -> None:
        rules = {}
        results = []

        for finding in findings:
            if finding.status != "FAIL":
                continue

            check_id = finding.metadata.CheckID
            severity = finding.metadata.Severity.lower()

            if check_id not in rules:
                rule = {
                    "id": check_id,
                    "name": check_id,
                    "shortDescription": {"text": finding.metadata.CheckTitle},
                    "fullDescription": {
                        "text": finding.metadata.Description or check_id
                    },
                    "help": {
                        "text": finding.metadata.Remediation.Recommendation.Text
                        or finding.metadata.Description
                        or check_id,
                    },
                    "defaultConfiguration": {
                        "level": SEVERITY_TO_SARIF_LEVEL.get(severity, "note"),
                    },
                    "properties": {
                        "tags": [
                            "security",
                            f"prowler/{finding.metadata.Provider}",
                            f"severity/{severity}",
                        ],
                        "security-severity": SEVERITY_TO_SECURITY_SEVERITY.get(
                            severity, "0.0"
                        ),
                    },
                }
                if finding.metadata.RelatedUrl:
                    rule["helpUri"] = finding.metadata.RelatedUrl
                rules[check_id] = rule

            rule_index = list(rules.keys()).index(check_id)
            result = {
                "ruleId": check_id,
                "ruleIndex": rule_index,
                "level": SEVERITY_TO_SARIF_LEVEL.get(severity, "note"),
                "message": {
                    "text": finding.status_extended or finding.metadata.CheckTitle
                },
            }

            location = self._build_location(finding)
            if location:
                result["locations"] = [location]

            results.append(result)

        sarif_document = {
            "$schema": SARIF_SCHEMA_URL,
            "version": SARIF_VERSION,
            "runs": [
                {
                    "tool": {
                        "driver": {
                            "name": "Prowler",
                            "version": prowler_version,
                            "informationUri": "https://prowler.com",
                            "rules": list(rules.values()),
                        },
                    },
                    "results": results,
                },
            ],
        }

        self._data = sarif_document

    def batch_write_data_to_file(self) -> None:
        try:
            if (
                getattr(self, "_file_descriptor", None)
                and not self._file_descriptor.closed
                and self._data
            ):
                dump(self._data, self._file_descriptor, indent=2)
                self._file_descriptor.close()
        except Exception as error:
            logger.error(
                f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )

    @staticmethod
    def _build_location(finding: Finding) -> dict:
        """Build a SARIF physicalLocation from a Finding.

        Uses resource_name as the artifact URI and resource_line_range
        (stored in finding.raw for IaC findings) for region info.
        """
        if not finding.resource_name:
            return {}

        location = {
            "physicalLocation": {
                "artifactLocation": {
                    "uri": finding.resource_name,
                },
            },
        }

        line_range = finding.raw.get("resource_line_range", "")
        if line_range and ":" in line_range:
            parts = line_range.split(":")
            try:
                start_line = int(parts[0])
                end_line = int(parts[1])
                location["physicalLocation"]["region"] = {
                    "startLine": start_line,
                    "endLine": end_line,
                }
            except (ValueError, IndexError):
                pass

        return location
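For context on the `security-severity` scores in the mapping above: GitHub Code Scanning buckets that SARIF rule property into alert severities. A standalone sketch of the thresholds as GitHub documents them (the function name is ours, and the exact ranges should be verified against current GitHub documentation):

```python
def github_alert_severity(security_severity: str) -> str:
    """Bucket a SARIF security-severity score the way GitHub Code Scanning
    does. Assumed thresholds per GitHub's docs: >= 9.0 critical,
    7.0-8.9 high, 4.0-6.9 medium, below 4.0 low."""
    score = float(security_severity)
    if score >= 9.0:
        return "critical"
    if score >= 7.0:
        return "high"
    if score >= 4.0:
        return "medium"
    return "low"
```

Under these thresholds, Prowler's "critical" (9.0), "high" (7.0), "medium" (4.0), and "low" (2.0) scores land in the matching GitHub buckets, while "informational" (0.0) surfaces as low.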

@@ -9,6 +9,7 @@ from prowler.config.config import (
    json_asff_file_suffix,
    json_ocsf_file_suffix,
    orange_color,
    sarif_file_suffix,
)
from prowler.lib.logger import logger
from prowler.providers.github.models import GithubAppIdentityInfo, GithubIdentityInfo

@@ -207,6 +208,10 @@ def display_summary_table(
            print(
                f" - HTML: {output_directory}/{output_filename}{html_file_suffix}"
            )
        if "sarif" in output_options.output_modes:
            print(
                f" - SARIF: {output_directory}/{output_filename}{sarif_file_suffix}"
            )

    else:
        print(
tests/lib/outputs/sarif/__init__.py (new file, 0 lines)
tests/lib/outputs/sarif/sarif_test.py (new file, 257 lines)
@@ -0,0 +1,257 @@
import json
import tempfile

import pytest

from prowler.lib.outputs.sarif.sarif import SARIF, SARIF_SCHEMA_URL, SARIF_VERSION
from tests.lib.outputs.fixtures.fixtures import generate_finding_output


class TestSARIF:
    def test_transform_fail_finding(self):
        finding = generate_finding_output(
            status="FAIL",
            status_extended="S3 bucket is not encrypted",
            severity="high",
            resource_name="main.tf",
            service_name="s3",
            check_id="s3_encryption_check",
            check_title="S3 Bucket Encryption",
        )
        sarif = SARIF(findings=[finding], file_path=None)

        assert sarif.data["$schema"] == SARIF_SCHEMA_URL
        assert sarif.data["version"] == SARIF_VERSION
        assert len(sarif.data["runs"]) == 1

        run = sarif.data["runs"][0]
        assert run["tool"]["driver"]["name"] == "Prowler"
        assert len(run["tool"]["driver"]["rules"]) == 1
        assert len(run["results"]) == 1

        rule = run["tool"]["driver"]["rules"][0]
        assert rule["id"] == "s3_encryption_check"
        assert rule["shortDescription"]["text"] == "S3 Bucket Encryption"
        assert rule["defaultConfiguration"]["level"] == "error"
        assert rule["properties"]["security-severity"] == "7.0"

        result = run["results"][0]
        assert result["ruleId"] == "s3_encryption_check"
        assert result["ruleIndex"] == 0
        assert result["level"] == "error"
        assert result["message"]["text"] == "S3 bucket is not encrypted"

    def test_transform_pass_finding_excluded(self):
        finding = generate_finding_output(status="PASS", severity="high")
        sarif = SARIF(findings=[finding], file_path=None)

        run = sarif.data["runs"][0]
        assert len(run["results"]) == 0
        assert len(run["tool"]["driver"]["rules"]) == 0

    def test_transform_muted_finding_included(self):
        # The SARIF transform filters on status only, so a muted FAIL
        # finding is still included in the results.
        finding = generate_finding_output(status="FAIL", severity="high", muted=True)
        sarif = SARIF(findings=[finding], file_path=None)
        run = sarif.data["runs"][0]
        assert len(run["results"]) == 1

    @pytest.mark.parametrize(
        "severity,expected_level,expected_security_severity",
        [
            ("critical", "error", "9.0"),
            ("high", "error", "7.0"),
            ("medium", "warning", "4.0"),
            ("low", "note", "2.0"),
            ("informational", "note", "0.0"),
        ],
    )
    def test_transform_severity_mapping(
        self, severity, expected_level, expected_security_severity
    ):
        finding = generate_finding_output(
            status="FAIL",
            severity=severity,
        )
        sarif = SARIF(findings=[finding], file_path=None)

        run = sarif.data["runs"][0]
        result = run["results"][0]
        rule = run["tool"]["driver"]["rules"][0]

        assert result["level"] == expected_level
        assert rule["defaultConfiguration"]["level"] == expected_level
        assert rule["properties"]["security-severity"] == expected_security_severity

    def test_transform_multiple_findings_dedup_rules(self):
        findings = [
            generate_finding_output(
                status="FAIL",
                resource_name="file1.tf",
                status_extended="Finding in file1",
            ),
            generate_finding_output(
                status="FAIL",
                resource_name="file2.tf",
                status_extended="Finding in file2",
            ),
        ]
        sarif = SARIF(findings=findings, file_path=None)

        run = sarif.data["runs"][0]
        assert len(run["tool"]["driver"]["rules"]) == 1
        assert len(run["results"]) == 2
        assert run["results"][0]["ruleIndex"] == 0
        assert run["results"][1]["ruleIndex"] == 0

    def test_transform_multiple_different_rules(self):
        findings = [
            generate_finding_output(
                status="FAIL",
                service_name="alpha",
                check_id="alpha_check_one",
                status_extended="Finding A",
            ),
            generate_finding_output(
                status="FAIL",
                service_name="beta",
                check_id="beta_check_two",
                status_extended="Finding B",
            ),
        ]
        sarif = SARIF(findings=findings, file_path=None)

        run = sarif.data["runs"][0]
        assert len(run["tool"]["driver"]["rules"]) == 2
        assert run["results"][0]["ruleIndex"] == 0
        assert run["results"][1]["ruleIndex"] == 1

    def test_transform_location_with_line_range(self):
        finding = generate_finding_output(
            status="FAIL",
            resource_name="modules/s3/main.tf",
        )
        finding.raw = {"resource_line_range": "10:25"}

        sarif = SARIF(findings=[finding], file_path=None)

        result = sarif.data["runs"][0]["results"][0]
        location = result["locations"][0]["physicalLocation"]
        assert location["artifactLocation"]["uri"] == "modules/s3/main.tf"
        assert location["region"]["startLine"] == 10
        assert location["region"]["endLine"] == 25

    def test_transform_location_without_line_range(self):
        finding = generate_finding_output(
            status="FAIL",
            resource_name="main.tf",
        )
        sarif = SARIF(findings=[finding], file_path=None)

        result = sarif.data["runs"][0]["results"][0]
        location = result["locations"][0]["physicalLocation"]
        assert location["artifactLocation"]["uri"] == "main.tf"
        assert "region" not in location

    def test_transform_no_resource_name(self):
        finding = generate_finding_output(
            status="FAIL",
            resource_name="",
        )
        sarif = SARIF(findings=[finding], file_path=None)

        result = sarif.data["runs"][0]["results"][0]
        assert "locations" not in result

    def test_batch_write_data_to_file(self):
        finding = generate_finding_output(
            status="FAIL",
            status_extended="test finding",
            resource_name="main.tf",
        )

        with tempfile.NamedTemporaryFile(
            mode="w", suffix=".sarif", delete=False
        ) as tmp:
            tmp_path = tmp.name

        sarif = SARIF(
            findings=[finding],
            file_path=tmp_path,
        )
        sarif.batch_write_data_to_file()

        with open(tmp_path) as f:
            content = json.load(f)

        assert content["$schema"] == SARIF_SCHEMA_URL
        assert content["version"] == SARIF_VERSION
        assert len(content["runs"][0]["results"]) == 1

    def test_sarif_schema_structure(self):
        finding = generate_finding_output(
            status="FAIL",
            severity="critical",
            resource_name="infra/main.tf",
            service_name="iac",
            check_id="iac_misconfig_check",
            check_title="IaC Misconfiguration",
            description="Checks for misconfigurations",
            remediation_recommendation_text="Fix the configuration",
        )
        finding.raw = {"resource_line_range": "5:15"}

        sarif = SARIF(findings=[finding], file_path=None)
        doc = sarif.data

        assert "$schema" in doc
        assert "version" in doc
        assert "runs" in doc

        run = doc["runs"][0]

        assert "tool" in run
        assert "driver" in run["tool"]
        driver = run["tool"]["driver"]
        assert "name" in driver
        assert "version" in driver
        assert "informationUri" in driver
        assert "rules" in driver

        rule = driver["rules"][0]
        assert "id" in rule
        assert "shortDescription" in rule
        assert "fullDescription" in rule
        assert "help" in rule
        assert "defaultConfiguration" in rule
        assert "properties" in rule
        assert "tags" in rule["properties"]
        assert "security-severity" in rule["properties"]

        result = run["results"][0]
        assert "ruleId" in result
        assert "ruleIndex" in result
        assert "level" in result
        assert "message" in result
        assert "locations" in result

        loc = result["locations"][0]["physicalLocation"]
        assert "artifactLocation" in loc
        assert "uri" in loc["artifactLocation"]
        assert "region" in loc
        assert "startLine" in loc["region"]
        assert "endLine" in loc["region"]

    def test_empty_findings(self):
        sarif = SARIF(findings=[], file_path=None)
        assert sarif.data == []

    def test_only_pass_findings(self):
        findings = [
            generate_finding_output(status="PASS"),
            generate_finding_output(status="PASS"),
        ]
        sarif = SARIF(findings=findings, file_path=None)

        run = sarif.data["runs"][0]
        assert len(run["results"]) == 0
        assert len(run["tool"]["driver"]["rules"]) == 0