diff --git a/docs/docs.json b/docs/docs.json
index 3ec056989f..6fcd387a07 100644
--- a/docs/docs.json
+++ b/docs/docs.json
@@ -304,6 +304,13 @@
"pages": [
"user-guide/compliance/tutorials/threatscore"
]
+ },
+ {
+ "group": "Cookbooks",
+ "pages": [
+ "user-guide/cookbooks/kubernetes-in-cluster",
+ "user-guide/cookbooks/cicd-pipeline"
+ ]
}
]
},
diff --git a/docs/user-guide/cookbooks/cicd-pipeline.mdx b/docs/user-guide/cookbooks/cicd-pipeline.mdx
new file mode 100644
index 0000000000..1c1608d75e
--- /dev/null
+++ b/docs/user-guide/cookbooks/cicd-pipeline.mdx
@@ -0,0 +1,239 @@
+---
+title: 'Run Prowler in CI/CD and Send Findings to Prowler Cloud'
+---
+
+This cookbook demonstrates how to integrate Prowler into CI/CD pipelines so that security scans run automatically and findings are sent to Prowler Cloud via [Import Findings](/user-guide/tutorials/prowler-app-import-findings). Examples cover GitHub Actions and GitLab CI.
+
+## Prerequisites
+
+* A **Prowler Cloud** account with an active subscription (see [Prowler Cloud Pricing](https://prowler.com/pricing))
+* A Prowler Cloud **API key** with the **Manage Ingestions** permission (see [API Keys](/user-guide/tutorials/prowler-app-api-keys))
+* Cloud provider credentials configured in the CI/CD environment (e.g., AWS credentials for scanning AWS accounts)
+* Access to configure pipeline workflows and secrets in the CI/CD platform
+
+## Key Concepts
+
+Prowler CLI provides the `--push-to-cloud` flag, which uploads scan results directly to Prowler Cloud after a scan completes. Combined with the `PROWLER_CLOUD_API_KEY` environment variable, this enables fully automated ingestion without manual file uploads.
+
+For full details on the flag and API, refer to the [Import Findings](/user-guide/tutorials/prowler-app-import-findings) documentation.
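+
+Before wiring this into a pipeline, the flow can be exercised from a local shell (assuming Prowler is installed and provider credentials are configured; the key value is a placeholder):
+
+```bash
+export PROWLER_CLOUD_API_KEY="pk_your_api_key_here"
+prowler aws --push-to-cloud
+```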
+
+## GitHub Actions
+
+### Store Secrets
+
+Before creating the workflow, add the following secrets to the repository (under "Settings" > "Secrets and variables" > "Actions"):
+
+* `PROWLER_CLOUD_API_KEY` — the Prowler Cloud API key
+* Cloud provider credentials (e.g., `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`, or configure OIDC-based role assumption)
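+
+With the GitHub CLI installed and authenticated, the API key secret can also be set from a terminal instead of the web UI (a sketch; `gh` prompts for the secret value):
+
+```bash
+gh secret set PROWLER_CLOUD_API_KEY
+```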
+
+### Workflow: Scheduled AWS Scan
+
+This workflow runs Prowler against an AWS account on a daily schedule and on every push to the `main` branch:
+
+```yaml
+name: Prowler Security Scan
+
+on:
+ schedule:
+ - cron: "0 3 * * *" # Daily at 03:00 UTC
+ push:
+ branches: [main]
+ workflow_dispatch: # Allow manual triggers
+
+permissions:
+ id-token: write # Required for OIDC
+ contents: read
+
+jobs:
+ prowler-scan:
+ runs-on: ubuntu-latest
+ steps:
+ - name: Configure AWS Credentials
+ uses: aws-actions/configure-aws-credentials@v4
+ with:
+ role-to-assume: arn:aws:iam::123456789012:role/ProwlerScanRole
+ aws-region: us-east-1
+
+ - name: Install Prowler
+ run: pip install prowler
+
+ - name: Run Prowler Scan
+ env:
+ PROWLER_CLOUD_API_KEY: ${{ secrets.PROWLER_CLOUD_API_KEY }}
+ run: |
+ prowler aws --push-to-cloud
+```
+
+<Note>
+Replace `123456789012` with the actual AWS account ID and `ProwlerScanRole` with the IAM role name. For IAM role setup, refer to the [AWS authentication guide](/user-guide/providers/aws/authentication).
+</Note>
+
+### Workflow: Scan Specific Services on Pull Request
+
+To run targeted scans on pull requests without blocking the merge pipeline, use `continue-on-error`:
+
+```yaml
+name: Prowler PR Check
+
+on:
+ pull_request:
+ branches: [main]
+
+permissions:
+  id-token: write # Required for OIDC
+  contents: read
+
+jobs:
+ prowler-scan:
+ runs-on: ubuntu-latest
+ continue-on-error: true
+ steps:
+ - name: Configure AWS Credentials
+ uses: aws-actions/configure-aws-credentials@v4
+ with:
+ role-to-assume: arn:aws:iam::123456789012:role/ProwlerScanRole
+ aws-region: us-east-1
+
+ - name: Install Prowler
+ run: pip install prowler
+
+ - name: Run Prowler Scan
+ env:
+ PROWLER_CLOUD_API_KEY: ${{ secrets.PROWLER_CLOUD_API_KEY }}
+ run: |
+          prowler aws --service s3 iam ec2 --push-to-cloud
+```
+
+<Note>
+Limiting the scan to specific services with `--service` reduces execution time, making it practical for pull request checks.
+</Note>
+
+## GitLab CI
+
+### Store Variables
+
+Add the following CI/CD variables in the GitLab project (under "Settings" > "CI/CD" > "Variables"):
+
+* `PROWLER_CLOUD_API_KEY` — mark as **masked** and **protected**
+* Cloud provider credentials as needed
+
+### Pipeline: Scheduled AWS Scan
+
+Add the following to `.gitlab-ci.yml`:
+
+```yaml
+prowler-scan:
+ image: python:3.12-slim
+ stage: test
+ script:
+ - pip install prowler
+ - prowler aws --push-to-cloud
+ variables:
+ PROWLER_CLOUD_API_KEY: $PROWLER_CLOUD_API_KEY
+ AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
+ AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
+ AWS_DEFAULT_REGION: "us-east-1"
+ rules:
+ - if: $CI_PIPELINE_SOURCE == "schedule"
+ - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
+ when: manual
+```
+
+To run the scan on a schedule, create a **Pipeline Schedule** in GitLab (under "Build" > "Pipeline Schedules") with the desired cron expression.
+
+### Pipeline: Multi-Provider Scan
+
+To scan multiple cloud providers in parallel:
+
+```yaml
+stages:
+ - security
+
+.prowler-base:
+ image: python:3.12-slim
+ stage: security
+ before_script:
+ - pip install prowler
+ rules:
+ - if: $CI_PIPELINE_SOURCE == "schedule"
+
+prowler-aws:
+ extends: .prowler-base
+ script:
+ - prowler aws --push-to-cloud
+ variables:
+ PROWLER_CLOUD_API_KEY: $PROWLER_CLOUD_API_KEY
+ AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
+ AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
+
+prowler-gcp:
+ extends: .prowler-base
+ script:
+ - prowler gcp --push-to-cloud
+ variables:
+ PROWLER_CLOUD_API_KEY: $PROWLER_CLOUD_API_KEY
+ GOOGLE_APPLICATION_CREDENTIALS: $GCP_SERVICE_ACCOUNT_KEY
+```
+
+Store `GCP_SERVICE_ACCOUNT_KEY` as a **File**-type CI/CD variable so that `GOOGLE_APPLICATION_CREDENTIALS` expands to the path of the key file rather than its raw contents.
+
+## Tips and Best Practices
+
+### When to Run Scans
+
+* **Scheduled scans** (daily or weekly) provide continuous monitoring and are ideal for baseline security assessments
+* **On-merge scans** catch configuration changes introduced by new code
+* **Pull request scans** provide early feedback but should target specific services to keep execution times reasonable
+
+### Handling Scan Failures
+
+By default, Prowler exits with a non-zero code when it finds failing checks. This causes the CI/CD job to fail. To prevent scan results from blocking the pipeline:
+
+* **GitHub Actions**: Add `continue-on-error: true` to the job
+* **GitLab CI**: Add `allow_failure: true` to the job
+* **Any platform**: Pass `--ignore-exit-code-3` (or `-z`) to Prowler itself so failed checks do not produce a non-zero exit code
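+
+For example, the GitLab job from the pipeline above becomes non-blocking with one extra key (a minimal fragment to merge into the existing job definition):
+
+```yaml
+prowler-scan:
+  allow_failure: true # surface findings without failing the pipeline
+```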
+
+<Note>
+Ingestion failures (e.g., network issues reaching Prowler Cloud) do not affect the Prowler exit code. The scan completes normally and only a warning is emitted. See [Import Findings troubleshooting](/user-guide/tutorials/prowler-app-import-findings#troubleshooting) for details.
+</Note>
+
+### Caching Prowler Installation
+
+For faster pipeline runs, cache the Prowler installation:
+
+**GitHub Actions:**
+```yaml
+- name: Cache pip packages
+ uses: actions/cache@v4
+ with:
+ path: ~/.cache/pip
+ key: ${{ runner.os }}-pip-prowler
+ restore-keys: ${{ runner.os }}-pip-
+
+- name: Install Prowler
+ run: pip install prowler
+```
+
+**GitLab CI:**
+```yaml
+prowler-scan:
+ cache:
+ paths:
+ - .cache/pip
+ variables:
+ PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"
+```
+
+### Output Formats
+
+To generate additional report formats alongside the cloud upload:
+
+```bash
+prowler aws --push-to-cloud -M csv html -o /tmp/prowler-reports
+```
+
+This produces CSV and HTML files locally while also pushing OCSF findings to Prowler Cloud. The local files can be stored as CI/CD artifacts for archival purposes.
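+
+In GitHub Actions, for example, those local reports can be archived with `actions/upload-artifact` (a sketch; step and artifact names are illustrative):
+
+```yaml
+- name: Run Prowler Scan
+  env:
+    PROWLER_CLOUD_API_KEY: ${{ secrets.PROWLER_CLOUD_API_KEY }}
+  run: prowler aws --push-to-cloud -M csv html -o prowler-reports
+
+- name: Upload Prowler Reports
+  if: always() # keep reports even when failing checks produce a non-zero exit code
+  uses: actions/upload-artifact@v4
+  with:
+    name: prowler-reports
+    path: prowler-reports/
+```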
+
+### Scanning Multiple AWS Accounts
+
+To scan multiple accounts sequentially in a single job, use [role assumption](/user-guide/providers/aws/role-assumption):
+
+```bash
+prowler aws -R arn:aws:iam::111111111111:role/ProwlerScanRole --push-to-cloud
+prowler aws -R arn:aws:iam::222222222222:role/ProwlerScanRole --push-to-cloud
+```
+
+Each scan run creates a separate ingestion job in Prowler Cloud.
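+
+For a longer account list, the same pattern can be scripted inside the CI job (a sketch; the account IDs and role name are placeholders):
+
+```bash
+for account in 111111111111 222222222222 333333333333; do
+  prowler aws -R "arn:aws:iam::${account}:role/ProwlerScanRole" --push-to-cloud
+done
+```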
diff --git a/docs/user-guide/cookbooks/kubernetes-in-cluster.mdx b/docs/user-guide/cookbooks/kubernetes-in-cluster.mdx
new file mode 100644
index 0000000000..fe18496542
--- /dev/null
+++ b/docs/user-guide/cookbooks/kubernetes-in-cluster.mdx
@@ -0,0 +1,207 @@
+---
+title: 'Run Kubernetes In-Cluster and Send Findings to Prowler Cloud'
+---
+
+This cookbook walks through deploying Prowler inside a Kubernetes cluster on a recurring schedule and automatically sending findings to Prowler Cloud via [Import Findings](/user-guide/tutorials/prowler-app-import-findings). By the end, security scan results from the cluster appear in Prowler Cloud without any manual file uploads.
+
+## Prerequisites
+
+* A **Prowler Cloud** account with an active subscription (see [Prowler Cloud Pricing](https://prowler.com/pricing))
+* A Prowler Cloud **API key** with the **Manage Ingestions** permission (see [API Keys](/user-guide/tutorials/prowler-app-api-keys))
+* Access to a Kubernetes cluster with `kubectl` configured
+* Permissions to create ServiceAccounts, Roles, RoleBindings, Secrets, and CronJobs in the cluster
+
+## Step 1: Create the ServiceAccount and RBAC Resources
+
+Prowler needs a ServiceAccount with read access to cluster resources. Apply the manifests from the [`kubernetes` directory](https://github.com/prowler-cloud/prowler/tree/master/kubernetes) of the Prowler repository:
+
+```console
+kubectl apply -f kubernetes/prowler-sa.yaml
+kubectl apply -f kubernetes/prowler-role.yaml
+kubectl apply -f kubernetes/prowler-rolebinding.yaml
+```
+
+This creates:
+
+* A `prowler-sa` ServiceAccount in the `prowler-ns` namespace
+* A ClusterRole with the read permissions Prowler requires
+* A ClusterRoleBinding linking the ServiceAccount to the role
+
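+To confirm the ServiceAccount has the expected read access before scheduling anything, `kubectl auth can-i` can impersonate it (the resource names below assume the manifests were applied unmodified):
+
+```console
+kubectl auth can-i list pods --as=system:serviceaccount:prowler-ns:prowler-sa
+kubectl auth can-i get nodes --as=system:serviceaccount:prowler-ns:prowler-sa
+```
+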
+For more details on these resources, refer to [Getting Started with Kubernetes](/user-guide/providers/kubernetes/getting-started-k8s).
+
+## Step 2: Store the Prowler Cloud API Key as a Secret
+
+Create a Kubernetes Secret to hold the API key securely:
+
+```console
+kubectl create secret generic prowler-cloud-api-key \
+ --from-literal=api-key=pk_your_api_key_here \
+ --namespace prowler-ns
+```
+
+Replace `pk_your_api_key_here` with the actual API key from Prowler Cloud.
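+
+To verify the Secret was stored correctly (this prints the key, so run it only in a trusted terminal):
+
+```console
+kubectl get secret prowler-cloud-api-key -n prowler-ns \
+  -o jsonpath='{.data.api-key}' | base64 -d
+```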
+
+<Note>
+Avoid embedding the API key directly in the CronJob manifest. Using a Kubernetes Secret keeps credentials out of version control and pod specs.
+</Note>
+
+## Step 3: Create the CronJob Manifest
+
+The CronJob runs Prowler on a schedule, scanning the cluster and pushing findings to Prowler Cloud with the `--push-to-cloud` flag.
+
+Create a file named `prowler-cronjob.yaml`:
+
+```yaml
+apiVersion: batch/v1
+kind: CronJob
+metadata:
+ name: prowler-k8s-scan
+ namespace: prowler-ns
+spec:
+ schedule: "0 2 * * *" # Runs daily at 02:00 UTC
+ concurrencyPolicy: Forbid
+ jobTemplate:
+ spec:
+ backoffLimit: 1
+ template:
+ metadata:
+ labels:
+ app: prowler
+ spec:
+ serviceAccountName: prowler-sa
+ containers:
+ - name: prowler
+ image: toniblyx/prowler:stable
+            args:
+              - "kubernetes"
+              - "--cluster-name"
+              - "my-cluster"
+              - "--push-to-cloud"
+            env:
+              - name: PROWLER_CLOUD_API_KEY
+                valueFrom:
+                  secretKeyRef:
+                    name: prowler-cloud-api-key
+                    key: api-key
+ imagePullPolicy: Always
+ volumeMounts:
+ - name: var-lib-cni
+ mountPath: /var/lib/cni
+ readOnly: true
+ - name: var-lib-etcd
+ mountPath: /var/lib/etcd
+ readOnly: true
+ - name: var-lib-kubelet
+ mountPath: /var/lib/kubelet
+ readOnly: true
+ - name: etc-kubernetes
+ mountPath: /etc/kubernetes
+ readOnly: true
+ hostPID: true
+ restartPolicy: Never
+ volumes:
+ - name: var-lib-cni
+ hostPath:
+ path: /var/lib/cni
+ - name: var-lib-etcd
+ hostPath:
+ path: /var/lib/etcd
+ - name: var-lib-kubelet
+ hostPath:
+ path: /var/lib/kubelet
+ - name: etc-kubernetes
+ hostPath:
+ path: /etc/kubernetes
+```
+
+<Note>
+Replace `my-cluster` with a meaningful name for the cluster. This value appears in Prowler Cloud reports and helps identify the source of findings. See the `--cluster-name` flag documentation in [Getting Started with Kubernetes](/user-guide/providers/kubernetes/getting-started-k8s) for more details.
+</Note>
+
+### Customizing the Schedule
+
+The `schedule` field uses standard cron syntax. Common examples:
+
+* `"0 2 * * *"` — daily at 02:00 UTC
+* `"0 */6 * * *"` — every 6 hours
+* `"0 2 * * 1"` — weekly on Mondays at 02:00 UTC
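+
+On Kubernetes v1.27 and later, the CronJob `timeZone` field can pin the schedule to a specific time zone instead of the cluster default (a sketch; the zone is illustrative):
+
+```yaml
+spec:
+  schedule: "0 2 * * *"
+  timeZone: "Europe/Madrid"
+```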
+
+### Scanning Specific Namespaces
+
+To limit the scan to specific namespaces, add the `--namespace` flag to the `args` array:
+
+```yaml
+args:
+ - "kubernetes"
+ - "--push-to-cloud"
+ - "--namespace"
+  - "production"
+  - "staging"
+```
+
+## Step 4: Deploy and Verify
+
+Apply the CronJob to the cluster:
+
+```console
+kubectl apply -f prowler-cronjob.yaml
+```
+
+To trigger an immediate test run without waiting for the schedule:
+
+```console
+kubectl create job prowler-test-run --from=cronjob/prowler-k8s-scan -n prowler-ns
+```
+
+Monitor the job execution:
+
+```console
+kubectl get pods -n prowler-ns -l app=prowler --watch
+```
+
+Check the logs to confirm findings were pushed successfully:
+
+```console
+kubectl logs -n prowler-ns -l app=prowler --tail=50
+```
+
+A successful upload produces output similar to:
+
+```
+Pushing findings to Prowler Cloud, please wait...
+
+Findings successfully pushed to Prowler Cloud. Ingestion job: fa8bc8c5-4925-46a0-9fe0-f6575905e094
+See more details here: https://cloud.prowler.com/scans
+```
+
+## Step 5: View Findings in Prowler Cloud
+
+Once the job completes and findings are pushed:
+
+1. Navigate to [Prowler Cloud](https://cloud.prowler.com/)
+2. Open the "Scans" section to verify the ingestion job status
+3. Browse findings under the Kubernetes provider
+
+For details on the ingestion workflow and status tracking, refer to the [Import Findings](/user-guide/tutorials/prowler-app-import-findings) documentation.
+
+## Tips and Troubleshooting
+
+* **Resource limits**: For large clusters, consider setting `resources.requests` and `resources.limits` on the container to prevent the scan from consuming excessive cluster resources.
+* **Network policies**: Ensure the Prowler pod can reach `api.prowler.com` over HTTPS (port 443). Adjust NetworkPolicies or egress rules if needed.
+* **Job history**: A CronJob keeps only a limited history of completed and failed Jobs (by default, 3 successful and 1 failed). Set `successfulJobsHistoryLimit` and `failedJobsHistoryLimit` in the CronJob spec to adjust retention:
+
+ ```yaml
+ spec:
+ successfulJobsHistoryLimit: 3
+ failedJobsHistoryLimit: 1
+ ```
+
+* **API key rotation**: When rotating the API key, re-create the Secret. The key is injected as an environment variable, so already-running pods keep the old value and the next scheduled run picks up the new one:
+
+ ```console
+ kubectl delete secret prowler-cloud-api-key -n prowler-ns
+ kubectl create secret generic prowler-cloud-api-key \
+ --from-literal=api-key=pk_new_api_key_here \
+ --namespace prowler-ns
+ ```
+
+* **Failed uploads**: If the push to Prowler Cloud fails, the scan still completes and findings are saved locally in the container. Check the [Import Findings troubleshooting section](/user-guide/tutorials/prowler-app-import-findings#troubleshooting) for common error messages.
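+
+As a final note on key rotation: the delete-and-recreate step above can also be done in one atomic step with a client-side dry run piped to `kubectl apply`, avoiding a window where the Secret does not exist:
+
+```console
+kubectl create secret generic prowler-cloud-api-key \
+  --from-literal=api-key=pk_new_api_key_here \
+  --namespace prowler-ns \
+  --dry-run=client -o yaml | kubectl apply -f -
+```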