Compare commits

...

57 Commits

Author SHA1 Message Date
Pepe Fagoaga
f70cf8d81e fix(ci): Release edited (#1276) 2022-07-21 17:44:26 +02:00
Pepe Fagoaga
83b6c79203 fix(ci): Remove check-update (#1275) 2022-07-21 17:33:28 +02:00
Andrew
1192c038b2 docs(readme): Fix spelling errors (#1274) 2022-07-21 17:06:03 +02:00
Pepe Fagoaga
4ebbf6553e chore(release): 2.11.0 (#1272) 2022-07-21 10:48:32 +02:00
r8bhavneet
c501d63382 docs(readme): Fix spelling (#1271) 2022-07-21 10:42:40 +02:00
Toni de la Fuente
72d6d3f535 feat(inventory): Prowler quick inventory including IAM resources (#1258)
* chore(inventory): option included in main

* chore(inventory): quick inventory

* chore(inventory): functional version

* chore(inventory): functional version without echo

* Update include/quick_inventory

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* Update prowler

Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>

* Added new line at report line

* Added more information from IAM

Co-authored-by: Pepe Fagoaga <pepe@verica.io>
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2022-07-21 10:37:28 +02:00
Mitch
ddd34dc9cc fix(extra7173): Correct check and alternative name (#1270) 2022-07-20 08:36:34 +02:00
Sergio Garcia
03b1c10d13 fix(codebuild): expired token error using Instance Metadata
Co-authored-by: sergargar <sergio@verica.io>
2022-07-14 07:32:01 +02:00
Sergio Garcia
4cd5b8fd04 fix(codebuild): expired token error (#1262) 2022-07-12 07:38:44 +02:00
Phil Massyn
f0ce17182b feat(ecr_lifecycle): Check Lifecycle policy (#1260)
* Create checks_7194

ECR Repositories contain docker containers.  When automated processes create containers, the old ones tend to take up space.  With a lot of containers on the system, the account owner will be paying additional fees for images that are no longer in use.  By defining a lifecycle policy, a best practice is followed by reducing the total volume of data being consumed.

* Minor changes

* fix: Include bash header

Co-authored-by: n4ch04 <nachor1992@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2022-07-11 13:03:31 +02:00
Sergio Garcia
2a8a7d844b fix(apigatewayv2): handle BadRequestException (#1261)
Co-authored-by: sergargar <sergio@verica.io>
2022-07-11 12:21:39 +02:00
Pepe Fagoaga
ff33f426e5 docs(readme): Update inventory and checks (#1257)
* docs(readme): Update inventory and checks

* docs(readme): inventory path

Co-authored-by: Toni de la Fuente <toni@blyx.com>

* Update README.md

Co-authored-by: Toni de la Fuente <toni@blyx.com>

Co-authored-by: Toni de la Fuente <toni@blyx.com>
2022-07-08 12:42:46 +02:00
Toni de la Fuente
f691046c1f feat(inventory): Prowler quick inventory (#1245)
* chore(inventory): option included in main

* chore(inventory): quick inventory

* chore(inventory): functional version

* chore(inventory): functional version without echo

* Update include/quick_inventory

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* Update prowler

Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>

* Added new line at report line

Co-authored-by: Pepe Fagoaga <pepe@verica.io>
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2022-07-08 12:41:54 +02:00
Pepe Fagoaga
9fad8735b8 fix(Dockerfile): Prowler path (#1254) 2022-07-07 10:03:07 +02:00
Pepe Fagoaga
c632055517 fix(dockerfile): Python path (#1250) 2022-07-06 07:54:37 +02:00
Sergio Garcia
fd850790d5 fix(add-checks-regions): Missing regions in checks (#1247)
* add regions to checks

* add root as resource

Co-authored-by: sergargar <sergio@verica.io>
2022-07-04 09:46:08 +02:00
Sergio Garcia
912d5d7f8c fix(postgres): Fix postgres connector issues. (#1244)
* fix(postgres): Fix postgres connector issues.

* fix(postgres): Update documentation

Co-authored-by: sergargar <sergio@verica.io>
2022-06-30 18:12:33 +02:00
Pepe Fagoaga
d88a136ac3 feat(db-connector): Include env variables (#1236)
* feat(db-connector): Include env variables

* fix(typo)

* fix(psql-test): Remove PGPASSWORD
2022-06-30 08:43:41 +02:00
Pepe Fagoaga
172484cf08 feat(dockerfile): Include psql client in the Prowler scanner image (#1238)
* fix(dockerignore): Include files

* fix(dockerfile): Keep python2 and organize

* feat(db-connector): Include postgres dependencies

* feat(dockerfile): Include hadolint pre-commit
2022-06-30 08:28:29 +02:00
Pepe Fagoaga
821083639a fix(bckCredentials): Do nothing if no initial creds (#1239) 2022-06-29 16:52:08 +02:00
rajarshidas
e4f0f3ec87 feat(check): Ensure default internet access from Amazon AppStream fleet should be disabled. (#1233)
Co-authored-by: sergargar <sergio@verica.io>
2022-06-29 12:51:58 +02:00
rajarshidas
cc6302f7b8 feat(checks): Amazon AppStream checks (#1216)
Co-authored-by: sergargar <sergio@verica.io>
2022-06-29 12:31:42 +02:00
Bayron Carranza
c89fd82856 feat(check7164): 365 days or more in a Cloudwatch log retention should be consider PASS (#1240)
* 365 DAYS or More Retention log group in cloudwatch

* fix(extra7162): Fix comparison errors

Also include minor changes to texts

* fix(extra7162): Set as Pass log groups that never expires

* fix(typo)

Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2022-06-28 08:58:41 +02:00
Pepe Fagoaga
0e29a92d42 fix(extra7162): Query AWS log groups using LOG_GROUP_RETENTION_PERIOD_DAYS (#1232) 2022-06-27 09:18:39 +02:00
Sergio Garcia
835d8ffe5d feat(Actions): Update refresh_aws_services_regions.yml (#1227) 2022-06-23 11:21:50 +02:00
Sergio Garcia
21ee2068a6 feat(actions): Create refresh_aws_services_regions.yml (#1225) 2022-06-23 11:07:26 +02:00
Sergio Garcia
0ad149942b fix(security_hub_integration): Treat failed findings as failed in Security Hub (#1219)
Co-authored-by: sergargar <sergio@verica.io>
2022-06-22 14:03:16 +02:00
Nacho Rivera
66305768c0 fix(instance metadata): Missing raw flag in JQ parser (#1214) 2022-06-21 10:14:12 +02:00
Sergio Garcia
05f98fe993 fix(junit_xml output): Fix XML output integration (#1210)
Co-authored-by: sergargar <sergio@verica.io>
2022-06-20 13:27:54 +02:00
rajarshidas
89416f37af feat(check): Directory Service - Ensure Radius server is using the recommended security protocol (#1203)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2022-06-20 11:37:02 +02:00
Pepe Fagoaga
7285ddcb4e feat(actions): Trigger (#1209) 2022-06-20 10:38:19 +02:00
Pepe Fagoaga
8993a4f707 fix(actions): Dockerfile path (#1208) 2022-06-20 09:22:40 +02:00
Sergio Garcia
633d7bd8a8 fix(instance-metadata): Credentials recovering (#1207)
* fix(instance-metadata): Credentials recovering

* fix(expr): Dockerfile to root and expr in SESSION_TIME_REMAINING.

Co-authored-by: Pepe Fagoaga <pepe@verica.io>
Co-authored-by: sergargar <sergio@verica.io>
2022-06-17 14:23:56 +02:00
Pepe Fagoaga
3944ea2055 fix(session_duration): Use jq with TZ=UTC (#1195) 2022-06-15 13:25:43 +02:00
zsecducna
d85d0f5877 fix(extra767): Remove false positive (#1198)
* Remove fail positive

Exclude distributions that does not support `POST` requests

* fix(extra767): Overall changes

- Quoted and braced variables
- Fix DefaultCacheBehavior twice in a AWS CLI query
- Use regex =~ to match values

* fix(check767): Change textInfo for textPass

* fix(extra767): Include AWS CLI error handling

Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2022-06-15 09:38:56 +02:00
Pepe Fagoaga
d32a7986a5 fix(shellcheck): Main variables (#1194) 2022-06-14 10:43:15 +02:00
Pepe Fagoaga
71813425bd fix(pre-commit): Recover shellcheck (#1193) 2022-06-14 07:46:12 +02:00
Pepe Fagoaga
da000b54ca refactor(Prowler): Main logic refactor (#1189)
* fix(aws_profile_loader): New functions

* fix(shellcheck): Temporary remove Shellcheck

* fix(aws_cli_detector): new function

* fix(jq_detector): New function

* fix(os_detector): New function

* fix(output_bucket): Output bucket input check in main

* fix(python_detector): deleted unused python detector

* fix(credentials): credentials check out of whoami

* [break]refactor(main)

* [BREAK] Get list of checks parsing all input options

* [break]refactor(main): execute checks functions

* [break]refactor(main): move functions to libs

* fix(validations): custom check validation and typos

* refactor(validate_options): Include comments

* fix(custom_checks): Minor fixes

* refactor(closing_files): include libraries

* refactor(loader): Include ignored checks

* refactor(main): Fix shellcheck

* refactor(loader): beautify

* refactor(monochrome): without variables

* refactor(modes): MODES array not needed

* refactor(whoami): get error from AWSCLI

* refactor(secrets-detector)

* refactor(secrets-detector)

* fix(html_scoring): html scoring was fixed.

* fix(load_checks_from_file)

* fix(color-code): Print if not mono

* fix(not extra): Fixed if EXCLUDE_CHECK_ID is empty

* fix(IFS): Restore default IFS once modes are parsed

* fix(bucket): validate before whoami

* fix(bucket): validate before whoami

Co-authored-by: n4ch04 <nachor1992@gmail.com>
Co-authored-by: sergargar <sergio@verica.io>
Co-authored-by: Nacho Rivera <59198746+n4ch04@users.noreply.github.com>
2022-06-13 17:34:31 +02:00
Sergio Garcia
74a9b42d9f Update codebuild-prowler-audit-account-cfn.yaml (#1192) 2022-06-13 12:17:31 +02:00
Nacho Rivera
f9322ab3aa fix(outputs): Replace each comma occurrence before sending to csv file (#1188) 2022-06-08 09:19:50 +02:00
Pepe Fagoaga
5becaca2c4 fix(extra7187): Remove commas from the metadata (#1187) 2022-06-08 09:02:38 +02:00
Sergio Garcia
50a670fbc4 fix(codebuild_update): AWS CLI and permissions update. (#1183) 2022-06-07 14:49:22 +02:00
Sergio Garcia
48f405a696 fix(check119_remediation): Update check remediation text. (#1185) 2022-06-07 14:48:13 +02:00
Nacho Rivera
bc56c4242e refactor(outputs): Consolidate Prowler output functions (#1180)
* chore(db providers): db providers first version

* chore(db provider): added db provider setup into Readme

* fix(csv_line): csv_line out of conditional

* fix(README): text instead of varchar in table

* fix(help): help message extended

Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>

* fix(typo): Update README.md

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* fix(table): add if not exists

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* fix(typo): Readme postgreSQL

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* fix(db_connector): details to add a new provider

* fix(typo): Uppercase Prowler

Co-authored-by: Toni de la Fuente <toni@blyx.com>

* fix(prowler): deleted unused variable

* chore(checks): test db connector previous to send data

* chore(input tests): input tests moved to main

* fix(typo): Readme typos

* chore(table): table name from pgpass file

* fix(grep test): Added missing -E flag

* chore(table): check of table name and Readme

* chore(error colors): Added error colors

* chore(inputcheck): checks about mode and output inputs into main

* fix(inputs) custom output file name

* fix(outputs): comment profile

* chore(textXXX): both 3 textfunctions using general

* fix(allowlist): allowlist check included as function

* fix(headers): Add headers to certain output files

* fix(reformulate): change structure and delete comments

* fix(testing): Input test after load includes

* fix(variables): Added named vars

* fix(colors): Deleted unused colors

* fix(outputs): fine tuning

* fix(outputs): allowlist parameters read

* fix(allowlist): allowlist logic reformulated

* fix(REPREGION): REPREGION change by REGION_FROM_CHECK

Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
Co-authored-by: Toni de la Fuente <toni@blyx.com>
2022-06-06 12:56:21 +02:00
Pepe Fagoaga
1b63256b9c fix(assume_role): Use date instead of jq (#1181)
* fix(date): Use  instead of date

* fix(assume_role): Use date instead of jq

JQ parses datetimes using the local timezone and not UTC
2022-06-03 08:31:43 -07:00
Sergio Garcia
7930b449b3 fix(apigateway_iam): Error handling and permissions for extra745. (#1176)
* fix(apigateway_iam): Error handling and permissions for extra745.

* Update check_extra745

Co-authored-by: sergargar <sergio@verica.io>
2022-06-02 15:16:43 +02:00
Pepe Fagoaga
e5cd42da55 fix(typo): Max session duration error message (#1179) 2022-06-02 15:08:30 +02:00
Sergio Garcia
2a54bbf901 fix(SQS_encryption_type): Add SQS encryption types to extra728. (#1175)
* fix(SQS_encryption_type): Add SQS encryption types to extra728.

* Update check_extra728

* Update check_extra728

Co-authored-by: sergargar <sergio@verica.io>
2022-06-02 15:01:02 +02:00
Nacho Rivera
2e134ed947 feat(db_connector): Create a PostgreSQL connector for Prowler (#1171)
* chore(db providers): db providers first version

* chore(db provider): added db provider setup into Readme

* fix(csv_line): csv_line out of conditional

* fix(README): text instead of varchar in table

* fix(help): help message extended

Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>

* fix(typo): Update README.md

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* fix(table): add if not exists

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* fix(typo): Readme postgreSQL

Co-authored-by: Pepe Fagoaga <pepe@verica.io>

* fix(db_connector): details to add a new provider

* fix(typo): Uppercase Prowler

Co-authored-by: Toni de la Fuente <toni@blyx.com>

* fix(prowler): deleted unused variable

* chore(checks): test db connector previous to send data

* chore(input tests): input tests moved to main

* fix(typo): Readme typos

* chore(table): table name from pgpass file

* fix(grep test): Added missing -E flag

* chore(table): check of table name and Readme

* chore(error colors): Added error colors

* fix(tablename): table name in readme

* fix(typo)

* fix(db_provider): Exact match

* fix(error): One line message

* chore(pgpass check): Check added for pgpass file

* fix(pgpass): pgpass file and permissions test

* fix(unused vars): Deleted unused vars

* fix(TOP_PID): Deleted TOP_PID unused var and comment

* chore(db tests): Credentials, database and table tests added

* fix(empty pgpass): Look for empty fields at pgpass file

Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
Co-authored-by: Toni de la Fuente <toni@blyx.com>
2022-06-02 13:15:14 +02:00
Sergio Garcia
ba727391db fix(runtimes_extra762): Detect nodejs versions correctly. (#1177)
Co-authored-by: sergargar <sergio@verica.io>
2022-06-02 13:14:22 +02:00
Sergio Garcia
d4346149fa fix(severity): High severity for check extra7185 (#1178) 2022-06-01 14:04:36 +02:00
Pepe Fagoaga
2637fc5132 feat(checks): New IAM privilege escalation check (#1168) 2022-06-01 13:58:31 +02:00
Sergio Garcia
ac5135470b fix(update_deprecate_runtimes): Deprecated runtimes for lambda were updated (#1170) 2022-05-31 17:03:11 +02:00
rajarshidas
613966aecf feat(check): Amazon WorkSpaces storage volumes are encrypted
If the value listed in the Volume Encryption column is Disabled, the selected AWS WorkSpaces instance volumes (root and user volumes) are not encrypted
2022-05-31 17:01:20 +02:00
Pepe Fagoaga
83ddcb9c39 feat(check): PublicAccessBlockConfiguration (#1167) 2022-05-31 16:54:05 +02:00
Lucas L Lopes
957c2433cf feat(checks): New checks for Directory Service (#1164) 2022-05-30 14:24:44 +02:00
Pepe Fagoaga
c10b367070 fix(actions): Bad PRO repository (#1163) 2022-05-25 12:47:22 +02:00
108 changed files with 2767 additions and 1366 deletions

View File

@@ -1,6 +1,17 @@
# Ignore git files
.git/
.github/
# Ignore Dockerfile
Dockerfile
# Ignore hidden files
.pre-commit-config.yaml
.dockerignore
.gitignore
.pytest*
.DS_Store
# Ignore output directories
output/
junit-reports/

View File

@@ -9,7 +9,7 @@ on:
- 'README.md'
release:
types: [published]
types: [published, edited]
env:
AWS_REGION_STG: eu-west-1
@@ -17,7 +17,7 @@ env:
IMAGE_NAME: prowler
LATEST_TAG: latest
TEMPORARY_TAG: temporary
DOCKERFILE_PATH: util/Dockerfile
DOCKERFILE_PATH: ./Dockerfile
jobs:
# Lint Dockerfile using Hadolint
@@ -179,7 +179,7 @@ jobs:
name: Tag (release)
if: github.event_name == 'release'
run: |
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PRO_ECR }}/${{ secrets.PRO_ECR }}:${{ github.event.release.tag_name }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PRO_ECR }}/${{ secrets.PRO_ECR_REPOSITORY }}:${{ github.event.release.tag_name }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
-
@@ -187,7 +187,7 @@ jobs:
name: Push (release)
if: github.event_name == 'release'
run: |
docker push ${{ secrets.PRO_ECR }}/${{ secrets.PRO_ECR }}:${{ github.event.release.tag_name }}
docker push ${{ secrets.PRO_ECR }}/${{ secrets.PRO_ECR_REPOSITORY }}:${{ github.event.release.tag_name }}
docker push ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
docker push ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
-

View File

@@ -0,0 +1,50 @@
# This is a basic workflow to help you get started with Actions
name: Refresh regions of AWS services
on:
schedule:
- cron: "0 9 * * *" # runs at 09:00 UTC every day
env:
GITHUB_BRANCH: "prowler-3.0-dev"
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
# This workflow contains a single job called "build"
build:
# The type of runner that the job will run on
runs-on: ubuntu-latest
# Steps represent a sequence of tasks that will be executed as part of the job
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v3
with:
ref: ${{ env.GITHUB_BRANCH }}
- name: setup python
uses: actions/setup-python@v2
with:
python-version: 3.9 # install the Python version needed
# Runs a single command using the runners shell
- name: Run a one-line script
run: python3 util/update_aws_services_regions.py
# Create pull request
- name: Create Pull Request
uses: peter-evans/create-pull-request@v4
with:
token: ${{ secrets.GITHUB_TOKEN }}
commit-message: "feat(regions_update): Update regions for AWS services."
branch: "aws-services-regions-updated"
labels: "status/waiting-for-revision, severity/low"
title: "feat(regions_update): Changes in regions for AWS services."
body: |
### Description
This PR updates the regions for AWS services.
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

View File

@@ -8,6 +8,22 @@ repos:
- id: check-json
- id: end-of-file-fixer
- id: trailing-whitespace
exclude: 'README.md'
- id: no-commit-to-branch
- id: pretty-format-json
args: ['--autofix']
- repo: https://github.com/koalaman/shellcheck-precommit
rev: v0.8.0
hooks:
- id: shellcheck
- repo: https://github.com/hadolint/hadolint
rev: v2.10.0
hooks:
- id: hadolint
name: Lint Dockerfiles
description: Runs hadolint to lint Dockerfiles
language: system
types: ["dockerfile"]
entry: hadolint

64
Dockerfile Normal file
View File

@@ -0,0 +1,64 @@
# Build command
# docker build --platform=linux/amd64 --no-cache -t prowler:latest -f Dockerfile .
# hadolint ignore=DL3007
FROM public.ecr.aws/amazonlinux/amazonlinux:latest
LABEL maintainer="https://github.com/prowler-cloud/prowler"
ARG USERNAME=prowler
ARG USERID=34000
# Prepare image as root
USER 0
# System dependencies
# hadolint ignore=DL3006,DL3013,DL3033
RUN yum upgrade -y && \
yum install -y python3 bash curl jq coreutils py3-pip which unzip shadow-utils && \
yum clean all && \
rm -rf /var/cache/yum
RUN amazon-linux-extras install -y epel postgresql14 && \
yum clean all && \
rm -rf /var/cache/yum
# Create non-root user
RUN useradd -l -s /bin/bash -U -u ${USERID} ${USERNAME}
USER ${USERNAME}
# Python dependencies
# hadolint ignore=DL3006,DL3013,DL3042
RUN pip3 install --upgrade pip && \
pip3 install --no-cache-dir boto3 detect-secrets==1.0.3 && \
pip3 cache purge
# Set Python PATH
ENV PATH="/home/${USERNAME}/.local/bin:${PATH}"
USER 0
# Install AWS CLI
RUN curl https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip -o awscliv2.zip && \
unzip -q awscliv2.zip && \
aws/install && \
rm -rf aws awscliv2.zip
# Keep Python2 for yum
RUN sed -i '1 s/python/python2.7/' /usr/bin/yum
# Set Python3
RUN rm /usr/bin/python && \
ln -s /usr/bin/python3 /usr/bin/python
# Set working directory
WORKDIR /prowler
# Copy all files
COPY . ./
# Set files ownership
RUN chown -R prowler .
USER ${USERNAME}
ENTRYPOINT ["./prowler"]
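
A minimal sketch of running the resulting image, assuming local AWS credentials are mounted into the container user's home (the mount path follows from `USERNAME=prowler` above; the image tag comes from the build comment, and the region is an assumption):

```sh
# Mount local AWS credentials read-only into the prowler user's home,
# then pass Prowler flags straight to the ENTRYPOINT
docker run -ti --rm \
  -v "${HOME}/.aws:/home/prowler/.aws:ro" \
  prowler:latest -f eu-west-1
```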

101
README.md
View File

@@ -26,7 +26,7 @@
</p>
<p align="center">
<i>Prowler</i> is an Open Source security tool to perform AWS security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness. It contains more than 200 controls covering CIS, PCI-DSS, ISO27001, GDPR, HIPAA, FFIEC, SOC2, AWS FTR, ENS and custome security frameworks.
<i>Prowler</i> is an Open Source security tool to perform AWS security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness. It contains more than 240 controls covering CIS, PCI-DSS, ISO27001, GDPR, HIPAA, FFIEC, SOC2, AWS FTR, ENS and custom security frameworks.
</p>
## Table of Contents
@@ -41,6 +41,7 @@
- [Security Hub integration](#security-hub-integration)
- [CodeBuild deployment](#codebuild-deployment)
- [Allowlist](#allowlist-or-remove-a-fail-from-resources)
- [Inventory](#inventory)
- [Fix](#how-to-fix-every-fail)
- [Troubleshooting](#troubleshooting)
- [Extras](#extras)
@@ -58,13 +59,13 @@
Prowler is a command line tool that helps you with AWS security assessment, auditing, hardening and incident response.
It follows guidelines of the CIS Amazon Web Services Foundations Benchmark (49 checks) and has more than 100 additional checks including related to GDPR, HIPAA, PCI-DSS, ISO-27001, FFIEC, SOC2 and others.
It follows guidelines of the CIS Amazon Web Services Foundations Benchmark (49 checks) and has more than 190 additional checks, including checks related to GDPR, HIPAA, PCI-DSS, ISO-27001, FFIEC, SOC2 and others.
Read more about [CIS Amazon Web Services Foundations Benchmark v1.2.0 - 05-23-2018](https://d0.awsstatic.com/whitepapers/compliance/AWS_CIS_Foundations_Benchmark.pdf)
## Features
+200 checks covering security best practices across all AWS regions and most of AWS services and related to the next groups:
More than 240 checks covering security best practices across all AWS regions and most AWS services, related to the following groups:
- Identity and Access Management [group1]
- Logging [group2]
@@ -90,6 +91,7 @@ With Prowler you can:
- Send findings directly to Security Hub
- Run specific checks and groups or create your own
- Check multiple AWS accounts in parallel or sequentially
- Get an inventory of your AWS resources
- And more! Read examples below
## High level architecture
@@ -237,7 +239,7 @@ Prowler has been written in bash using AWS-CLI underneath and it works in Linux,
By default, Prowler scans all opt-in regions available, which might take a long execution time depending on the number of resources and regions used. The same applies to GovCloud or China regions. See Advanced Usage below for examples.
Prowler has two parameters related to regions: `-r` that is used query AWS services API endpoints (it uses `us-east-1` by default and required for GovCloud or China) and the option `-f` that is to filter those regions you only want to scan. For example if you want to scan Dublin only use `-f eu-west-1` and if you want to scan Dublin and Ohio `-f eu-west-1,us-east-1`, note the regions are separated by a comma deliminator (it can be used as before with `-f 'eu-west-1,us-east-1'`).
Prowler has two parameters related to regions: `-r`, which is used to query AWS services API endpoints (it uses `us-east-1` by default and is required for GovCloud or China), and `-f`, which filters the regions you want to scan. For example, to scan only Dublin use `-f eu-west-1`, and to scan Dublin and Ohio use `-f eu-west-1,us-east-1`; note that the regions are separated by a comma delimiter (the list can also be quoted, as in `-f 'eu-west-1,us-east-1'`).
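For instance, the two-region scan described above, as a single command:

```sh
./prowler -f 'eu-west-1,us-east-1'
```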
## Screenshots
@@ -330,12 +332,72 @@ Prowler has two parameters related to regions: `-r` that is used query AWS servi
```
./prowler -h
```
## Database providers connector
You can send Prowler's output to different databases (right now only PostgreSQL is supported).
Jump into the section for the database provider you want to use and follow the required steps to configure it.
### PostgreSQL
Install psql
- Mac -> `brew install libpq`
- Ubuntu -> `sudo apt-get install postgresql-client `
- RHEL/Centos -> `sudo yum install postgresql10`
#### Credentials
There are two options to pass the PostgreSQL credentials to Prowler:
##### Using a .pgpass file
Configure a `~/.pgpass` file in the home directory of the user that is going to launch Prowler ([pgpass file doc](https://www.postgresql.org/docs/current/libpq-pgpass.html)), including an extra field at the end of the line, separated by `:`, to name the table, using the following format:
`hostname:port:database:username:password:table`
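For example, a hypothetical entry for a local database writing to a `prowler_findings` table (all values are placeholders) would be:

```
localhost:5432:prowler_db:prowler:mysecretpassword:prowler_findings
```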
##### Using environment variables
- Configure the following environment variables:
- `POSTGRES_HOST`
- `POSTGRES_PORT`
- `POSTGRES_USER`
- `POSTGRES_PASSWORD`
- `POSTGRES_DB`
- `POSTGRES_TABLE`
> *Note*: If you are using a schema different than `postgres`, please include it at the beginning of the `POSTGRES_TABLE` variable, like: `export POSTGRES_TABLE=prowler.findings`
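As a sketch with placeholder values, the equivalent environment-variable setup might be:

```sh
export POSTGRES_HOST=localhost
export POSTGRES_PORT=5432
export POSTGRES_USER=prowler
export POSTGRES_PASSWORD=mysecretpassword
export POSTGRES_DB=prowler_db
export POSTGRES_TABLE=prowler_findings
```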
Create a table in your PostgreSQL database to store Prowler's data. You can use the following SQL statement to create the table:
```
CREATE TABLE IF NOT EXISTS prowler_findings (
profile TEXT,
account_number TEXT,
region TEXT,
check_id TEXT,
result TEXT,
item_scored TEXT,
item_level TEXT,
check_title TEXT,
result_extended TEXT,
check_asff_compliance_type TEXT,
severity TEXT,
service_name TEXT,
check_asff_resource_type TEXT,
check_asff_type TEXT,
risk TEXT,
remediation TEXT,
documentation TEXT,
check_caf_epic TEXT,
resource_id TEXT,
prowler_start_time TEXT,
account_details_email TEXT,
account_details_name TEXT,
account_details_arn TEXT,
account_details_org TEXT,
account_details_tags TEXT
);
```
- Execute Prowler with the `-d` flag, for example:
`./prowler -M csv -d postgresql`
> *Note*: This command creates a `csv` output file and stores the Prowler output in the configured PostgreSQL DB. This is just an example; the `-d` flag **does not** require `-M` to run.
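So a run that only stores findings in PostgreSQL, without writing a CSV file, should reduce to:

```sh
./prowler -d postgresql
```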
## Advanced Usage
### Assume Role:
Prowler uses the AWS CLI underneath so it uses the same authentication methods. However, there are few ways to run Prowler against multiple accounts using IAM Assume Role feature depending on eachg use case. You can just set up your custom profile inside `~/.aws/config` with all needed information about the role to assume then call it with `./prowler -p your-custom-profile`. Additionally you can use `-A 123456789012` and `-R RemoteRoleToAssume` and Prowler will get those temporary credentials using `aws sts assume-role`, set them up as environment variables and run against that given account. To create a role to assume in multiple accounts easier eather as CFN Stack or StackSet, look at [this CloudFormation template](iam/create_role_to_assume_cfn.yaml) and adapt it.
Prowler uses the AWS CLI underneath so it uses the same authentication methods. However, there are a few ways to run Prowler against multiple accounts using the IAM Assume Role feature, depending on each use case. You can just set up your custom profile inside `~/.aws/config` with all needed information about the role to assume, then call it with `./prowler -p your-custom-profile`. Additionally you can use `-A 123456789012` and `-R RemoteRoleToAssume`, and Prowler will get those temporary credentials using `aws sts assume-role`, set them up as environment variables and run against that given account. To create a role to assume in multiple accounts more easily, either as a CFN Stack or StackSet, look at [this CloudFormation template](iam/create_role_to_assume_cfn.yaml) and adapt it.
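As an illustrative sketch of the profile-based approach (the profile name is hypothetical; the account ID and role name reuse the examples above), the entry in `~/.aws/config` could look like:

```
[profile your-custom-profile]
role_arn = arn:aws:iam::123456789012:role/RemoteRoleToAssume
source_profile = default
```

With that in place, `./prowler -p your-custom-profile` runs against the assumed role; the `-A`/`-R` alternative is shown below.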
```sh
./prowler -A 123456789012 -R ProwlerRole
@@ -414,7 +476,7 @@ S3 URIs are also supported as custom folders for custom checks, e.g. `s3://bucke
### Show or log only FAILs
In order to remove noise and get only FAIL findings, there is a `-q` flag that makes Prowler show and log only FAILs.
It can be combined with any other option.
Prowler will show WARNINGS when a resource is excluded, just take that into consideration.
@@ -438,12 +500,12 @@ An easy way to run Prowler to scan your account is using AWS CloudShell. Read mo
## Security Hub integration
Since October 30th 2020 (version v2.3RC5), Prowler supports natively and as **official integration** sending findings to [AWS Security Hub](https://aws.amazon.com/security-hub). This integration allows Prowler to import its findings to AWS Security Hub. With Security Hub, you now have a single place that aggregates, organizes, and prioritizes your security alerts, or findings, from multiple AWS services, such as Amazon GuardDuty, Amazon Inspector, Amazon Macie, AWS Identity and Access Management (IAM) Access Analyzer, and AWS Firewall Manager, as well as from AWS Partner solutions and from Prowler for free.
Before sending findings to Security Hub, you need to perform the next steps:
1. Since Security Hub is a region-based service, enable it in the region or regions you require, using the AWS Management Console or the AWS CLI with this command if you have enough permissions:
- `aws securityhub enable-security-hub --region <region>`.
2. Enable Prowler as a partner integration, using the AWS Management Console or the AWS CLI with this command if you have enough permissions:
- `aws securityhub enable-import-findings-for-product --region <region> --product-arn arn:aws:securityhub:<region>::product/prowler/prowler` (change region also inside the ARN).
- Using the AWS Management Console:
![Screenshot 2020-10-29 at 10 26 02 PM](https://user-images.githubusercontent.com/3985464/97634660-5ade3400-1a36-11eb-9a92-4a45cc98c158.png)
@@ -459,7 +521,7 @@ or for only one filtered region like eu-west-1:
```sh
./prowler -M json-asff -q -S -f eu-west-1
```
> Note 1: It is recommended to send only fails to Security Hub, which is possible by adding `-q` to the command.
> Note 2: Since Prowler performs checks in all regions by default, you may need to filter by region when running the Security Hub integration, as shown in the example above. Remember to enable Security Hub in the region or regions you need by calling `aws securityhub enable-security-hub --region <region>` and run Prowler with the option `-f <region>` (if no region is given it will try to push findings to all regions' hubs).
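A sketch of that region-matched flow, combining the two commands above for a single example region:

```sh
aws securityhub enable-security-hub --region eu-west-1
./prowler -M json-asff -q -S -f eu-west-1
```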
@@ -487,10 +549,9 @@ To use Prowler and Security Hub integration in China regions there is an additio
Whether you want to run Prowler once or on a schedule, this template makes it pretty straightforward. It will create a CodeBuild environment and run Prowler directly, leaving all reports in a bucket and also creating a report inside CodeBuild based on the JUnit output from Prowler. Scheduling can be cron-based like `cron(0 22 * * ? *)` or rate-based like `rate(5 hours)`, since CloudWatch Event rules (or EventBridge) are used here.
The Cloud Formation template that helps you doing that is [here](https://github.com/prowler-cloud/prowler/blob/master/util/codebuild/codebuild-prowler-audit-account-cfn.yaml).
The Cloud Formation template that helps you to do that is [here](https://github.com/prowler-cloud/prowler/blob/master/util/codebuild/codebuild-prowler-audit-account-cfn.yaml).
> This is a simple solution to monitor one account. For multiple accounts see [Multi Account and Continuous Monitoring](util/org-multi-account/README.md).
## Allowlist or remove a fail from resources
Sometimes you may find resources that are intentionally configured in a certain way that may be a bad practice but is fine for your use case, for example an S3 bucket open to the internet hosting a web site, or a security group with an open port that you need. Now you can use `-w allowlist_sample.txt` and add your resources as `checkID:resourcename` as in this command:
@@ -506,13 +567,17 @@ DynamoDB table ARNs are also supported as allowlist file, e.g. `arn:aws:dynamodb
>Make sure that the table has `account_id` as partition key and `rule` as sort key, and that the used credentials have `dynamodb:PartiQLSelect` permissions in the table.
><p align="left"><img src="https://user-images.githubusercontent.com/38561120/165769502-296f9075-7cc8-445e-8158-4b21804bfe7e.png" alt="image" width="397" height="252" /></p>
>The field `account_id` can contains either an account ID or an `*` (which applies to all the accounts that use this table as a whitelist). As in the traditional allowlist file, the `rule` field must contain `checkID:resourcename` pattern.
>The field `account_id` can contain either an account ID or an `*` (which applies to all the accounts that use this table as a whitelist). As in the traditional allowlist file, the `rule` field must contain `checkID:resourcename` pattern.
><p><img src="https://user-images.githubusercontent.com/38561120/165770610-ed5c2764-7538-44c2-9195-bcfdecc4ef9b.png" alt="image" width="394" /></p>
The allowlist option works along with other options and adds a `WARNING` instead of `INFO`, `PASS` or `FAIL` to any output format except for `json-asff`.
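For instance, a minimal allowlist file (the check ID and bucket name are hypothetical) could contain:

```
extra73:my-public-website-bucket
```

Passing it with `-w` turns that resource's result into a `WARNING` as described above.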
## Inventory
With Prowler you can get an inventory of your AWS resources. To do so, run `./prowler -i` to see what AWS resources you have deployed in your AWS account. This feature lists almost all resources in all regions based on [this](https://docs.aws.amazon.com/resourcegroupstagging/latest/APIReference/API_GetResources.html) API call. Note that it does not cover 100% of resource types.
The inventory will be stored in an output `csv` file by default, under the common Prowler `output` folder, with the following format: `prowler-inventory-${ACCOUNT_NUM}-${OUTPUT_DATE}.csv`
## How to fix every FAIL
Check your report and fix the issues following all specific guidelines per check in <https://d0.awsstatic.com/whitepapers/compliance/AWS_CIS_Foundations_Benchmark.pdf>
@@ -576,7 +641,7 @@ Allows Prowler to import its findings to [AWS Security Hub](https://aws.amazon.c
### Bootstrap Script
Quick bash script to set up a "prowler" IAM user with "SecurityAudit" and "ViewOnlyAccess" group with the required permissions (including "Prowler-Additions-Policy"). To run the script below, you need user with administrative permissions; set the `AWS_DEFAULT_PROFILE` to use that account:
Quick bash script to set up a "prowler" IAM user with "SecurityAudit" and "ViewOnlyAccess" group with the required permissions (including "Prowler-Additions-Policy"). To run the script below, you need a user with administrative permissions; set the `AWS_DEFAULT_PROFILE` to use that account:
```sh
export AWS_DEFAULT_PROFILE=default
@@ -592,7 +657,7 @@ aws iam create-access-key --user-name prowler
unset ACCOUNT_ID AWS_DEFAULT_PROFILE
```
The `aws iam create-access-key` command will output the secret access key and the key id; keep these somewhere safe, and add them to `~/.aws/credentials` with an appropriate profile name to use them with Prowler. This is the only time they secret key will be shown. If you lose it, you will need to generate a replacement.
The `aws iam create-access-key` command will output the secret access key and the key id; keep these somewhere safe, and add them to `~/.aws/credentials` with an appropriate profile name to use them with Prowler. This is the only time the secret key will be shown. If you lose it, you will need to generate a replacement.
> [This CloudFormation template](iam/create_role_to_assume_cfn.yaml) may also help you on that task.
@@ -608,7 +673,7 @@ To list all existing checks in the extras group run the command below:
./prowler -l -g extras
```
>There are some checks not included in that list, they are experimental or checks that takes long to run like `extra759` and `extra760` (search for secrets in Lambda function variables and code).
>There are some checks not included in that list, they are experimental or checks that take long to run like `extra759` and `extra760` (search for secrets in Lambda function variables and code).
To check all extras in one command:
@@ -737,9 +802,9 @@ Multi Account environments assumes a minimum of two trusted or known accounts. F
![multi-account-environment](/docs/images/prowler-multi-account-environment.png)
## Custom Checks
Using `./prowler -c extra9999 -a` you can build your own on-the-fly custom check by specifying the AWS CLI command to execute.
> Omit the `aws` command and only use its parameters within quotes; do not nest quotes in the parameters. `--output text` is already included in the check.
>
Here is an example of a check to find SGs with inbound port 80:
```sh

View File

@@ -26,9 +26,9 @@ CHECK_CAF_EPIC_check115='IAM'
check115(){
if [[ "${REGION}" == "us-gov-west-1" || "${REGION}" == "us-gov-east-1" ]]; then
textInfo "${REGION}: This is an AWS GovCloud account and there is no root account to perform checks."
textInfo "${REGION}: This is an AWS GovCloud account and there is no root account to perform checks." "$REGION" "root"
else
# "Ensure security questions are registered in the AWS account (Not Scored)"
textInfo "${REGION}: No command available for check 1.15. Login to the AWS Console as root & click on the Account. Name -> My Account -> Configure Security Challenge Questions."
textInfo "${REGION}: No command available for check 1.15. Login to the AWS Console as root & click on the Account. Name -> My Account -> Configure Security Challenge Questions." "$REGION" "root"
fi
}

View File

@@ -26,10 +26,10 @@ CHECK_CAF_EPIC_check117='IAM'
check117(){
if [[ "${REGION}" == "us-gov-west-1" || "${REGION}" == "us-gov-east-1" ]]; then
textInfo "${REGION}: This is an AWS GovCloud account and there is no root account to perform checks."
textInfo "${REGION}: This is an AWS GovCloud account and there is no root account to perform checks." "$REGION" "root"
else
# "Maintain current contact details (Scored)"
# No command available
textInfo "No command available for check 1.17. See section 1.17 on the CIS Benchmark guide for details."
textInfo "No command available for check 1.17. See section 1.17 on the CIS Benchmark guide for details." "$REGION" "root"
fi
}

View File

@@ -26,10 +26,10 @@ CHECK_CAF_EPIC_check118='IAM'
check118(){
if [[ "${REGION}" == "us-gov-west-1" || "${REGION}" == "us-gov-east-1" ]]; then
textInfo "${REGION}: This is an AWS GovCloud account and there is no root account to perform checks."
textInfo "${REGION}: This is an AWS GovCloud account and there is no root account to perform checks." "$REGION" "root"
else
# "Ensure security contact information is registered (Scored)"
# No command available
textInfo "No command available for check 1.18. See section 1.18 on the CIS Benchmark guide for details."
textInfo "No command available for check 1.18. See section 1.18 on the CIS Benchmark guide for details." "$REGION" "root"
fi
}

View File

@@ -21,7 +21,7 @@ CHECK_ASFF_RESOURCE_TYPE_check119="AwsEc2Instance"
CHECK_ALTERNATE_check119="check119"
CHECK_SERVICENAME_check119="ec2"
CHECK_RISK_check119='AWS access from within AWS instances can be done by either encoding AWS keys into AWS API calls or by assigning the instance to a role which has an appropriate permissions policy for the required access. AWS IAM roles reduce the risks associated with sharing and rotating credentials that can be used outside of AWS itself. If credentials are compromised; they can be used from outside of the AWS account.'
CHECK_REMEDIATION_check119='IAM roles can only be associated at the launch of an instance. To remediate an instance to add it to a role you must create or re-launch a new instance. (Check for external dependencies on its current private ip or public addresses).'
CHECK_REMEDIATION_check119='Create an IAM instance role if necessary and attach it to the corresponding EC2 instance.'
CHECK_DOC_check119='http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2.html'
CHECK_CAF_EPIC_check119='IAM'

View File

@@ -46,13 +46,13 @@ check23(){
CLOUDTRAILBUCKET=$($AWSCLI cloudtrail describe-trails $PROFILE_OPT --region $TRAIL_REGION --query 'trailList[*].[S3BucketName]' --output text --trail-name-list $trail)
if [[ -z $CLOUDTRAILBUCKET ]]; then
textFail "Trail $trail in $TRAIL_REGION does not publish to S3"
textFail "Trail $trail in $TRAIL_REGION does not publish to S3" "$regx" "$trail"
continue
fi
CLOUDTRAIL_ACCOUNT_ID=$(echo $trail | awk -F: '{ print $5 }')
if [ "$CLOUDTRAIL_ACCOUNT_ID" != "$ACCOUNT_NUM" ]; then
textInfo "Trail $trail in $TRAIL_REGION S3 logging bucket $CLOUDTRAILBUCKET is not in current account"
textInfo "Trail $trail in $TRAIL_REGION S3 logging bucket $CLOUDTRAILBUCKET is not in current account" "$regx" "$trail"
continue
fi
@@ -63,7 +63,7 @@ check23(){
#
BUCKET_LOCATION=$($AWSCLI s3api get-bucket-location $PROFILE_OPT --region $regx --bucket $CLOUDTRAILBUCKET --output text 2>&1)
if [[ $(echo "$BUCKET_LOCATION" | grep AccessDenied) ]]; then
textInfo "Trail $trail in $TRAIL_REGION Access Denied getting bucket location for $CLOUDTRAILBUCKET"
textInfo "Trail $trail in $TRAIL_REGION Access Denied getting bucket location for $CLOUDTRAILBUCKET" "$regx" "$trail"
continue
fi
if [[ $BUCKET_LOCATION == "None" ]]; then
@@ -75,14 +75,14 @@ check23(){
CLOUDTRAILBUCKET_HASALLPERMISIONS=$($AWSCLI s3api get-bucket-acl --bucket $CLOUDTRAILBUCKET $PROFILE_OPT --region $BUCKET_LOCATION --query 'Grants[?Grantee.URI==`http://acs.amazonaws.com/groups/global/AllUsers`]' --output text 2>&1)
if [[ $(echo "$CLOUDTRAILBUCKET_HASALLPERMISIONS" | grep AccessDenied) ]]; then
textInfo "Trail $trail in $TRAIL_REGION Access Denied getting bucket acl for $CLOUDTRAILBUCKET"
textInfo "Trail $trail in $TRAIL_REGION Access Denied getting bucket acl for $CLOUDTRAILBUCKET" "$regx" "$trail"
continue
fi
if [[ -z $CLOUDTRAILBUCKET_HASALLPERMISIONS ]]; then
textPass "Trail $trail in $TRAIL_REGION S3 logging bucket $CLOUDTRAILBUCKET is not publicly accessible"
textPass "Trail $trail in $TRAIL_REGION S3 logging bucket $CLOUDTRAILBUCKET is not publicly accessible" "$regx" "$trail"
else
textFail "Trail $trail in $TRAIL_REGION S3 logging bucket $CLOUDTRAILBUCKET is publicly accessible"
textFail "Trail $trail in $TRAIL_REGION S3 logging bucket $CLOUDTRAILBUCKET is publicly accessible" "$regx" "$trail"
fi
done

View File

@@ -68,8 +68,7 @@ extra7100(){
done
if [[ $PERMISSIVE_POLICIES_LIST ]]; then
textInfo "STS AssumeRole Policies should only include the complete ARNs for the Roles that the user needs"
textInfo "Learn more: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_permissions-to-switch.html#roles-usingrole-createpolicy"
textInfo "STS AssumeRole Policies should only include the complete ARNs for the Roles that the user needs. Learn more: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_permissions-to-switch.html#roles-usingrole-createpolicy" "$REGION"
for policy in $PERMISSIVE_POLICIES_LIST; do
textFail "$REGION: Policy $policy allows permissive STS Role assumption" "$REGION" "$policy"
done

View File

@@ -23,7 +23,7 @@ CHECK_REMEDIATION_extra7102='Check Identified IPs; consider changing them to pri
CHECK_DOC_extra7102='https://www.shodan.io/'
CHECK_CAF_EPIC_extra7102='Infrastructure Security'
# Watch out, always use Shodan API key, if you use `curl https://www.shodan.io/host/{ip}` massively
# your IP will be banned by Shodan
# This is the right way to do so
@@ -33,8 +33,8 @@ CHECK_CAF_EPIC_extra7102='Infrastructure Security'
extra7102(){
if [[ ! $SHODAN_API_KEY ]]; then
textInfo "[extra7102] Requires a Shodan API key to work. Use -N <shodan_api_key>"
else
textInfo "[extra7102] Requires a Shodan API key to work. Use -N <shodan_api_key>" "$REGION"
else
for regx in $REGIONS; do
LIST_OF_EIP=$($AWSCLI $PROFILE_OPT --region $regx ec2 describe-network-interfaces --query 'NetworkInterfaces[*].Association.PublicIp' --output text 2>&1)
if [[ $(echo "$LIST_OF_EIP" | grep -E 'AccessDenied|UnauthorizedOperation|AuthorizationError') ]]; then

View File

@@ -33,6 +33,6 @@ extra7123(){
textFail "User $user has 2 active access keys" "$REGION" "$user"
done
else
textPass "No users with 2 active access keys"
textPass "No users with 2 active access keys" "$REGION"
fi
}
}

View File

@@ -31,15 +31,15 @@ extra7125(){
for user in $LIST_USERS; do
# Would be virtual if sms-mfa or mfa, hardware is u2f or different.
MFA_TYPE=$($AWSCLI iam list-mfa-devices --user-name $user $PROFILE_OPT --region $REGION --query MFADevices[].SerialNumber --output text | awk -F':' '{ print $6 }'| awk -F'/' '{ print $1 }')
if [[ $MFA_TYPE == "mfa" || $MFA_TYPE == "sms-mfa" ]]; then
textInfo "User $user has virtual MFA enabled"
elif [[ $MFA_TYPE == "" ]]; then
if [[ $MFA_TYPE == "mfa" || $MFA_TYPE == "sms-mfa" ]]; then
textInfo "User $user has virtual MFA enabled" "$REGION" "$user"
elif [[ $MFA_TYPE == "" ]]; then
textFail "User $user has not hardware MFA enabled" "$REGION" "$user"
else
else
textPass "User $user has hardware MFA enabled" "$REGION" "$user"
fi
done
else
textPass "No users found"
textPass "No users found" "$REGION"
fi
}
}

View File

@@ -34,14 +34,14 @@ extra7132(){
for rdsinstance in ${RDS_INSTANCES}; do
RDS_NAME="$rdsinstance"
MONITORING_FLAG=$($AWSCLI rds describe-db-instances $PROFILE_OPT --region $regx --db-instance-identifier $rdsinstance --query 'DBInstances[*].[EnhancedMonitoringResourceArn]' --output text)
if [[ $MONITORING_FLAG == "None" ]];then
textFail "$regx: RDS instance: $RDS_NAME has enhanced monitoring disabled!" "$rex" "$RDS_NAME"
if [[ $MONITORING_FLAG == "None" ]];then
textFail "$regx: RDS instance: $RDS_NAME has enhanced monitoring disabled!" "$regx" "$RDS_NAME"
else
textPass "$regx: RDS instance: $RDS_NAME has enhanced monitoring enabled." "$regx" "$RDS_NAME"
fi
done
else
textInfo "$regx: no RDS instances found" "$regx" "$RDS_NAME"
textInfo "$regx: no RDS instances found" "$regx"
fi
done
}

View File

@@ -41,8 +41,8 @@ extra7142(){
textFail "$regx: Application Load Balancer $alb is not dropping invalid header fields" "$regx" "$alb"
fi
done
else
textInfo "$regx: no ALBs found"
else
textInfo "$regx: no ALBs found" "$regx"
fi
done
}

View File

@@ -29,7 +29,7 @@ extra7156(){
# "Check if API Gateway V2 has Access Logging enabled "
for regx in $REGIONS; do
LIST_OF_API_GW=$($AWSCLI apigatewayv2 get-apis $PROFILE_OPT --region $regx --query Items[*].ApiId --output text 2>&1)
if [[ $(echo "$LIST_OF_API_GW" | grep -E 'AccessDenied|UnauthorizedOperation|AuthorizationError') ]]; then
if [[ $(echo "$LIST_OF_API_GW" | grep -E 'AccessDenied|UnauthorizedOperation|AuthorizationError|BadRequestException') ]]; then
textInfo "$regx: Access Denied trying to get APIs" "$regx"
continue
fi
@@ -54,4 +54,4 @@ extra7156(){
textInfo "$regx: No API Gateway found" "$regx"
fi
done
}
}

View File

@@ -26,7 +26,7 @@ CHECK_CAF_EPIC_extra7157='IAM'
extra7157(){
for regx in $REGIONS; do
LIST_OF_API_GW=$($AWSCLI apigatewayv2 get-apis $PROFILE_OPT --region $regx --query "Items[*].ApiId" --output text 2>&1)
if [[ $(echo "$LIST_OF_API_GW" | grep -E 'AccessDenied|UnauthorizedOperation|AuthorizationError') ]]; then
if [[ $(echo "$LIST_OF_API_GW" | grep -E 'AccessDenied|UnauthorizedOperation|AuthorizationError|BadRequestException') ]]; then
textInfo "$regx: Access Denied trying to get APIs" "$regx"
continue
fi

View File

@@ -11,7 +11,7 @@
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7162="7.162"
CHECK_TITLE_extra7162="[extra7162] Check if CloudWatch Log Groups have a retention policy of 365 days"
CHECK_TITLE_extra7162="[extra7162] Check if CloudWatch Log Groups have a retention policy of at least 365 days"
CHECK_SCORED_extra7162="NOT_SCORED"
CHECK_CIS_LEVEL_extra7162="EXTRA"
CHECK_SEVERITY_extra7162="Medium"
@@ -19,36 +19,57 @@ CHECK_ASFF_RESOURCE_TYPE_extra7162="AwsLogsLogGroup"
CHECK_ALTERNATE_check7162="extra7162"
CHECK_SERVICENAME_extra7162="cloudwatch"
CHECK_RISK_extra7162='If log groups have a low retention policy of less than 365 days; crucial logs and data can be lost'
CHECK_REMEDIATION_extra7162='Add Log Retention policy of 365 days to log groups. This will persist logs and traces for a long time.'
CHECK_REMEDIATION_extra7162='Add Log Retention policy of at least 365 days to log groups. This will persist logs and traces for a long time.'
CHECK_DOC_extra7162='https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/AWS_Logs.html'
CHECK_CAF_EPIC_extra7162='Data Retention'
extra7162() {
# "Check if CloudWatch Log Groups have a retention policy of 365 days"
declare -i LOG_GROUP_RETENTION_PERIOD_DAYS=365
for regx in $REGIONS; do
LIST_OF_365_RETENTION_LOG_GROUPS=$($AWSCLI logs describe-log-groups $PROFILE_OPT --region $regx --query 'logGroups[?retentionInDays=="${LOG_GROUP_RETENTION_PERIOD_DAYS}"].[logGroupName]' --output text 2>&1)
if [[ $(echo "$LIST_OF_365_RETENTION_LOG_GROUPS" | grep -E 'AccessDenied|UnauthorizedOperation|AuthorizationError') ]]; then
textInfo "$regx: Access Denied trying to describe log groups" "$regx"
local LOG_GROUP_RETENTION_PERIOD_DAYS="365"
for regx in ${REGIONS}; do
LIST_OF_365_OR_MORE_RETENTION_LOG_GROUPS=$("${AWSCLI}" logs describe-log-groups ${PROFILE_OPT} --region "${regx}" --query "logGroups[?retentionInDays>=\`${LOG_GROUP_RETENTION_PERIOD_DAYS}\`].[logGroupName]" --output text 2>&1)
if grep -E -q 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${LIST_OF_365_OR_MORE_RETENTION_LOG_GROUPS}"; then
textInfo "${regx}: Access Denied trying to describe log groups" "${regx}"
continue
fi
if [[ $LIST_OF_365_RETENTION_LOG_GROUPS ]]; then
for log in $LIST_OF_365_RETENTION_LOG_GROUPS; do
textPass "$regx: $log Log Group has 365 days retention period!" "$regx" "$log"
if [[ ${LIST_OF_365_OR_MORE_RETENTION_LOG_GROUPS} ]]; then
for log in ${LIST_OF_365_OR_MORE_RETENTION_LOG_GROUPS}; do
textPass "${regx}: ${log} Log Group has at least 365 days retention period!" "${regx}" "${log}"
done
fi
LIST_OF_NON_365_RETENTION_LOG_GROUPS=$($AWSCLI logs describe-log-groups $PROFILE_OPT --region $regx --query 'logGroups[?retentionInDays!="${LOG_GROUP_RETENTION_PERIOD_DAYS}"].[logGroupName]' --output text)
if [[ $LIST_OF_NON_365_RETENTION_LOG_GROUPS ]]; then
for log in $LIST_OF_NON_365_RETENTION_LOG_GROUPS; do
textFail "$regx: $log Log Group does not have 365 days retention period!" "$regx" "$log"
LIST_OF_NEVER_EXPIRE_RETENTION_LOG_GROUPS=$("${AWSCLI}" logs describe-log-groups ${PROFILE_OPT} --region "${regx}" --query "logGroups[?retentionInDays==null].[logGroupName]" --output text 2>&1)
if grep -E -q 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${LIST_OF_NEVER_EXPIRE_RETENTION_LOG_GROUPS}"; then
textInfo "${regx}: Access Denied trying to describe log groups" "${regx}"
continue
fi
if [[ ${LIST_OF_NEVER_EXPIRE_RETENTION_LOG_GROUPS} ]]; then
for log in ${LIST_OF_NEVER_EXPIRE_RETENTION_LOG_GROUPS}; do
textPass "${regx}: ${log} Log Group retention period never expires!" "${regx}" "${log}"
done
fi
REGION_NO_LOG_GROUP=$($AWSCLI logs describe-log-groups $PROFILE_OPT --region $regx --output text)
if [[ $REGION_NO_LOG_GROUP ]]; then
LIST_OF_NON_365_RETENTION_LOG_GROUPS=$("${AWSCLI}" logs describe-log-groups ${PROFILE_OPT} --region "${regx}" --query "logGroups[?retentionInDays<\`${LOG_GROUP_RETENTION_PERIOD_DAYS}\`].[logGroupName]" --output text 2>&1)
if grep -E -q 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${LIST_OF_NON_365_RETENTION_LOG_GROUPS}"; then
textInfo "${regx}: Access Denied trying to describe log groups" "${regx}"
continue
fi
if [[ ${LIST_OF_NON_365_RETENTION_LOG_GROUPS} ]]; then
for log in ${LIST_OF_NON_365_RETENTION_LOG_GROUPS}; do
textFail "${regx}: ${log} Log Group does not have at least 365 days retention period!" "${regx}" "${log}"
done
fi
REGION_NO_LOG_GROUP=$("${AWSCLI}" logs describe-log-groups ${PROFILE_OPT} --region "${regx}" --output text)
if grep -E -q 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${REGION_NO_LOG_GROUP}"; then
textInfo "${regx}: Access Denied trying to describe log groups" "${regx}"
continue
fi
if [[ ${REGION_NO_LOG_GROUP} ]]; then
:
else
textInfo "$regx does not have a Log Group!" "$regx"
textInfo "${regx} does not have a Log Group!" "${regx}"
fi
done
}
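
For reference, the JMESPath filter introduced above can be exercised directly with the AWS CLI; a sketch (the region is an assumption):

```sh
# List log groups whose retention is at least 365 days, as in extra7162
aws logs describe-log-groups --region eu-west-1 \
  --query 'logGroups[?retentionInDays>=`365`].[logGroupName]' --output text
```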

View File

@@ -45,6 +45,6 @@ extra7166() {
fi
done
else
textInfo "No AWS Shield Advanced subscription found. Skipping check."
textInfo "$regx: no AWS Shield Advanced subscription found. Skipping check" "$regx"
fi
}

View File

@@ -41,6 +41,6 @@ extra7167() {
textInfo "$REGION: no Cloudfront distributions found" "$REGION"
fi
else
textInfo "No AWS Shield Advanced subscription found. Skipping check."
textInfo "$REGION: no AWS Shield Advanced subscription found. Skipping check." "$REGION"
fi
}

View File

@@ -44,6 +44,6 @@ extra7168() {
textInfo "$REGION: no Route53 hosted zones found" "$REGION"
fi
else
textInfo "No AWS Shield Advanced subscription found. Skipping check."
textInfo "$REGION: no AWS Shield Advanced subscription found. Skipping check." "$REGION"
fi
}

View File

@@ -41,6 +41,6 @@ extra7169() {
textInfo "$REGION: no global accelerators found" "$REGION"
fi
else
textInfo "No AWS Shield Advanced subscription found. Skipping check."
textInfo "$REGION: no AWS Shield Advanced subscription found. Skipping check." "$REGION"
fi
}

View File

@@ -43,6 +43,6 @@ extra7170() {
fi
done
else
textInfo "No AWS Shield Advanced subscription found. Skipping check."
textInfo "$REGION: no AWS Shield Advanced subscription found. Skipping check." "$REGION"
fi
}

View File

@@ -45,6 +45,6 @@ extra7171() {
fi
done
else
textInfo "No AWS Shield Advanced subscription found. Skipping check."
textInfo "$REGION: no AWS Shield Advanced subscription found. Skipping check." "$REGION"
fi
}

View File

@@ -12,12 +12,12 @@
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7173="7.173"
CHECK_TITLE_extra7173="[check7173] Security Groups created by EC2 Launch Wizard"
CHECK_TITLE_extra7173="[extra7173] Security Groups created by EC2 Launch Wizard"
CHECK_SCORED_extra7173="NOT_SCORED"
CHECK_CIS_LEVEL_extra7173="EXTRA"
CHECK_SEVERITY_extra7173="Medium"
CHECK_ASFF_RESOURCE_TYPE_extra7173="AwsEc2SecurityGroup"
CHECK_ALTERNATE_extra7173="extra7173"
CHECK_ALTERNATE_check7173="extra7173"
CHECK_SERVICENAME_extra7173="ec2"
CHECK_RISK_extra7173="Security Groups Created on the AWS Console using the EC2 wizard may allow port 22 from 0.0.0.0/0"
CHECK_REMEDIATION_extra7173="Apply a Zero Trust approach. Implement a process to scan and remediate security groups created by the EC2 Wizard. The recommended best practice is to use an authorized security group."

52
checks/check_extra7181 Normal file
View File

@@ -0,0 +1,52 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7181="7.181"
CHECK_TITLE_extra7181="[extra7181] Directory Service monitoring with CloudWatch logs"
CHECK_SCORED_extra7181="NOT_SCORED"
CHECK_CIS_LEVEL_extra7181="EXTRA"
CHECK_SEVERITY_extra7181="Medium"
CHECK_ASFF_RESOURCE_TYPE_extra7181="AwsDirectoryService"
CHECK_ALTERNATE_extra7181="extra7181"
CHECK_SERVICENAME_extra7181="ds"
CHECK_RISK_extra7181="As a best practice, monitor your organization to ensure that changes are logged. This helps you to ensure that any unexpected change can be investigated and unwanted changes can be rolled back."
CHECK_REMEDIATION_extra7181="It is recommended that the export of logs is enabled"
CHECK_DOC_extra7181='https://docs.aws.amazon.com/directoryservice/latest/admin-guide/incident-response.html'
CHECK_CAF_EPIC_extra7181="Infrastructure Security"
extra7181(){
for regx in $REGIONS; do
DIRECTORY_SERVICE_IDS=$("${AWSCLI}" ds describe-directories $PROFILE_OPT --region "${regx}" --query 'DirectoryDescriptions[*].DirectoryId[]' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${DIRECTORY_SERVICE_IDS}"; then
textInfo "${regx}: Access Denied trying to describe directories" "${regx}"
continue
fi
if [[ ${DIRECTORY_SERVICE_IDS} ]]; then
for DIRECTORY_ID in ${DIRECTORY_SERVICE_IDS}; do
DIRECTORY_SERVICE_MONITORING=$("${AWSCLI}" ds list-log-subscriptions ${PROFILE_OPT} --region "${regx}" --directory-id "${DIRECTORY_ID}" --query 'LogSubscriptions' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${DIRECTORY_SERVICE_MONITORING}"; then
textInfo "${regx}: Access Denied trying to list Directory Service log subscriptions" "${regx}"
continue
fi
if [[ "${DIRECTORY_SERVICE_MONITORING}" ]]; then
textPass "${regx}: Directory Service ${DIRECTORY_ID} has log forwarding to CloudWatch enabled" "${regx}" "${DIRECTORY_ID}"
else
textFail "${regx}: Directory Service ${DIRECTORY_ID} does not have log forwarding to CloudWatch enabled" "${regx}" "${DIRECTORY_ID}"
fi
done
else
textInfo "${regx}: No Directory Service found" "${regx}"
fi
done
}
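# Remediation sketch for a failing directory, assuming a hypothetical
# directory ID and an existing CloudWatch log group:
#
# aws ds create-log-subscription \
#   --directory-id d-1234567890 \
#   --log-group-name /aws/directoryservice/d-1234567890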

52 checks/check_extra7182 Normal file

@@ -0,0 +1,52 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7182="7.182"
CHECK_TITLE_extra7182="[extra7182] Directory Service SNS Notifications"
CHECK_SCORED_extra7182="NOT_SCORED"
CHECK_CIS_LEVEL_extra7182="EXTRA"
CHECK_SEVERITY_extra7182="Medium"
CHECK_ASFF_RESOURCE_TYPE_extra7182="AwsDirectoryService"
CHECK_ALTERNATE_check7182="extra7182"
CHECK_SERVICENAME_extra7182="ds"
CHECK_RISK_extra7182="As a best practice, monitor the status of Directory Service. This helps to avoid late actions to fix Directory Service issues"
CHECK_REMEDIATION_extra7182="It is recommended to set up SNS messaging to send email or text messages when the status of your directory changes"
CHECK_DOC_extra7182="https://docs.aws.amazon.com/directoryservice/latest/admin-guide/ms_ad_enable_notifications.html"
CHECK_CAF_EPIC_extra7182="Infrastructure Security"
extra7182(){
for regx in $REGIONS; do
DIRECTORY_SERVICE_IDS=$("${AWSCLI}" ds describe-directories ${PROFILE_OPT} --region "${regx}" --query 'DirectoryDescriptions[*].DirectoryId[]' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${DIRECTORY_SERVICE_IDS}"; then
textInfo "${regx}: Access Denied trying to describe directories" "${regx}"
continue
fi
if [[ ${DIRECTORY_SERVICE_IDS} ]]; then
for DIRECTORY_ID in ${DIRECTORY_SERVICE_IDS}; do
DIRECTORY_SERVICE_MONITORING=$("${AWSCLI}" ds describe-event-topics ${PROFILE_OPT} --region "${regx}" --directory-id "${DIRECTORY_ID}" --query 'EventTopics' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${DIRECTORY_SERVICE_MONITORING}"; then
textInfo "${regx}: Access Denied trying to describe Directory Service event topics" "${regx}"
continue
fi
if [[ "${DIRECTORY_SERVICE_MONITORING}" ]]; then
textPass "${regx}: Directory Service ${DIRECTORY_ID} has SNS messaging enabled" "${regx}" "${DIRECTORY_ID}"
else
textFail "${regx}: Directory Service ${DIRECTORY_ID} does not have SNS messaging enabled" "${regx}" "${DIRECTORY_ID}"
fi
done
else
textInfo "${regx}: No Directory Service found" "${regx}"
fi
done
}
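# Remediation sketch, assuming a hypothetical directory ID and an existing
# SNS topic named "ds-status-changes" in the same region:
#
# aws ds register-event-topic \
#   --directory-id d-1234567890 \
#   --topic-name ds-status-changes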

67 checks/check_extra7183 Normal file

@@ -0,0 +1,67 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7183="7.183"
CHECK_TITLE_extra7183="[extra7183] Directory Service LDAP Certificates expiration"
CHECK_SCORED_extra7183="NOT_SCORED"
CHECK_CIS_LEVEL_extra7183="EXTRA"
CHECK_SEVERITY_extra7183="Medium"
CHECK_ASFF_RESOURCE_TYPE_extra7183="AwsDirectoryService"
CHECK_ALTERNATE_check7183="extra7183"
CHECK_SERVICENAME_extra7183="ds"
CHECK_RISK_extra7183="Expired certificates can impact service availability."
CHECK_REMEDIATION_extra7183="Monitor certificate expiration and take automated action to alert the responsible team so the certificate is replaced or removed."
CHECK_DOC_extra7183="https://docs.aws.amazon.com/directoryservice/latest/admin-guide/ms_ad_ldap.html"
CHECK_CAF_EPIC_extra7183="Data Protection"
extra7183(){
local DAYS_TO_EXPIRE_THRESHOLD=90
for regx in $REGIONS; do
DIRECTORY_SERVICE_IDS=$("${AWSCLI}" ds describe-directories ${PROFILE_OPT} --region "${regx}" --query 'DirectoryDescriptions[*].DirectoryId[]' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${DIRECTORY_SERVICE_IDS}"; then
textInfo "${regx}: Access Denied trying to describe directories" "${regx}"
continue
fi
if [[ ${DIRECTORY_SERVICE_IDS} ]]; then
for DIRECTORY_ID in ${DIRECTORY_SERVICE_IDS}; do
CERT_DATA=$("${AWSCLI}" ds list-certificates ${PROFILE_OPT} --region "${regx}" --directory-id "${DIRECTORY_ID}" --query 'CertificatesInfo[*].[CertificateId,ExpiryDateTime]' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${CERT_DATA}"; then
textInfo "${regx}: Access Denied trying to list certificates" "${regx}"
continue
fi
if [[ ${CERT_DATA} ]]; then
echo "${CERT_DATA}" | while read -r CERTIFICATE_ID NOTAFTER; do
EXPIRES_DATE=$(timestamp_to_date "${NOTAFTER}")
if [[ ${EXPIRES_DATE} == "" ]]
then
textInfo "${regx}: LDAP Certificate ${CERTIFICATE_ID} has an incorrect timestamp format: ${NOTAFTER}" "${regx}" "${CERTIFICATE_ID}"
else
COUNTER_DAYS=$(how_many_days_from_today "${EXPIRES_DATE}")
if [[ "${COUNTER_DAYS}" -le "${DAYS_TO_EXPIRE_THRESHOLD}" ]]; then
textFail "${regx}: LDAP Certificate ${CERTIFICATE_ID} configured at ${DIRECTORY_ID} is about to expire in ${COUNTER_DAYS} days!" "${regx}" "${CERTIFICATE_ID}"
else
textPass "${regx}: LDAP Certificate ${CERTIFICATE_ID} configured at ${DIRECTORY_ID} expires in ${COUNTER_DAYS} days" "${regx}" "${CERTIFICATE_ID}"
fi
fi
done
else
textFail "${regx}: Directory Service ${DIRECTORY_ID} does not have an LDAP Certificate configured" "${regx}" "${DIRECTORY_ID}"
fi
done
else
textInfo "${regx}: No Directory Service found" "${regx}"
fi
done
}
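# Standalone sketch of the expiry arithmetic above, assuming GNU date and an
# ISO-8601 timestamp; the timestamp_to_date and how_many_days_from_today
# helpers are defined elsewhere in the repo:
#
# NOTAFTER='2024-12-31T00:00:00Z'
# DAYS_LEFT=$(( ($(date -d "${NOTAFTER}" +%s) - $(date +%s)) / 86400 ))
# echo "LDAP certificate expires in ${DAYS_LEFT} days"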

63 checks/check_extra7184 Normal file

@@ -0,0 +1,63 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7184="7.184"
CHECK_TITLE_extra7184="[extra7184] Directory Service Manual Snapshot Limit"
CHECK_SCORED_extra7184="NOT_SCORED"
CHECK_CIS_LEVEL_extra7184="EXTRA"
CHECK_SEVERITY_extra7184="Low"
CHECK_ASFF_RESOURCE_TYPE_extra7184="AwsDirectoryService"
CHECK_ALTERNATE_check7184="extra7184"
CHECK_SERVICENAME_extra7184="ds"
CHECK_RISK_extra7184="Reaching the limit can bring unwanted results, since the maximum number of manual snapshots is a hard limit"
CHECK_REMEDIATION_extra7184="Monitor manual snapshots limit to ensure capacity when you need it."
CHECK_DOC_extra7184="https://docs.aws.amazon.com/general/latest/gr/ds_region.html"
CHECK_CAF_EPIC_extra7184="Infrastructure Security"
extra7184(){
local THRESHOLD="2"
for regx in ${REGIONS}; do
DIRECTORY_SERVICE_IDS=$("${AWSCLI}" ds describe-directories ${PROFILE_OPT} --region "${regx}" --query 'DirectoryDescriptions[*].DirectoryId[]' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${DIRECTORY_SERVICE_IDS}"; then
textInfo "${regx}: Access Denied trying to describe directories" "${regx}"
continue
fi
if [[ ${DIRECTORY_SERVICE_IDS} ]]; then
for DIRECTORY_ID in ${DIRECTORY_SERVICE_IDS}; do
LIMIT_DATA=$("${AWSCLI}" ds get-snapshot-limits ${PROFILE_OPT} --region "${regx}" --directory-id "${DIRECTORY_ID}" --query 'SnapshotLimits' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${LIMIT_DATA}"; then
textInfo "${regx}: Access Denied trying to get Directory Service snapshot limits" "${regx}"
continue
fi
echo "${LIMIT_DATA}" | while read -r CURRENT_SNAPSHOTS_COUNT SNAPSHOTS_LIMIT SNAPSHOTS_LIMIT_REACHED; do
if [[ ${SNAPSHOTS_LIMIT_REACHED} == "true" ]]
then
textFail "${regx}: Directory Service ${DIRECTORY_ID} reached ${SNAPSHOTS_LIMIT} Snapshots Limit" "${regx}" "${DIRECTORY_ID}"
else
LIMIT_REMAIN=$(("${SNAPSHOTS_LIMIT}" - "${CURRENT_SNAPSHOTS_COUNT}"))
if [[ "${LIMIT_REMAIN}" -le "${THRESHOLD}" ]]; then
textFail "${regx}: Directory Service ${DIRECTORY_ID} is about to reach ${SNAPSHOTS_LIMIT} snapshots which is the limit" "${regx}" "${DIRECTORY_ID}"
else
textPass "${regx}: Directory Service ${DIRECTORY_ID} is using ${CURRENT_SNAPSHOTS_COUNT} out of ${SNAPSHOTS_LIMIT} from the Snapshot Limit" "${regx}" "${DIRECTORY_ID}"
fi
fi
done
done
else
textInfo "${regx}: No Directory Service found" "${regx}"
fi
done
}
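# Companion sketch (directory ID is hypothetical): list the manual snapshots
# that count against the hard limit, so old ones can be pruned:
#
# aws ds describe-snapshots \
#   --directory-id d-1234567890 \
#   --query 'Snapshots[?Type==`MANUAL`].[SnapshotId,StartTime]' \
#   --output text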

85 checks/check_extra7185 Normal file

@@ -0,0 +1,85 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7185="7.185"
CHECK_TITLE_extra7185="[extra7185] Ensure no Customer Managed IAM policies allow actions that may lead into Privilege Escalation"
CHECK_SCORED_extra7185="NOT_SCORED"
CHECK_CIS_LEVEL_extra7185="EXTRA"
CHECK_SEVERITY_extra7185="High"
CHECK_ASFF_RESOURCE_TYPE_extra7185="AwsIamPolicy"
CHECK_ALTERNATE_check7185="extra7185"
CHECK_SERVICENAME_extra7185="iam"
CHECK_RISK_extra7185='Users with some IAM permissions are allowed to elevate their privileges up to administrator rights.'
CHECK_REMEDIATION_extra7185='Grant usage permission on a per-resource basis, applying the least privilege principle.'
CHECK_DOC_extra7185='https://docs.aws.amazon.com/IAM/latest/APIReference/API_CreateAccessKey.html'
CHECK_CAF_EPIC_extra7185='IAM'
# Does the tool analyze both users and roles, or just one or the other? --> Everything, using AttachmentCount.
# Does the tool take a principal-centric or policy-centric approach? --> Policy-centric approach.
# Does the tool handle resource constraints? --> We don't check if the policy affects all resources or not, we check everything.
# Does the tool consider the permissions of service roles? --> Just checks policies.
# Does the tool handle transitive privesc paths (i.e., attack chains)? --> Not yet.
# Does the tool handle the DENY effect as expected? --> Yes, it checks DENY's statements with Action and NotAction.
# Does the tool handle NotAction as expected? --> Yes
# Does the tool handle Condition constraints? --> Not yet.
# Does the tool handle service control policy (SCP) restrictions? --> No, SCP are within Organizations AWS API.
extra7185() {
local PRIVILEGE_ESCALATION_IAM_ACTIONS="iam:AttachGroupPolicy|iam:SetDefaultPolicyVersion2|iam:AddUserToGroup|iam:AttachRolePolicy|iam:AttachUserPolicy|iam:CreateAccessKey|iam:CreatePolicyVersion|iam:CreateLoginProfile|iam:PassRole|iam:PutGroupPolicy|iam:PutRolePolicy|iam:PutUserPolicy|iam:SetDefaultPolicyVersion|iam:UpdateAssumeRolePolicy|iam:UpdateLoginProfile|sts:AssumeRole|ec2:RunInstances|lambda:CreateEventSourceMapping|lambda:CreateFunction|lambda:InvokeFunction|lambda:UpdateFunctionCode|dynamodb:CreateTable|dynamodb:PutItem|glue:CreateDevEndpoint|glue:GetDevEndpoint|glue:GetDevEndpoints|glue:UpdateDevEndpoint|cloudformation:CreateStack|cloudformation:DescribeStacks|datapipeline:CreatePipeline|datapipeline:PutPipelineDefinition|datapipeline:ActivatePipeline"
# Use --scope Local to list only Customer Managed Policies
# Query 'Policies[?AttachmentCount > `0`]' to check if this policy is in use, so attached to any user, group or role
LIST_CUSTOM_POLICIES=$(${AWSCLI} iam list-policies ${PROFILE_OPT} \
--scope Local \
--query 'Policies[*].[Arn,DefaultVersionId]' \
--output text)
# Check errors
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${LIST_CUSTOM_POLICIES}"; then
textInfo "${REGION}: Access Denied trying to list IAM policies" "${REGION}"
else
if [[ $LIST_CUSTOM_POLICIES ]]; then
while read -r POLICY_ARN POLICY_DEFAULT_VERSION; do
POLICY_PRIVILEGED_ACTIONS=$($AWSCLI iam get-policy-version ${PROFILE_OPT} \
--policy-arn "${POLICY_ARN}" \
--version-id "${POLICY_DEFAULT_VERSION}" \
--query "PolicyVersion.Document.Statement[]" \
--output json)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${POLICY_PRIVILEGED_ACTIONS}"; then
textInfo "${REGION}: Access Denied trying to get policy version" "${REGION}"
continue
fi
ALLOWED_ACTIONS=$(jq -r '.[] | select(."Effect" == "Allow") | .Action // empty' <<< "${POLICY_PRIVILEGED_ACTIONS}" | sed 's/\[//;s/\]//;s/,/ /;s/ //g;/^$/d')
DENIED_ACTIONS=$(jq -r '.[] | select(."Effect" == "Deny") | .Action // empty' <<< "${POLICY_PRIVILEGED_ACTIONS}" | sed 's/\[//;s/\]//;s/,/ /;s/ //g;/^$/d')
DENIED_NOT_ACTIONS=$(jq -r '.[] | select(."Effect" == "Deny") | .NotAction // empty' <<< "${POLICY_PRIVILEGED_ACTIONS}" | sed 's/\[//;s/\]//;s/,/ /;s/ //g;/^$/d')
# First, we need to subtract DENIED_ACTIONS from ALLOWED_ACTIONS (a set difference)
LEFT_ACTIONS=$(diff <(echo "${ALLOWED_ACTIONS}") <(echo "${DENIED_ACTIONS}") | grep "^<" | sed 's/< //;s/"//g')
# Then, we need to find the DENIED_NOT_ACTIONS in LEFT_ACTIONS
PRIVILEGED_ACTIONS=$(comm -1 -2 <(sort <<< "${DENIED_NOT_ACTIONS}") <(sort <<< "${LEFT_ACTIONS}"))
# Finally, check if there is a privilege escalation action within this policy
POLICY_PRIVILEGE_ESCALATION_ACTIONS=$(grep -o -E "${PRIVILEGE_ESCALATION_IAM_ACTIONS}" <<< "${PRIVILEGED_ACTIONS}")
if [[ -n "${POLICY_PRIVILEGE_ESCALATION_ACTIONS}" ]]; then
textFail "${REGION}: Customer Managed IAM Policy ${POLICY_ARN} allows for privilege escalation using the following actions: ${POLICY_PRIVILEGE_ESCALATION_ACTIONS//$'\n'/ }" "${REGION}" "${POLICY_ARN}"
else
textPass "${REGION}: Customer Managed IAM Policy ${POLICY_ARN} does not allow privilege escalation" "${REGION}" "${POLICY_ARN}"
fi
done<<<"${LIST_CUSTOM_POLICIES}"
else
textInfo "${REGION}: No Customer Managed IAM policies found" "${REGION}"
fi
fi
}
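# Toy illustration of the Allow/Deny set logic above, using a hypothetical
# inline policy document instead of a real get-policy-version response:
#
# POLICY='[{"Effect":"Allow","Action":["iam:PassRole","s3:GetObject"]},{"Effect":"Deny","Action":"s3:GetObject"}]'
# ALLOWED=$(jq -r '.[] | select(.Effect=="Allow") | .Action | if type=="array" then .[] else . end' <<< "${POLICY}")
# DENIED=$(jq -r '.[] | select(.Effect=="Deny") | .Action | if type=="array" then .[] else . end' <<< "${POLICY}")
# comm -23 <(sort <<< "${ALLOWED}") <(sort <<< "${DENIED}")   # prints iam:PassRole only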

42 checks/check_extra7186 Normal file

@@ -0,0 +1,42 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7186="7.186"
CHECK_TITLE_extra7186="[extra7186] Check S3 Account Level Public Access Block"
CHECK_SCORED_extra7186="NOT_SCORED"
CHECK_CIS_LEVEL_extra7186="EXTRA"
CHECK_SEVERITY_extra7186="High"
CHECK_ASFF_RESOURCE_TYPE_extra7186="AwsS3Bucket"
CHECK_ALTERNATE_check7186="extra7186"
CHECK_SERVICENAME_extra7186="s3"
CHECK_RISK_extra7186='Public access policies may be applied to sensitive data buckets'
CHECK_REMEDIATION_extra7186='You can enable Public Access Block at the account level to prevent the exposure of your data stored in S3'
CHECK_DOC_extra7186='https://docs.aws.amazon.com/AmazonS3/latest/userguide/access-control-block-public-access.html'
CHECK_CAF_EPIC_extra7186='Data Protection'
extra7186(){
S3_PUBLIC_ACCESS_BLOCK=$("${AWSCLI}" ${PROFILE_OPT} s3control get-public-access-block \
--account-id "${ACCOUNT_NUM}" \
--region "${REGION}" \
--query 'PublicAccessBlockConfiguration.[IgnorePublicAcls,RestrictPublicBuckets]' \
--output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${S3_PUBLIC_ACCESS_BLOCK}"; then
textInfo "${REGION}: Access Denied trying to get the account-level public access block configuration" "${REGION}"
exit
fi
if grep -q -E 'False|NoSuchPublicAccessBlockConfiguration' <<< "${S3_PUBLIC_ACCESS_BLOCK}"; then
textFail "${REGION}: Block Public Access is not configured for the account ${ACCOUNT_NUM}" "${REGION}" "${ACCOUNT_NUM}"
else
textPass "${REGION}: Block Public Access is configured for the account ${ACCOUNT_NUM}" "${REGION}" "${ACCOUNT_NUM}"
fi
}
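# Remediation sketch (account ID is hypothetical): enable all four
# account-level Block Public Access flags:
#
# aws s3control put-public-access-block \
#   --account-id 123456789012 \
#   --public-access-block-configuration \
#     BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true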

53 checks/check_extra7187 Normal file

@@ -0,0 +1,53 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7187="7.187"
CHECK_TITLE_extra7187="[extra7187] Ensure that your Amazon WorkSpaces storage volumes are encrypted in order to meet security and compliance requirements"
CHECK_SCORED_extra7187="NOT_SCORED"
CHECK_CIS_LEVEL_extra7187="EXTRA"
CHECK_SEVERITY_extra7187="High"
CHECK_ASFF_RESOURCE_TYPE_extra7187="AwsWorkspaces"
CHECK_ALTERNATE_check7187="extra7187"
CHECK_SERVICENAME_extra7187="workspaces"
CHECK_RISK_extra7187='If the value listed in the Volume Encryption column is Disabled, the selected AWS WorkSpaces instance volumes (root and user volumes) are not encrypted. Therefore, your data at rest is not protected from unauthorized access and does not meet the compliance requirements regarding data encryption.'
CHECK_REMEDIATION_extra7187='WorkSpaces is integrated with the AWS Key Management Service (AWS KMS). This enables you to encrypt the storage volumes of WorkSpaces using an AWS KMS Key. When you launch a WorkSpace you can encrypt the root volume (for Microsoft Windows, the C drive; for Linux, /) and the user volume (for Windows, the D drive; for Linux, /home). Doing so ensures that the data stored at rest, disk I/O to the volume, and snapshots created from the volumes are all encrypted'
CHECK_DOC_extra7187='https://docs.aws.amazon.com/workspaces/latest/adminguide/encrypt-workspaces.html'
CHECK_CAF_EPIC_extra7187='Infrastructure Security'
extra7187(){
for regx in $REGIONS; do
RT_VOL_UNENCRYPTED_WORKSPACES_ID_LIST=$($AWSCLI workspaces describe-workspaces --query "Workspaces[?RootVolumeEncryptionEnabled!=\`true\`].WorkspaceId" ${PROFILE_OPT} --region "${regx}" --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|Could not connect to the endpoint URL|AuthorizationError' <<< "$RT_VOL_UNENCRYPTED_WORKSPACES_ID_LIST"; then
textInfo "$regx: Access Denied trying to describe workspaces" "$regx"
continue
fi
USERVOL_UNENCRYPTED_WORKSPACES_ID_LIST=$($AWSCLI workspaces describe-workspaces --query "Workspaces[?UserVolumeEncryptionEnabled!=\`true\`].WorkspaceId" ${PROFILE_OPT} --region "${regx}" --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|Could not connect to the endpoint URL|AuthorizationError' <<< "$USERVOL_UNENCRYPTED_WORKSPACES_ID_LIST"; then
textInfo "$regx: Access Denied trying to describe workspaces" "$regx"
continue
fi
if [[ $RT_VOL_UNENCRYPTED_WORKSPACES_ID_LIST ]];then
for RTVL in $RT_VOL_UNENCRYPTED_WORKSPACES_ID_LIST;do
textFail "$regx: Found WorkSpaces: $RTVL with root volume unencrypted" "$regx" "$RTVL"
done
else
textPass "$regx: No Workspaces with unencrypted root volume found" "$regx"
fi
if [[ $USERVOL_UNENCRYPTED_WORKSPACES_ID_LIST ]];then
for UVL in $USERVOL_UNENCRYPTED_WORKSPACES_ID_LIST;do
textFail "$regx: Found WorkSpaces: $UVL with user volume unencrypted" "$regx" "$UVL"
done
else
textPass "$regx: No Workspaces with unencrypted user volume found" "$regx"
fi
done
}
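# Remediation sketch: WorkSpaces volumes can only be encrypted at launch, so a
# failing WorkSpace has to be rebuilt; all identifiers below are hypothetical:
#
# aws workspaces create-workspaces --workspaces \
#   'DirectoryId=d-1234567890,UserName=jdoe,BundleId=wsb-abcdefghi,RootVolumeEncryptionEnabled=true,UserVolumeEncryptionEnabled=true,VolumeEncryptionKey=alias/aws/workspaces'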

60 checks/check_extra7188 Normal file

@@ -0,0 +1,60 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7188="7.188"
CHECK_TITLE_extra7188="[extra7188] Ensure Radius server in DS is using the recommended security protocol"
CHECK_SCORED_extra7188="NOT_SCORED"
CHECK_CIS_LEVEL_extra7188="EXTRA"
CHECK_SEVERITY_extra7188="Medium"
CHECK_ASFF_TYPE_extra7188="Software and Configuration Checks/Industry and Regulatory Standards/CIS AWS Foundations Benchmark"
CHECK_ASFF_RESOURCE_TYPE_extra7188="AwsDirectoryService"
CHECK_ALTERNATE_check7188="extra7188"
CHECK_SERVICENAME_extra7188="ds"
CHECK_RISK_extra7188="As a best practice, you might need to configure the authentication protocol between the Microsoft AD DCs and the RADIUS/MFA server. Supported protocols are PAP, CHAP, MS-CHAPv1, and MS-CHAPv2. MS-CHAPv2 is recommended because it provides the strongest security of these options."
CHECK_REMEDIATION_extra7188="MS-CHAPv2 provides the strongest security of the options supported, and is therefore recommended"
CHECK_DOC_extra7188='https://aws.amazon.com/blogs/security/how-to-enable-multi-factor-authentication-for-amazon-workspaces-and-amazon-quicksight-by-using-microsoft-ad-and-on-premises-credentials/'
CHECK_CAF_EPIC_extra7188="Infrastructure Security"
extra7188(){
for regx in $REGIONS; do
LIST_OF_DIRECTORIES=$("${AWSCLI}" ds describe-directories $PROFILE_OPT --region "${regx}" --query 'DirectoryDescriptions[*]' --output json 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError|Could not connect to the endpoint URL' <<< "${LIST_OF_DIRECTORIES}"; then
textInfo "${regx}: Access Denied trying to describe directories" "${regx}"
continue
fi
if [[ $LIST_OF_DIRECTORIES && $LIST_OF_DIRECTORIES != '[]' ]]; then
LIST_OF_DIRECTORIES_WITHOUT_RADIUS=$(echo "${LIST_OF_DIRECTORIES}" | jq '.[] | select(.RadiusSettings == null) | {DirectoryId}' | jq -r '.DirectoryId')
LIST_OF_DIRECTORIES_WITH_RADIUS=$(echo "${LIST_OF_DIRECTORIES}" | jq '.[] | select(.RadiusSettings)')
LIST_OF_DIRECTORIES_WITH_RADIUS_RECOMMENDED_SECURITY_PROTOCOL=$(echo "${LIST_OF_DIRECTORIES_WITH_RADIUS}" | jq 'select(.RadiusSettings.AuthenticationProtocol=="MS-CHAPv2") | {DirectoryId}' | jq -r '.DirectoryId')
LIST_OF_DIRECTORIES_WITHOUT_RADIUS_RECOMMENDED_SECURITY_PROTOCOL=$(echo "${LIST_OF_DIRECTORIES_WITH_RADIUS}" | jq 'select(.RadiusSettings.AuthenticationProtocol!="MS-CHAPv2") | {DirectoryId}' | jq -r '.DirectoryId')
if [[ $LIST_OF_DIRECTORIES_WITHOUT_RADIUS_RECOMMENDED_SECURITY_PROTOCOL ]]; then
for directory in $LIST_OF_DIRECTORIES_WITHOUT_RADIUS_RECOMMENDED_SECURITY_PROTOCOL; do
textFail "$regx: Radius server of directory: ${directory} does not have recommended security protocol" "$regx" "${directory}"
done
fi
if [[ $LIST_OF_DIRECTORIES_WITH_RADIUS_RECOMMENDED_SECURITY_PROTOCOL ]]; then
for directory in $LIST_OF_DIRECTORIES_WITH_RADIUS_RECOMMENDED_SECURITY_PROTOCOL; do
textPass "$regx: Radius server of directory: ${directory} has recommended security protocol" "$regx" "${directory}"
done
fi
if [[ $LIST_OF_DIRECTORIES_WITHOUT_RADIUS ]]; then
for directory in $LIST_OF_DIRECTORIES_WITHOUT_RADIUS; do
textPass "${regx}: Directory ${directory} does not have a Radius server" "${regx}" "${directory}"
done
fi
else
textPass "${regx}: No Directory Service directories found" "${regx}"
fi
done
}
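# Remediation sketch: switch an existing RADIUS configuration to MS-CHAPv2
# (directory ID, server address, and shared secret are hypothetical):
#
# aws ds update-radius \
#   --directory-id d-1234567890 \
#   --radius-settings 'RadiusServers=10.0.0.10,RadiusPort=1812,RadiusTimeout=5,RadiusRetries=3,SharedSecret=secret123,AuthenticationProtocol=MS-CHAPv2,UseSameUsername=true'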

60 checks/check_extra7189 Normal file

@@ -0,0 +1,60 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7189="7.189"
CHECK_TITLE_extra7189="[extra7189] Ensure Multi-Factor Authentication (MFA) using Radius Server is enabled in DS"
CHECK_SCORED_extra7189="NOT_SCORED"
CHECK_CIS_LEVEL_extra7189="EXTRA"
CHECK_SEVERITY_extra7189="Medium"
CHECK_ASFF_TYPE_extra7189="Software and Configuration Checks/Industry and Regulatory Standards/CIS AWS Foundations Benchmark"
CHECK_ASFF_RESOURCE_TYPE_extra7189="AwsDirectoryService"
CHECK_ALTERNATE_check7189="extra7189"
CHECK_SERVICENAME_extra7189="ds"
CHECK_RISK_extra7189="Multi-Factor Authentication (MFA) adds an extra layer of authentication assurance beyond traditional username and password."
CHECK_REMEDIATION_extra7189="Enabling MFA provides increased security to a user name and password as it requires the user to possess a solution that displays a time-sensitive authentication code."
CHECK_DOC_extra7189='https://docs.aws.amazon.com/directoryservice/latest/admin-guide/ms_ad_mfa.html'
CHECK_CAF_EPIC_extra7189="Infrastructure Security"
extra7189(){
for regx in $REGIONS; do
LIST_OF_DIRECTORIES=$("${AWSCLI}" ds describe-directories $PROFILE_OPT --region "${regx}" --query 'DirectoryDescriptions[*]' --output json 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError|Could not connect to the endpoint URL' <<< "${LIST_OF_DIRECTORIES}"; then
textInfo "${regx}: Access Denied trying to describe directories" "${regx}"
continue
fi
if [[ $LIST_OF_DIRECTORIES && $LIST_OF_DIRECTORIES != '[]' ]]; then
LIST_OF_DIRECTORIES_WITHOUT_RADIUS=$(echo "${LIST_OF_DIRECTORIES}" | jq '.[] | select(.RadiusSettings == null) | {DirectoryId}' | jq -r '.DirectoryId')
LIST_OF_DIRECTORIES_WITH_RADIUS=$(echo "${LIST_OF_DIRECTORIES}" | jq '.[] | select(.RadiusSettings)')
LIST_OF_DIRECTORIES_WITH_RADIUS_MFA_COMPLETED=$(echo "${LIST_OF_DIRECTORIES_WITH_RADIUS}" | jq 'select(.RadiusStatus=="Completed") | {DirectoryId}' | jq -r '.DirectoryId')
LIST_OF_DIRECTORIES_WITHOUT_RADIUS_MFA_COMPLETED=$(echo "${LIST_OF_DIRECTORIES_WITH_RADIUS}" | jq 'select(.RadiusStatus!="Completed") | {DirectoryId}' | jq -r '.DirectoryId')
if [[ $LIST_OF_DIRECTORIES_WITHOUT_RADIUS_MFA_COMPLETED ]]; then
for directory in $LIST_OF_DIRECTORIES_WITHOUT_RADIUS_MFA_COMPLETED; do
textFail "$regx: Directory: ${directory} does not have Radius MFA enabled successfully" "$regx" "${directory}"
done
fi
if [[ $LIST_OF_DIRECTORIES_WITH_RADIUS_MFA_COMPLETED ]]; then
for directory in $LIST_OF_DIRECTORIES_WITH_RADIUS_MFA_COMPLETED; do
textPass "$regx: Directory: ${directory} has Radius MFA enabled" "$regx" "${directory}"
done
fi
if [[ $LIST_OF_DIRECTORIES_WITHOUT_RADIUS ]]; then
for directory in $LIST_OF_DIRECTORIES_WITHOUT_RADIUS; do
textPass "${regx}: Directory ${directory} does not have Radius Server configured" "${regx}" "${directory}"
done
fi
else
textPass "${regx}: No Directory Service directories found" "${regx}"
fi
done
}
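# Remediation sketch: RADIUS-backed MFA is enabled with enable-radius, which
# takes the same settings shape as update-radius (all values hypothetical):
#
# aws ds enable-radius \
#   --directory-id d-1234567890 \
#   --radius-settings 'RadiusServers=10.0.0.10,RadiusPort=1812,RadiusTimeout=5,RadiusRetries=3,SharedSecret=secret123,AuthenticationProtocol=MS-CHAPv2,UseSameUsername=true'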

45 checks/check_extra7190 Normal file

@@ -0,0 +1,45 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7190="7.190"
CHECK_TITLE_extra7190="[extra7190] Ensure user maximum session duration is no longer than 10 hours."
CHECK_SCORED_extra7190="NOT_SCORED"
CHECK_CIS_LEVEL_extra7190="EXTRA"
CHECK_SEVERITY_extra7190="Medium"
CHECK_ASFF_TYPE_extra7190="Software and Configuration Checks/Industry and Regulatory Standards/CIS AWS Foundations Benchmark"
CHECK_ASFF_RESOURCE_TYPE_extra7190="AmazonAppStream"
CHECK_ALTERNATE_check7190="extra7190"
CHECK_SERVICENAME_extra7190="appstream"
CHECK_RISK_extra7190="Session durations longer than 10 hours should not be necessary, and a session running for malicious reasons would have more usage time than should be allowed."
CHECK_REMEDIATION_extra7190="Set the Maximum session duration to 600 minutes or less for the AppStream Fleet"
CHECK_DOC_extra7190='https://docs.aws.amazon.com/appstream2/latest/developerguide/set-up-stacks-fleets.html'
CHECK_CAF_EPIC_extra7190="Infrastructure Security"
extra7190(){
for regx in $REGIONS; do
LIST_OF_FLEETS_WITH_MAX_SESSION_DURATION_ABOVE_RECOMMENDED=$("${AWSCLI}" appstream describe-fleets $PROFILE_OPT --region "${regx}" --query 'Fleets[?MaxUserDurationInSeconds>=`36000`].Arn' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError|Could not connect to the endpoint URL' <<< "${LIST_OF_FLEETS_WITH_MAX_SESSION_DURATION_ABOVE_RECOMMENDED}"; then
textInfo "${regx}: Access Denied trying to describe appstream fleet(s)" "${regx}"
continue
fi
if [[ $LIST_OF_FLEETS_WITH_MAX_SESSION_DURATION_ABOVE_RECOMMENDED && $LIST_OF_FLEETS_WITH_MAX_SESSION_DURATION_ABOVE_RECOMMENDED != '[]' ]]; then
for Arn in $LIST_OF_FLEETS_WITH_MAX_SESSION_DURATION_ABOVE_RECOMMENDED; do
textFail "$regx: Fleet: ${Arn} has the maximum session duration configured for longer than 10 hours duration." "$regx" "${Arn}"
done
else
textPass "${regx}: No AppStream Fleets having a maximum session duration lasting longer than 10 hours found." "${regx}"
fi
done
}
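# Remediation sketch (fleet name is hypothetical): set a value under the
# 36000-second threshold used above, e.g. 8 hours:
#
# aws appstream update-fleet --name my-fleet --max-user-duration-in-seconds 28800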

45 checks/check_extra7191 Normal file

@@ -0,0 +1,45 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7191="7.191"
CHECK_TITLE_extra7191="[extra7191] Ensure session disconnect timeout is set to 5 minutes or less."
CHECK_SCORED_extra7191="NOT_SCORED"
CHECK_CIS_LEVEL_extra7191="EXTRA"
CHECK_SEVERITY_extra7191="Medium"
CHECK_ASFF_TYPE_extra7191="Software and Configuration Checks/Industry and Regulatory Standards/CIS AWS Foundations Benchmark"
CHECK_ASFF_RESOURCE_TYPE_extra7191="AmazonAppStream"
CHECK_ALTERNATE_check7191="extra7191"
CHECK_SERVICENAME_extra7191="appstream"
CHECK_RISK_extra7191="Disconnect timeout in minutes is the amount of time that a streaming session remains active after users disconnect."
CHECK_REMEDIATION_extra7191="Change the Disconnect timeout to 5 minutes or less for the AppStream Fleet"
CHECK_DOC_extra7191='https://docs.aws.amazon.com/appstream2/latest/developerguide/set-up-stacks-fleets.html'
CHECK_CAF_EPIC_extra7191="Infrastructure Security"
extra7191(){
for regx in $REGIONS; do
LIST_OF_FLEETS_WITH_SESSION_DISCONNECT_DURATION_ABOVE_RECOMMENDED=$("${AWSCLI}" appstream describe-fleets $PROFILE_OPT --region "${regx}" --query 'Fleets[?DisconnectTimeoutInSeconds>`300`].Arn' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError|Could not connect to the endpoint URL' <<< "${LIST_OF_FLEETS_WITH_SESSION_DISCONNECT_DURATION_ABOVE_RECOMMENDED}"; then
textInfo "${regx}: Access Denied trying to describe appstream fleet(s)" "${regx}"
continue
fi
if [[ $LIST_OF_FLEETS_WITH_SESSION_DISCONNECT_DURATION_ABOVE_RECOMMENDED && $LIST_OF_FLEETS_WITH_SESSION_DISCONNECT_DURATION_ABOVE_RECOMMENDED != '[]' ]]; then
for Arn in $LIST_OF_FLEETS_WITH_SESSION_DISCONNECT_DURATION_ABOVE_RECOMMENDED; do
textFail "$regx: Fleet: ${Arn} has the session disconnect timeout is set to more than 5 minutes." "$regx" "${Arn}"
done
else
textPass "${regx}: No AppStream Fleets having the session disconnect timeout set to more than 5 minutes found." "${regx}"
fi
done
}
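# Remediation sketch for a failing fleet (name hypothetical); 300 seconds is
# the 5-minute limit this check enforces:
#
# aws appstream update-fleet --name my-fleet --disconnect-timeout-in-seconds 300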

45 checks/check_extra7192 Normal file

@@ -0,0 +1,45 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7192="7.192"
CHECK_TITLE_extra7192="[extra7192] Ensure session idle disconnect timeout is set to 10 minutes or less."
CHECK_SCORED_extra7192="NOT_SCORED"
CHECK_CIS_LEVEL_extra7192="EXTRA"
CHECK_SEVERITY_extra7192="Medium"
CHECK_ASFF_TYPE_extra7192="Software and Configuration Checks/Industry and Regulatory Standards/CIS AWS Foundations Benchmark"
CHECK_ASFF_RESOURCE_TYPE_extra7192="AmazonAppStream"
CHECK_ALTERNATE_check7192="extra7192"
CHECK_SERVICENAME_extra7192="appstream"
CHECK_RISK_extra7192="Idle disconnect timeout in minutes is the amount of time that users can be inactive before they are disconnected from their streaming session, at which point the Disconnect timeout in minutes begins."
CHECK_REMEDIATION_extra7192="Change the session idle timeout to 10 minutes or less for the AppStream Fleet."
CHECK_DOC_extra7192='https://docs.aws.amazon.com/appstream2/latest/developerguide/set-up-stacks-fleets.html'
CHECK_CAF_EPIC_extra7192="Infrastructure Security"
extra7192(){
for regx in $REGIONS; do
LIST_OF_FLEETS_WITH_SESSION_IDLE_DISCONNECT_DURATION_ABOVE_RECOMMENDED=$("${AWSCLI}" appstream describe-fleets $PROFILE_OPT --region "${regx}" --query 'Fleets[?IdleDisconnectTimeoutInSeconds>`600`].Arn' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError|Could not connect to the endpoint URL' <<< "${LIST_OF_FLEETS_WITH_SESSION_IDLE_DISCONNECT_DURATION_ABOVE_RECOMMENDED}"; then
textInfo "${regx}: Access Denied trying to describe appstream fleet(s)" "${regx}"
continue
fi
if [[ $LIST_OF_FLEETS_WITH_SESSION_IDLE_DISCONNECT_DURATION_ABOVE_RECOMMENDED && $LIST_OF_FLEETS_WITH_SESSION_IDLE_DISCONNECT_DURATION_ABOVE_RECOMMENDED != '[]' ]]; then
for Arn in $LIST_OF_FLEETS_WITH_SESSION_IDLE_DISCONNECT_DURATION_ABOVE_RECOMMENDED; do
textFail "$regx: Fleet: ${Arn} has the session idle disconnect timeout is set to more than 10 minutes." "$regx" "${Arn}"
done
else
textPass "${regx}: No AppStream Fleets having the session idle disconnect timeout set to more than 10 minutes found." "${regx}"
fi
done
}
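# Remediation sketch (fleet name hypothetical); 600 seconds keeps the fleet at
# the 10-minute idle limit this check enforces:
#
# aws appstream update-fleet --name my-fleet --idle-disconnect-timeout-in-seconds 600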

45 checks/check_extra7193 Normal file

@@ -0,0 +1,45 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
CHECK_ID_extra7193="7.193"
CHECK_TITLE_extra7193="[extra7193] Ensure default Internet Access from your Amazon AppStream fleet streaming instances remains unchecked."
CHECK_SCORED_extra7193="NOT_SCORED"
CHECK_CIS_LEVEL_extra7193="EXTRA"
CHECK_SEVERITY_extra7193="Medium"
CHECK_ASFF_TYPE_extra7193="Software and Configuration Checks/Industry and Regulatory Standards/CIS AWS Foundations Benchmark"
CHECK_ASFF_RESOURCE_TYPE_extra7193="AmazonAppStream"
CHECK_ALTERNATE_check7193="extra7193"
CHECK_SERVICENAME_extra7193="appstream"
CHECK_RISK_extra7193="Default Internet Access from your fleet streaming instances should be controlled using a NAT gateway in the VPC."
CHECK_REMEDIATION_extra7193="Uncheck the default internet access for the AppStream Fleet."
CHECK_DOC_extra7193='https://docs.aws.amazon.com/appstream2/latest/developerguide/set-up-stacks-fleets.html'
CHECK_CAF_EPIC_extra7193="Infrastructure Security"
extra7193(){
for regx in $REGIONS; do
LIST_OF_FLEETS_WITH_DEFAULT_INTERNET_ACCESS_ENABLED=$("${AWSCLI}" appstream describe-fleets $PROFILE_OPT --region "${regx}" --query 'Fleets[?EnableDefaultInternetAccess==`true`].Arn' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError|Could not connect to the endpoint URL' <<< "${LIST_OF_FLEETS_WITH_DEFAULT_INTERNET_ACCESS_ENABLED}"; then
textInfo "${regx}: Access Denied trying to describe appstream fleet(s)" "${regx}"
continue
fi
if [[ $LIST_OF_FLEETS_WITH_DEFAULT_INTERNET_ACCESS_ENABLED && $LIST_OF_FLEETS_WITH_DEFAULT_INTERNET_ACCESS_ENABLED != '[]' ]]; then
for Arn in $LIST_OF_FLEETS_WITH_DEFAULT_INTERNET_ACCESS_ENABLED; do
textFail "$regx: Fleet: ${Arn} has default internet access enabled." "$regx" "${Arn}"
done
else
textPass "${regx}: No AppStream Fleets have default internet access enabled." "${regx}"
fi
done
}
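# Remediation sketch (fleet name hypothetical); the fleet may need to be
# stopped before default internet access can be toggled off:
#
# aws appstream stop-fleet --name my-fleet
# aws appstream update-fleet --name my-fleet --no-enable-default-internet-access
# aws appstream start-fleet --name my-fleet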

61 checks/check_extra7194 Normal file

@@ -0,0 +1,61 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2018) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
# Remediation:
#
# https://docs.aws.amazon.com/AmazonECR/latest/userguide/lp_creation.html
#
# aws ecr put-lifecycle-policy \
# --repository-name repository-name \
# --lifecycle-policy-text file://policy.json
CHECK_ID_extra7194="7.194"
CHECK_TITLE_extra7194="[extra7194] Check if ECR repositories have lifecycle policies enabled"
CHECK_SCORED_extra7194="NOT_SCORED"
CHECK_CIS_LEVEL_extra7194="EXTRA"
CHECK_SEVERITY_extra7194="Low"
CHECK_ALTERNATE_check7194="extra7194"
CHECK_SERVICENAME_extra7194="ecr"
CHECK_ASFF_RESOURCE_TYPE_extra7194="AwsEcrRepository"
CHECK_RISK_extra7194='Amazon ECR repositories run the risk of retaining huge volumes of images, increasing unnecessary cost.'
CHECK_REMEDIATION_extra7194='Open the Amazon ECR console. Create an ECR lifecycle policy.'
CHECK_DOC_extra7194='https://docs.aws.amazon.com/AmazonECR/latest/userguide/LifecyclePolicies.html'
CHECK_CAF_EPIC_extra7194=''
extra7194(){
for region in ${REGIONS}; do
# List ECR repositories
LIST_ECR_REPOS=$(${AWSCLI} ecr describe-repositories ${PROFILE_OPT} --region "${region}" --query "repositories[*].[repositoryName]" --output text 2>&1)
# Handle authorization errors
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError|Could not connect to the endpoint URL' <<< "${LIST_ECR_REPOS}"; then
textInfo "${region}: Access Denied trying to describe ECR repositories" "${region}"
continue
fi
if [[ -n "${LIST_ECR_REPOS}" ]]; then
for repo in ${LIST_ECR_REPOS}; do
# Check if a lifecycle policy exists
LIFECYCLE_POLICY=$($AWSCLI ecr get-lifecycle-policy ${PROFILE_OPT} --region "${region}" --repository-name "${repo}" --query "repositoryName" --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError|Could not connect to the endpoint URL' <<< "${LIFECYCLE_POLICY}"; then
textInfo "${region}: Access Denied trying to get lifecycle policy from repository: ${repo}" "${region}"
continue
elif grep -q -E 'LifecyclePolicyNotFoundException' <<< "$LIFECYCLE_POLICY"; then
textFail "${region}: ECR repository ${repo} has no lifecycle policy" "${region}" "${repo}"
else
textPass "${region}: ECR repository ${repo} has a lifecycle policy" "${region}" "${repo}"
fi
done
else
textPass "${region}: No ECR repositories found" "${region}"
fi
done
}


@@ -27,17 +27,26 @@ CHECK_CAF_EPIC_extra728='Data Protection'
extra728(){
for regx in $REGIONS; do
LIST_SQS=$($AWSCLI sqs list-queues $PROFILE_OPT --region $regx --query QueueUrls --output text 2>&1|grep -v ^None )
if [[ $(echo "$LIST_SQS" | grep -E 'AccessDenied|UnauthorizedOperation') ]]; then
LIST_SQS=$("$AWSCLI" sqs list-queues $PROFILE_OPT --region "$regx" --query QueueUrls --output text 2>&1|grep -v ^None )
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${LIST_SQS}"; then
textInfo "$regx: Access Denied trying to list queues" "$regx"
continue
fi
if [[ $LIST_SQS ]]; then
for queue in $LIST_SQS; do
# check if the queue attributes have KmsMasterKeyId set and therefore SSE enabled
SSE_ENABLED_QUEUE=$($AWSCLI sqs get-queue-attributes --queue-url $queue $PROFILE_OPT --region $regx --attribute-names All --query Attributes.KmsMasterKeyId --output text|grep -v ^None)
if [[ $SSE_ENABLED_QUEUE ]]; then
textPass "$regx: SQS queue $queue is using Server Side Encryption" "$regx" "$queue"
SSE_KMS_ENABLED_QUEUE=$("$AWSCLI" sqs get-queue-attributes --queue-url "$queue" $PROFILE_OPT --region "$regx" --attribute-names All --query Attributes.KmsMasterKeyId --output text)
SSE_SQS_ENABLED_QUEUE=$("$AWSCLI" sqs get-queue-attributes --queue-url "$queue" $PROFILE_OPT --region "$regx" --attribute-names All --query Attributes.SqsManagedSseEnabled --output text)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${SSE_SQS_ENABLED_QUEUE}"; then
textInfo "$regx: Access Denied trying to get queue attributes" "$regx"
continue
elif grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${SSE_KMS_ENABLED_QUEUE}"; then
textInfo "$regx: Access Denied trying to get queue attributes" "$regx"
continue
elif [[ "$SSE_KMS_ENABLED_QUEUE" != 'None' ]]; then
textPass "$regx: SQS queue $queue is using KMS Server Side Encryption" "$regx" "$queue"
elif [[ "$SSE_SQS_ENABLED_QUEUE" == 'true' ]]; then
textPass "$regx: SQS queue $queue is using SQS Server Side Encryption" "$regx" "$queue"
else
textFail "$regx: SQS queue $queue is not using Server Side Encryption" "$regx" "$queue"
fi
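# Remediation sketch for a failing queue (queue URL is hypothetical); either
# attribute enables server-side encryption (SqsManagedSseEnabled for SSE-SQS,
# or KmsMasterKeyId=alias/aws/sqs for SSE-KMS):
#
# aws sqs set-queue-attributes \
#   --queue-url https://sqs.us-east-1.amazonaws.com/123456789012/my-queue \
#   --attributes SqsManagedSseEnabled=true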


@@ -36,6 +36,6 @@ extra732(){
fi
done
else
textInfo "$REGION: No CloudFront distributions found"
textInfo "$REGION: No CloudFront distributions found" "$REGION" "$ACCOUNT_NUM"
fi
}


@@ -26,7 +26,7 @@ CHECK_CAF_EPIC_extra740='Data Protection'
extra740(){
# This does NOT use max-items, which would limit the number of items
# considered. It considers all snapshots, but only reports at most
# considered. It considers all snapshots, but only reports at most
# max-items passing and max-items failing.
for regx in ${REGIONS}; do
UNENCRYPTED_SNAPSHOTS=$(${AWSCLI} ec2 describe-snapshots ${PROFILE_OPT} \
@@ -36,8 +36,8 @@ extra740(){
if [[ $(echo "$UNENCRYPTED_SNAPSHOTS" | grep -E 'AccessDenied|UnauthorizedOperation|AuthorizationError') ]]; then
textInfo "$regx: Access Denied trying to describe snapshots" "$regx"
continue
fi
fi
ENCRYPTED_SNAPSHOTS=$(${AWSCLI} ec2 describe-snapshots ${PROFILE_OPT} \
--region ${regx} --owner-ids ${ACCOUNT_NUM} --output text \
--query 'Snapshots[?Encrypted==`true`]|[*].{Id:SnapshotId}' 2>&1 \
@@ -45,7 +45,7 @@ extra740(){
if [[ $(echo "$ENCRYPTED_SNAPSHOTS" | grep -E 'AccessDenied|UnauthorizedOperation|AuthorizationError') ]]; then
textInfo "$regx: Access Denied trying to describe snapshots" "$regx"
continue
fi
fi
typeset -i unencrypted
typeset -i encrypted
unencrypted=0
@@ -73,10 +73,10 @@ extra740(){
typeset -i total
total=${encrypted}+${unencrypted}
if [[ "${unencrypted}" -ge "${MAXITEMS}" ]]; then
textFail "${unencrypted} unencrypted snapshots out of ${total} snapshots found. Only the first ${MAXITEMS} unencrypted snapshots are reported!"
textFail "${unencrypted} unencrypted snapshots out of ${total} snapshots found. Only the first ${MAXITEMS} unencrypted snapshots are reported!" "${regx}"
fi
if [[ "${encrypted}" -ge "${MAXITEMS}" ]]; then
textPass "${encrypted} encrypted snapshots out of ${total} snapshots found. Only the first ${MAXITEMS} encrypted snapshots are reported."
textPass "${encrypted} encrypted snapshots out of ${total} snapshots found. Only the first ${MAXITEMS} encrypted snapshots are reported." "${regx}"
fi
# Bit of 'bc' magic to print something like 10.42% or 0.85% or similar. 'bc' has a
# bug where it will never print leading zeros. So 0.5 is output as ".5". This has a
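# Minimal illustration of the bc leading-zero quirk described above: printf
# re-adds the zero that bc drops.
#
# printf '%.2f%%\n' "$(bc -l <<< '5/960*100')"   # prints 0.52% rather than .52%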


@@ -18,23 +18,29 @@ CHECK_SEVERITY_extra745="Medium"
CHECK_ASFF_RESOURCE_TYPE_extra745="AwsApiGatewayRestApi"
CHECK_ALTERNATE_check745="extra745"
CHECK_SERVICENAME_extra745="apigateway"
CHECK_RISK_extra745='If accessible from internet without restrictions opens up attack / abuse surface for any malicious user.'
CHECK_REMEDIATION_extra745='Verify that any public Api Gateway is protected and audited. Detective controls for common risks should be implemented.'
CHECK_RISK_extra745='If accessible from the internet, it opens up an attack / abuse surface for any malicious user.'
CHECK_REMEDIATION_extra745='Make Api Gateway private if a public endpoint is not necessary. Detective controls for common risks should be implemented.'
CHECK_DOC_extra745='https://d1.awsstatic.com/whitepapers/api-gateway-security.pdf?svrd_sip6'
CHECK_CAF_EPIC_extra745='Infrastructure Security'
extra745(){
for regx in $REGIONS; do
LIST_OF_REST_APIS=$($AWSCLI $PROFILE_OPT --region $regx apigateway get-rest-apis --query 'items[*].id' --output text 2>&1)
if [[ $(echo "$LIST_OF_REST_APIS" | grep -E 'AccessDenied|UnauthorizedOperation|AuthorizationError') ]]; then
LIST_OF_REST_APIS=$("$AWSCLI" $PROFILE_OPT --region "$regx" apigateway get-rest-apis --query 'items[*].id' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "$LIST_OF_REST_APIS"; then
textInfo "$regx: Access Denied trying to get rest APIs" "$regx"
continue
fi
if [[ $LIST_OF_REST_APIS ]];then
for api in $LIST_OF_REST_APIS; do
API_GW_NAME=$($AWSCLI apigateway get-rest-apis $PROFILE_OPT --region $regx --query "items[?id==\`$api\`].name" --output text)
ENDPOINT_CONFIG_TYPE=$($AWSCLI $PROFILE_OPT --region $regx apigateway get-rest-api --rest-api-id $api --query endpointConfiguration.types --output text)
if [[ $ENDPOINT_CONFIG_TYPE ]]; then
API_GW_NAME=$("$AWSCLI" apigateway get-rest-apis $PROFILE_OPT --region $regx --query "items[?id==\`$api\`].name" --output text)
ENDPOINT_CONFIG_TYPE=$("$AWSCLI" $PROFILE_OPT --region "$regx" apigateway get-rest-api --rest-api-id "$api" --query endpointConfiguration.types --output text)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "$API_GW_NAME"; then
textInfo "$regx: Access Denied trying to get rest APIs" "$regx"
continue
elif grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "$ENDPOINT_CONFIG_TYPE"; then
textInfo "$regx: Access Denied trying to get rest APIs" "$regx"
continue
elif [[ $ENDPOINT_CONFIG_TYPE ]]; then
case $ENDPOINT_CONFIG_TYPE in
PRIVATE )
textPass "$regx: API Gateway $API_GW_NAME ID $api is set as $ENDPOINT_CONFIG_TYPE" "$regx" "$API_GW_NAME"
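# Remediation sketch (API ID is hypothetical): a REGIONAL endpoint can be
# converted to PRIVATE with a single patch operation:
#
# aws apigateway update-rest-api \
#   --rest-api-id a1b2c3d4e5 \
#   --patch-operations op=replace,path=/endpointConfiguration/types/REGIONAL,value=PRIVATE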


@@ -27,24 +27,22 @@ extra762(){
# regex to match OBSOLETE runtimes in string functionName%runtime
# https://docs.aws.amazon.com/lambda/latest/dg/runtime-support-policy.html
OBSOLETE='%(nodejs4.3|nodejs4.3-edge|nodejs6.10|nodejs8.10|dotnetcore1.0|dotnetcore2.0)'
OBSOLETE='nodejs4.3|nodejs4.3-edge|nodejs6.10|nodejs8.10|dotnetcore1.0|dotnetcore2.0|dotnetcore2.1|python3.6|python2.7|ruby2.5|nodejs10.x|nodejs'
for regx in $REGIONS; do
LIST_OF_FUNCTIONS=$($AWSCLI lambda list-functions $PROFILE_OPT --region $regx --output text --query 'Functions[*].{R:Runtime,N:FunctionName}' 2>&1| tr "\t" "%" )
if [[ $(echo "$LIST_OF_FUNCTIONS" | grep -E 'AccessDenied|UnauthorizedOperation|AuthorizationError') ]]; then
LIST_OF_FUNCTIONS=$("$AWSCLI" lambda list-functions $PROFILE_OPT --region "$regx" --query 'Functions[*].[Runtime,FunctionName]' --output text 2>&1)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "$LIST_OF_FUNCTIONS"; then
textInfo "$regx: Access Denied trying to list functions" "$regx"
continue
fi
if [[ $LIST_OF_FUNCTIONS ]]; then
for lambdafunction in $LIST_OF_FUNCTIONS;do
fname=$(echo "$lambdafunction" | cut -d'%' -f1)
runtime=$(echo "$lambdafunction" | cut -d'%' -f2)
if echo "$lambdafunction" | grep -Eq $OBSOLETE ; then
textFail "$regx: Obsolete runtime: ${runtime} used by: ${fname}" "$regx" "${fname}"
while read -r FUNCTION_RUNTIME FUNCTION_NAME; do
if grep -wEq $OBSOLETE <<< "${FUNCTION_RUNTIME}" ; then
textFail "$regx: Obsolete runtime: ${FUNCTION_RUNTIME} used by: ${FUNCTION_NAME}" "$regx" "${FUNCTION_NAME}"
else
textPass "$regx: Supported runtime: ${runtime} used by: ${fname}" "$regx" "${fname}"
textPass "$regx: Supported runtime: ${FUNCTION_RUNTIME} used by: ${FUNCTION_NAME}" "$regx" "${FUNCTION_NAME}"
fi
done
done<<<"${LIST_OF_FUNCTIONS}"
else
textInfo "$regx: No Lambda functions found" "$regx"
fi
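# Companion sketch: list every function still on one (hypothetically chosen)
# obsolete runtime, mirroring the query used above:
#
# aws lambda list-functions \
#   --query 'Functions[?Runtime==`python2.7`].FunctionName' \
#   --output text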


@@ -41,7 +41,7 @@ extra763(){
BUCKET_VERSIONING_ENABLED=$("${AWSCLI}" s3api get-bucket-versioning --bucket "${bucket}" ${PROFILE_OPT} --region "${BUCKET_REGION}" --query Status --output text 2>&1)
if grep -q 'AccessDenied' <<< "${BUCKET_VERSIONING_ENABLED}"; then
textInfo "${BUCKET_REGION}: Access Denied Trying to Get Bucket Versioning for $bucket"
textInfo "${BUCKET_REGION}: Access Denied Trying to Get Bucket Versioning for $bucket" "${REGION}"
continue
fi
if grep -q "^Enabled$" <<< "${BUCKET_VERSIONING_ENABLED}"; then


@@ -24,17 +24,36 @@ CHECK_DOC_extra767='https://docs.aws.amazon.com/AmazonCloudFront/latest/Develope
CHECK_CAF_EPIC_extra767='Data Protection'
extra767(){
LIST_OF_DISTRIBUTIONS=$($AWSCLI cloudfront list-distributions --query 'DistributionList.Items[*].Id' $PROFILE_OPT --output text|grep -v ^None)
LIST_OF_DISTRIBUTIONS=$("${AWSCLI}" cloudfront list-distributions --query 'DistributionList.Items[*].Id' ${PROFILE_OPT} --output text | grep -v ^None)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${LIST_OF_DISTRIBUTIONS}"; then
textInfo "${REGION}: Access Denied Trying to list CloudFront distributions" "${REGION}"
exit
fi
if [[ $LIST_OF_DISTRIBUTIONS ]];then
for dist in $LIST_OF_DISTRIBUTIONS; do
CHECK_FLE=$($AWSCLI cloudfront get-distribution --id $dist --query Distribution.DistributionConfig.DefaultCacheBehavior.FieldLevelEncryptionId $PROFILE_OPT --output text)
if [[ $CHECK_FLE ]]; then
textPass "CloudFront distribution $dist has Field Level Encryption enabled" "$regx" "$dist"
for distribution in $LIST_OF_DISTRIBUTIONS; do
CHECK_ALLOWED_METHODS=$("${AWSCLI}" cloudfront get-distribution --id "${distribution}" --query Distribution.DistributionConfig.DefaultCacheBehavior.AllowedMethods.Items ${PROFILE_OPT} --output text)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${CHECK_ALLOWED_METHODS}"; then
textInfo "${REGION}: Access Denied Trying to get CloudFront distributions" "${REGION}"
continue
fi
if [[ "$CHECK_ALLOWED_METHODS" =~ "POST" ]]; then
CHECK_FIELD_LEVEL_ENCRYPTION=$("${AWSCLI}" cloudfront get-distribution --id "${distribution}" --query Distribution.DistributionConfig.DefaultCacheBehavior.FieldLevelEncryptionId ${PROFILE_OPT} --output text)
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${CHECK_FIELD_LEVEL_ENCRYPTION}"; then
textInfo "${REGION}: Access Denied Trying to get CloudFront distributions" "${REGION}"
continue
fi
if [[ $CHECK_FIELD_LEVEL_ENCRYPTION ]]; then
textPass "CloudFront distribution ${distribution} has Field Level Encryption enabled" "${REGION}" "${distribution}"
else
textFail "CloudFront distribution ${distribution} has Field Level Encryption disabled!" "${REGION}" "${distribution}"
fi
else
textFail "CloudFront distribution $dist has Field Level Encryption disabled!" "$regx" "$dist"
textPass "CloudFront distribution ${distribution} does not support POST requests" "${REGION}" "${distribution}"
fi
done
else
textInfo "No CloudFront distributions found" "$regx"
textPass "No CloudFront distributions found" "${REGION}"
fi
}


@@ -37,6 +37,6 @@ extra773(){
fi
done
else
textInfo "No CloudFront distributions found"
textInfo "No CloudFront distributions found" "us-east-1"
fi
}


@@ -14,7 +14,6 @@
GROUP_ID[10]='hipaa'
GROUP_NUMBER[10]='10.0'
GROUP_TITLE[10]='HIPAA Compliance - ONLY AS REFERENCE - [hipaa] ****************'
GROUP_RUN_BY_DEFAULT[10]='N' # run it when execute_all is called
GROUP_CHECKS[10]='check12,check113,check23,check26,check27,check29,extra718,extra725,extra72,extra75,extra717,extra729,extra734,check38,extra73,extra740,extra735,check112,check13,check15,check16,check17,check18,check19,check21,check24,check28,check31,check310,check311,check312,check313,check314,check32,check33,check34,check35,check36,check37,check39,extra792'
# Resources:


@@ -14,9 +14,7 @@
GROUP_ID[11]='secrets'
GROUP_NUMBER[11]='11.0'
GROUP_TITLE[11]='Look for keys secrets or passwords around resources - [secrets]'
GROUP_RUN_BY_DEFAULT[11]='N' # but it runs when execute_all is called (default)
GROUP_CHECKS[11]='extra741,extra742,extra759,extra760,extra768,extra775,extra7141'
# requires https://github.com/Yelp/detect-secrets
# `pip install detect-secrets`
# `pip install detect-secrets`

View File

@@ -14,6 +14,4 @@
GROUP_ID[12]='apigateway'
GROUP_NUMBER[12]='12.0'
GROUP_TITLE[12]='API Gateway security checks - [apigateway] ********************'
GROUP_RUN_BY_DEFAULT[12]='N' # run it when execute_all is called
GROUP_CHECKS[12]='extra722,extra743,extra744,extra745,extra746'

View File

@@ -14,5 +14,4 @@
GROUP_ID[13]='rds'
GROUP_NUMBER[13]='13.0'
GROUP_TITLE[13]='RDS security checks - [rds] ***********************************'
GROUP_RUN_BY_DEFAULT[13]='N' # run it when execute_all is called
GROUP_CHECKS[13]='extra78,extra723,extra735,extra739,extra747,extra7113,extra7131,extra7132,extra7133'

View File

@@ -14,5 +14,4 @@
GROUP_ID[14]='elasticsearch'
GROUP_NUMBER[14]='14.0'
GROUP_TITLE[14]='Elasticsearch related security checks - [elasticsearch] *******'
GROUP_RUN_BY_DEFAULT[14]='N' # run it when execute_all is called
GROUP_CHECKS[14]='extra715,extra716,extra779,extra780,extra781,extra782,extra783,extra784,extra785,extra787,extra788,extra7101'

View File

@@ -14,7 +14,6 @@
GROUP_ID[15]='pci'
GROUP_NUMBER[15]='15.0'
GROUP_TITLE[15]='PCI-DSS v3.2.1 Readiness - ONLY AS REFERENCE - [pci] **********'
GROUP_RUN_BY_DEFAULT[15]='N' # run it when execute_all is called
GROUP_CHECKS[15]='check11,check12,check13,check14,check15,check16,check17,check18,check19,check110,check112,check113,check114,check116,check21,check23,check25,check26,check27,check28,check29,check314,check36,check38,check43,extra711,extra713,extra717,extra718,extra72,extra729,extra735,extra738,extra740,extra744,extra748,extra75,extra750,extra751,extra753,extra754,extra755,extra773,extra78,extra780,extra781,extra782,extra783,extra784,extra785,extra787,extra788,extra798'
# Resources:

View File

@@ -14,7 +14,6 @@
GROUP_ID[16]='trustboundaries'
GROUP_NUMBER[16]='16.0'
GROUP_TITLE[16]='Find cross-account trust boundaries - [trustboundaries] *******'
GROUP_RUN_BY_DEFAULT[16]='N' # run it when execute_all is called
GROUP_CHECKS[16]='extra789,extra790'
# Single account environment: No action required. The AWS account number will be automatically added by the checks.

View File

@@ -14,7 +14,6 @@
GROUP_ID[17]='internet-exposed'
GROUP_NUMBER[17]='17.0'
GROUP_TITLE[17]='Find resources exposed to the internet - [internet-exposed] ***'
GROUP_RUN_BY_DEFAULT[17]='N' # run it when execute_all is called
GROUP_CHECKS[17]='check41,check42,check45,check46,extra72,extra73,extra74,extra76,extra77,extra78,extra79,extra710,extra711,extra716,extra723,extra727,extra731,extra736,extra738,extra745,extra748,extra749,extra750,extra751,extra752,extra753,extra754,extra755,extra770,extra771,extra778,extra779,extra787,extra788,extra795,extra796,extra798,extra7102,extra7134,extra7135,extra7136,extra7137,extra7138'
# 4.1 [check41] Ensure no security groups allow ingress from 0.0.0.0/0 or ::/0 to port 22 (Scored) [group4, cislevel1, cislevel2]

View File

@@ -14,7 +14,6 @@
GROUP_ID[18]='iso27001'
GROUP_NUMBER[18]='18.0'
GROUP_TITLE[18]='ISO 27001:2013 Readiness - ONLY AS REFERENCE - [iso27001] *****'
GROUP_RUN_BY_DEFAULT[18]='N' # run it when execute_all is called
GROUP_CHECKS[18]='check11,check110,check111,check112,check113,check114,check115,check116,check119,check12,check122,check13,check14,check15,check16,check17,check18,check19,check21,check22,check23,check24,check25,check26,check27,check28,check29,check31,check310,check311,check312,check313,check314,check32,check33,check34,check35,check36,check37,check38,check39,check41,check42,check43,check44,extra71,extra710,extra7100,extra711,extra7113,extra7123,extra7125,extra7126,extra7128,extra7129,extra713,extra714,extra7130,extra718,extra719,extra72,extra720,extra721,extra722,extra723,extra724,extra725,extra726,extra727,extra728,extra729,extra731,extra73,extra731,extra735,extra739,extra74,extra741,extra747,extra748,extra75,extra757,extra758,extra759,extra76,extra760,extra761,extra762,extra763,extra764,extra765,extra767,extra768,extra769,extra77,extra771,extra772,extra774,extra776,extra777,extra778,extra78,extra789,extra79,extra790,extra792,extra793,extra794,extra795,extra796,extra798'
# # Category Objective ID Objective Name Prowler check ID Check Summary
@@ -103,7 +102,7 @@ GROUP_CHECKS[18]='check11,check110,check111,check112,check113,check114,check115,
# 83 A.12 Operations Security A.12.4 Logging and Monitoring check39 Ensure a log metric filter and alarm exist for AWS Config configuration changes
# 84 A.12 Operations Security A.12.4 Logging and Monitoring check39 Check if CloudFront distributions have logging enabled
# 85 A.12 Operations Security A.12.4 Logging and Monitoring extra719 Check if Route53 public hosted zones are logging queries to CloudWatch Logs
# 86 A.12 Operations Security A.12.4 Logging and Monitoring extra720 Check if Lambda functions invoke API operations are being recorded by CloudTrail
# 86 A.12 Operations Security A.12.4 Logging and Monitoring extra720 Check if Lambda functions invoke API operations are being recorded by CloudTrail
# 87 A.12 Operations Security A.12.4 Logging and Monitoring extra722 Check if API Gateway has logging enabled
# 88 A.12 Operations Security A.12.4 Logging and Monitoring check38 Ensure a log metric filter and alarm exist for S3 bucket policy changes
# 89 A.12 Operations Security A.12.4 Logging and Monitoring check37 Ensure a log metric filter and alarm exist for disabling or scheduled deletion of customer created CMKs
@@ -136,7 +135,7 @@ GROUP_CHECKS[18]='check11,check110,check111,check112,check113,check114,check115,
#116 A.12 Operations Security A.12.6 Technical Vulnerability Management check23 Ensure the S3 bucket CloudTrail logs to is not publicly accessible
#117 A.12 Operations Security A.12.6 Technical Vulnerability Management extra713 Check if GuardDuty is enabled
#118 A.12 Operations Security A.12.6 Technical Vulnerability Management extra726 Check Trusted Advisor for errors and warnings
#119 A.12 Operations Security A.12.6 Technical Vulnerability Management extra776 Check if ECR image scan found vulnerabilities in the newest image version
#119 A.12 Operations Security A.12.6 Technical Vulnerability Management extra776 Check if ECR image scan found vulnerabilities in the newest image version
#120 A.13 Communications Security A.13.1 Network Security Management check43 Ensure the default security group of every VPC restricts all traffic
#121 A.13 Communications Security A.13.1 Network Security Management check42 Ensure no security groups allow ingress from 0.0.0.0/0 to port 3389
#122 A.13 Communications Security A.13.1 Network Security Management check41 Ensure no security groups allow ingress from 0.0.0.0/0 to port 22

View File

@@ -11,5 +11,4 @@
GROUP_ID[19]='eks-cis'
GROUP_NUMBER[19]='19.0'
GROUP_TITLE[19]='CIS EKS Benchmark - [eks-cis] *********************************'
GROUP_RUN_BY_DEFAULT[19]='N' # run it when execute_all is called
GROUP_CHECKS[19]='extra765,extra794,extra795,extra796,extra797'

View File

@@ -11,5 +11,4 @@
GROUP_ID[1]='group1'
GROUP_NUMBER[1]='1.0'
GROUP_TITLE[1]='Identity and Access Management - CIS only - [group1] ***********'
GROUP_RUN_BY_DEFAULT[1]='Y' # run it when execute_all is called
GROUP_CHECKS[1]='check11,check12,check13,check14,check15,check16,check17,check18,check19,check110,check111,check112,check113,check114,check115,check116,check117,check118,check119,check120,check121,check122'

View File

@@ -14,7 +14,6 @@
GROUP_ID[20]='ffiec'
GROUP_NUMBER[20]='20.0'
GROUP_TITLE[20]='FFIEC Cybersecurity Readiness - ONLY AS REFERENCE - [ffiec] ***'
GROUP_RUN_BY_DEFAULT[20]='N' # run it when execute_all is called
GROUP_CHECKS[20]='check11,check12,check13,check14,check16,check18,check19,check21,check23,check25,check29,check29,check31,check32,check33,check34,check35,check36,check37,check37,check38,check39,check41,check42,check43,check110,check112,check113,check116,check310,check311,check312,check313,check314,extra72,extra76,extra78,extra711,extra723,extra729,extra731,extra734,extra735,extra763,extra792'
# References:

View File

@@ -14,7 +14,6 @@
GROUP_ID[21]='soc2'
GROUP_NUMBER[21]='21.0'
GROUP_TITLE[21]='SOC2 Readiness - ONLY AS REFERENCE - [soc2] *******************'
GROUP_RUN_BY_DEFAULT[21]='N' # run it when execute_all is called
GROUP_CHECKS[21]='check110,check111,check113,check12,check122,check13,check15,check16,check17,check18,check19,check21,check31,check310,check32,check33,check34,check35,check36,check37,check38,check39,check41,check42,check43,extra711,extra72,extra723,extra729,extra731,extra734,extra735,extra739,extra76,extra78,extra792'
# References:

View File

@@ -14,6 +14,4 @@
GROUP_ID[22]='sagemaker'
GROUP_NUMBER[22]='22.0'
GROUP_TITLE[22]='Amazon SageMaker related security checks - [sagemaker] ********'
GROUP_RUN_BY_DEFAULT[22]='N' # run it when execute_all is called
GROUP_CHECKS[22]='extra7103,extra7104,extra7111,extra7112,extra7105,extra7106,extra7107,extra7108,extra7109,extra7110'

View File

@@ -14,7 +14,6 @@
GROUP_ID[23]='ens'
GROUP_NUMBER[23]='23.0'
GROUP_TITLE[23]='ENS Esquema Nacional de Seguridad security checks - [ens] *****'
GROUP_RUN_BY_DEFAULT[23]='N' # run it when execute_all is called
GROUP_CHECKS[23]='extra733,extra7123,check13,check14,check121,extra7100,check120,check116,extra7124,check12,extra7125,check14,check13,check21,check25,extra7127,check35,check24,check31,check36,check32,check33,check34,check22,extra71,check23,check23,check27,check37,extra736,check28,extra713,check21,check29,extra793,extra792,extra764,extra738,check43,extra74,extra710,extra75,check41,check42,extra749,extra750,extra751,extra752,extra753,extra754,extra755,extra7128,extra729,extra761,extra740,extra735,extra734,extra728,extra781,extra773,extra744,extra7126,extra7129'
# ENS Control ID for AWS;Prowler checks that apply

View File

@@ -14,5 +14,4 @@
GROUP_ID[24]='glue'
GROUP_NUMBER[24]='24.0'
GROUP_TITLE[24]='Amazon Glue related security checks - [glue] ******************'
GROUP_RUN_BY_DEFAULT[24]='N' # run it when execute_all is called
GROUP_CHECKS[24]='extra7114,extra7115,extra7116,extra7117,extra7118,extra7119,extra7120,extra7121,extra7122'

View File

@@ -14,7 +14,6 @@
GROUP_ID[25]='ftr'
GROUP_NUMBER[25]='25.0'
GROUP_TITLE[25]='Amazon FTR related security checks - [ftr] ********************'
GROUP_RUN_BY_DEFAULT[25]='N' # run it when execute_all is called
GROUP_CHECKS[25]='check11,check12,check13,check14,check15,check16,check17,check18,check19,check110,check111,check111,check112,check113,check117,check118,check122,check21,check22,extra759,extra760,extra768,extra775,extra797,extra7141,extra73'
# Checks from AWS FTR https://apn-checklists.s3.amazonaws.com/foundational/partner-hosted/partner-hosted/CVLHEC5X7.html
@@ -43,4 +42,3 @@ GROUP_CHECKS[25]='check11,check12,check13,check14,check15,check16,check17,check1
# 7.97 [extra797] Ensure Kubernetes Secrets are encrypted using Customer Master Keys (CMKs) - eks [Medium]
# 7.141 [extra7141] Find secrets in SSM Documents - ssm [Critical]
# 7.3 [extra73] Ensure there are no S3 buckets open to Everyone or Any AWS user - s3 [Critical]

View File

@@ -1,6 +1,6 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2018) by Toni de la Fuente
# Prowler - the handy cloud security tool (copyright 2020) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
@@ -11,10 +11,7 @@
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
# AWS-CLI detector variable
JQ=$(which jq)
if [ -z "${JQ}" ]; then
echo -e "\n$RED ERROR!$NORMAL jq not found. Make sure it is installed correctly and in your \$PATH\n"
EXITCODE=1
exit $EXITCODE
fi
GROUP_ID[26]='ds'
GROUP_NUMBER[26]='26.0'
GROUP_TITLE[26]='Amazon Directory Service related security checks - [ds] *******'
GROUP_CHECKS[26]='extra7181,extra7182,extra7183,extra7184'

View File

@@ -11,5 +11,4 @@
GROUP_ID[2]='group2'
GROUP_NUMBER[2]='2.0'
GROUP_TITLE[2]='Logging - CIS only - [group2] **********************************'
GROUP_RUN_BY_DEFAULT[2]='Y' # run it when execute_all is called
GROUP_CHECKS[2]='check21,check22,check23,check24,check25,check26,check27,check28,check29'

View File

@@ -11,5 +11,4 @@
GROUP_ID[3]='group3'
GROUP_NUMBER[3]='3.0'
GROUP_TITLE[3]='Monitoring - CIS only - [group3] *******************************'
GROUP_RUN_BY_DEFAULT[3]='Y' # run it when execute_all is called
GROUP_CHECKS[3]='check31,check32,check33,check34,check35,check36,check37,check38,check39,check310,check311,check312,check313,check314'

View File

@@ -11,5 +11,4 @@
GROUP_ID[4]='group4'
GROUP_NUMBER[4]='4.0'
GROUP_TITLE[4]='Networking - CIS only - [group4] *******************************'
GROUP_RUN_BY_DEFAULT[4]='Y' # run it when execute_all is called
GROUP_CHECKS[4]='check41,check42,check43,check44,check45,check46'

View File

@@ -11,5 +11,4 @@
GROUP_ID[5]='cislevel1'
GROUP_NUMBER[5]='5.0'
GROUP_TITLE[5]='CIS Level 1 - CIS only - [cislevel1] ***************************'
GROUP_RUN_BY_DEFAULT[5]='N' # run it when execute_all is called
GROUP_CHECKS[5]='check11,check12,check13,check14,check15,check16,check17,check18,check19,check110,check111,check112,check113,check115,check116,check117,check118,check119,check120,check122,check21,check23,check24,check25,check26,check31,check32,check33,check34,check35,check38,check312,check313,check314,check41,check42'

View File

@@ -11,5 +11,4 @@
GROUP_ID[6]='cislevel2'
GROUP_NUMBER[6]='6.0'
GROUP_TITLE[6]='CIS Level 2 - CIS only - [cislevel2] ***************************'
GROUP_RUN_BY_DEFAULT[6]='N' # run it when execute_all is called
GROUP_CHECKS[6]='check11,check12,check13,check14,check15,check16,check17,check18,check19,check110,check111,check112,check113,check114,check115,check116,check117,check118,check119,check120,check121,check122,check21,check22,check23,check24,check25,check26,check27,check28,check29,check31,check32,check33,check34,check35,check36,check37,check38,check39,check310,check311,check312,check313,check314,check41,check42,check43,check44'

View File

@@ -14,8 +14,7 @@
GROUP_ID[7]='extras'
GROUP_NUMBER[7]='7.0'
GROUP_TITLE[7]='Extras - all non CIS specific checks - [extras] ****************'
GROUP_RUN_BY_DEFAULT[7]='Y' # run it when execute_all is called
GROUP_CHECKS[7]='extra71,extra72,extra73,extra74,extra75,extra76,extra77,extra78,extra79,extra710,extra711,extra712,extra713,extra714,extra715,extra716,extra717,extra718,extra719,extra720,extra721,extra722,extra723,extra724,extra725,extra726,extra727,extra728,extra729,extra730,extra731,extra732,extra733,extra734,extra735,extra736,extra738,extra739,extra740,extra741,extra742,extra743,extra744,extra745,extra746,extra747,extra748,extra749,extra750,extra751,extra752,extra753,extra754,extra755,extra757,extra758,extra761,extra762,extra763,extra764,extra765,extra767,extra768,extra769,extra770,extra771,extra772,extra773,extra774,extra775,extra776,extra777,extra778,extra779,extra780,extra781,extra782,extra783,extra784,extra785,extra786,extra787,extra788,extra791,extra792,extra793,extra794,extra795,extra796,extra797,extra798,extra799,extra7100,extra7101,extra7102,extra7103,extra7104,extra7105,extra7106,extra7107,extra7108,extra7109,extra7110,extra7111,extra7112,extra7113,extra7114,extra7115,extra7116,extra7117,extra7118,extra7119,extra7120,extra7121,extra7122,extra7123,extra7124,extra7125,extra7126,extra7127,extra7128,extra7129,extra7130,extra7131,extra7132,extra7133,extra7134,extra7135,extra7136,extra7137,extra7138,extra7139,extra7140,extra7141,extra7142,extra7143,extra7144,extra7145,extra7146,extra7147,extra7148,extra7149,extra7150,extra7151,extra7152,extra7153,extra7154,extra7155,extra7156,extra7157,extra7158,extra7159,extra7160,extra7161,extra7162,extra7163,extra7164,extra7165,extra7166,extra7167,extra7168,extra7169,extra7170,extra7171,extra7172,extra7173,extra7174,extra7175,extra7176,extra7177,extra7178,extra7179,extra7180'
GROUP_CHECKS[7]='extra71,extra72,extra73,extra74,extra75,extra76,extra77,extra78,extra79,extra710,extra711,extra712,extra713,extra714,extra715,extra716,extra717,extra718,extra719,extra720,extra721,extra722,extra723,extra724,extra725,extra726,extra727,extra728,extra729,extra730,extra731,extra732,extra733,extra734,extra735,extra736,extra738,extra739,extra740,extra741,extra742,extra743,extra744,extra745,extra746,extra747,extra748,extra749,extra750,extra751,extra752,extra753,extra754,extra755,extra757,extra758,extra761,extra762,extra763,extra764,extra765,extra767,extra768,extra769,extra770,extra771,extra772,extra773,extra774,extra775,extra776,extra777,extra778,extra779,extra780,extra781,extra782,extra783,extra784,extra785,extra786,extra787,extra788,extra791,extra792,extra793,extra794,extra795,extra796,extra797,extra798,extra799,extra7100,extra7101,extra7102,extra7103,extra7104,extra7105,extra7106,extra7107,extra7108,extra7109,extra7110,extra7111,extra7112,extra7113,extra7114,extra7115,extra7116,extra7117,extra7118,extra7119,extra7120,extra7121,extra7122,extra7123,extra7124,extra7125,extra7126,extra7127,extra7128,extra7129,extra7130,extra7131,extra7132,extra7133,extra7134,extra7135,extra7136,extra7137,extra7138,extra7139,extra7140,extra7141,extra7142,extra7143,extra7144,extra7145,extra7146,extra7147,extra7148,extra7149,extra7150,extra7151,extra7152,extra7153,extra7154,extra7155,extra7156,extra7157,extra7158,extra7159,extra7160,extra7161,extra7162,extra7163,extra7164,extra7165,extra7166,extra7167,extra7168,extra7169,extra7170,extra7171,extra7172,extra7173,extra7174,extra7175,extra7176,extra7177,extra7178,extra7179,extra7180,extra7181,extra7182,extra7183,extra7184,extra7185,extra7186,extra7187,extra7188,extra7189,extra7190,extra7191,extra7192,extra7193'
# Extras 759 and 760 (lambda variables and code secrets finder are not included)
# to run detect-secrets use `./prowler -g secrets`

View File

@@ -14,5 +14,4 @@
GROUP_ID[8]='forensics-ready'
GROUP_NUMBER[8]='8.0'
GROUP_TITLE[8]='Forensics Readiness - [forensics-ready] ************************'
GROUP_RUN_BY_DEFAULT[8]='N' # run it when execute_all is called
GROUP_CHECKS[8]='check21,check22,check23,check24,check25,check26,check27,check29,extra712,extra713,extra714,extra715,extra717,extra718,extra719,extra720,extra721,extra722,extra725,extra7101,extra794'

View File

@@ -14,7 +14,6 @@
GROUP_ID[9]='gdpr'
GROUP_NUMBER[9]='9.0'
GROUP_TITLE[9]='GDPR Readiness - ONLY AS REFERENCE - [gdpr] ********************'
GROUP_RUN_BY_DEFAULT[9]='N' # run it when execute_all is called
GROUP_CHECKS[9]='extra718,extra725,extra727,check12,check113,check114,extra71,extra731,extra732,extra733,check25,check39,check21,check22,check23,check24,check26,check27,check35,extra726,extra714,extra715,extra717,extra719,extra720,extra721,extra722,check43,check25,extra714,extra729,extra734,extra735,extra736,extra738,extra740,extra761,check11,check110,check111,check112,check116,check120,check122,check13,check14,check15,check16,check17,check18,check19,check28,check29,check31,check310,check311,check312,check313,check314,check32,check33,check34,check36,check37,check38,check41,check42,extra711,extra72,extra723,extra730,extra739,extra76,extra763,extra778,extra78,extra792,extra798'
# Resources:

View File

@@ -14,5 +14,4 @@
GROUP_ID[9]='my-custom-group'
GROUP_NUMBER[9]='9.0'
GROUP_TITLE[9]='My Custom Group - [my-custom-group] ****************************'
GROUP_RUN_BY_DEFAULT[9]='N' # run it when execute_all is called
GROUP_CHECKS[9]='checkNN,checkMM'

View File

@@ -3,7 +3,9 @@
"Statement": [
{
"Action": [
"ds:ListAuthorizedApplications",
"ds:Get*",
"ds:Describe*",
"ds:List*",
"ec2:GetEbsEncryptionByDefault",
"ecr:Describe*",
"elasticfilesystem:DescribeBackupPolicy",
@@ -21,6 +23,15 @@
"Resource": "*",
"Effect": "Allow",
"Sid": "AllowMoreReadForProwler"
},
{
"Effect": "Allow",
"Action": [
"apigateway:GET"
],
"Resource": [
"arn:aws:apigateway:*::/restapis/*"
]
}
]
}

View File

@@ -18,7 +18,7 @@ assume_role(){
# If profile is not defined, restore original credentials from environment variables, if they exist!
restoreInitialAWSCredentials
fi
# Both variables are mandatory to be set together
if [[ -z $ROLE_TO_ASSUME || -z $ACCOUNT_TO_ASSUME ]]; then
echo "$OPTRED ERROR!$OPTNORMAL - Both Account ID (-A) and IAM Role to assume (-R) must be set"
@@ -29,7 +29,7 @@ assume_role(){
if [[ -z $SESSION_DURATION_TO_ASSUME ]]; then
SESSION_DURATION_TO_ASSUME="3600"
elif [[ "${SESSION_DURATION_TO_ASSUME}" -gt "43200" ]] || [[ "${SESSION_DURATION_TO_ASSUME}" -lt "900" ]]; then
echo "$OPTRED ERROR!$OPTNORMAL - Role session duration must be more than 900 seconds and less than 4300 seconds"
echo "$OPTRED ERROR!$OPTNORMAL - Role session duration must be more than 900 seconds and less than 43200 seconds"
exit 1
fi
@@ -62,12 +62,12 @@ assume_role(){
EXITCODE=1
exit $EXITCODE
fi
# echo FILE WITH TEMP CREDS: $TEMP_STS_ASSUMED_FILE
# The profile shouldn't be used for CLI
PROFILE=""
PROFILE_OPT=""
PROFILE_OPT=""
# Set AWS environment variables with assumed role credentials
ASSUME_AWS_ACCESS_KEY_ID=$(jq -r '.Credentials.AccessKeyId' "${TEMP_STS_ASSUMED_FILE}")
@@ -78,10 +78,6 @@ assume_role(){
export AWS_SESSION_TOKEN=$ASSUME_AWS_SESSION_TOKEN
ASSUME_AWS_SESSION_EXPIRATION=$(jq -r '.Credentials.Expiration | sub("\\+00:00";"Z") | fromdateiso8601' "${TEMP_STS_ASSUMED_FILE}")
export AWS_SESSION_EXPIRATION=$ASSUME_AWS_SESSION_EXPIRATION
# echo TEMP AWS_ACCESS_KEY_ID: $ASSUME_AWS_ACCESS_KEY_ID
# echo TEMP AWS_SECRET_ACCESS_KEY: $ASSUME_AWS_SECRET_ACCESS_KEY
# echo TEMP AWS_SESSION_TOKEN: $ASSUME_AWS_SESSION_TOKEN
# echo EXPIRATION EPOCH TIME: $ASSUME_AWS_SESSION_EXPIRATION
cleanSTSAssumeFile
}
@@ -92,11 +88,12 @@ cleanSTSAssumeFile() {
}
backupInitialAWSCredentials() {
if [[ $(printenv AWS_ACCESS_KEY_ID) && $(printenv AWS_SECRET_ACCESS_KEY) && $(printenv AWS_SESSION_TOKEN) ]]; then
if [[ $(printenv AWS_ACCESS_KEY_ID) && $(printenv AWS_SECRET_ACCESS_KEY) && $(printenv AWS_SESSION_TOKEN) ]]
then
INITIAL_AWS_ACCESS_KEY_ID=$(printenv AWS_ACCESS_KEY_ID)
INITIAL_AWS_SECRET_ACCESS_KEY=$(printenv AWS_SECRET_ACCESS_KEY)
INITIAL_AWS_SESSION_TOKEN=$(printenv AWS_SESSION_TOKEN)
fi
fi
}
restoreInitialAWSCredentials() {

View File

@@ -16,43 +16,74 @@
# Check environment variables first; if they are not set, check and load credentials from the
# instance profile (metadata server) when running on an EC2 instance
INSTANCE_PROFILE=$(curl -s -m 1 http://169.254.169.254/latest/meta-data/iam/security-credentials/)
if echo "$INSTANCE_PROFILE" | grep -q '404 - Not Found'; then
INSTANCE_PROFILE=
fi
aws_profile_loader() {
INSTANCE_PROFILE=$(curl -s -m 1 http://169.254.169.254/latest/meta-data/iam/security-credentials/)
if echo "$INSTANCE_PROFILE" | grep -q '404 - Not Found'; then
INSTANCE_PROFILE=
fi
if [[ $PROFILE ]]; then
PROFILE_OPT="--profile $PROFILE"
elif [[ $AWS_ACCESS_KEY_ID && $AWS_SECRET_ACCESS_KEY || $AWS_SESSION_TOKEN || $AWS_PROFILE ]];then
PROFILE="$AWS_PROFILE"
PROFILE_OPT=""
elif [[ -n $AWS_CONTAINER_CREDENTIALS_RELATIVE_URI ]] && [[ -z $INSTANCE_PROFILE ]]; then
PROFILE="INSTANCE-PROFILE"
AWS_ACCESS_KEY_ID=$(curl -s 169.254.170.2$AWS_CONTAINER_CREDENTIALS_RELATIVE_URI | grep AccessKeyId | cut -d':' -f2 | sed 's/[^0-9A-Z]*//g')
AWS_SECRET_ACCESS_KEY_ID=$(curl -s 169.254.170.2$AWS_CONTAINER_CREDENTIALS_RELATIVE_URI | grep SecretAccessKey | cut -d':' -f2 | sed 's/[^0-9A-Za-z/+=]*//g')
AWS_SESSION_TOKEN=$(curl -s 169.254.170.2$AWS_CONTAINER_CREDENTIALS_RELATIVE_URI grep Token| cut -d':' -f2 | sed 's/[^0-9A-Za-z/+=]*//g')
elif [[ $AWS_WEB_IDENTITY_TOKEN_FILE ]]; then
PROFILE=""
PROFILE_OPT=""
elif [[ $INSTANCE_PROFILE ]]; then
PROFILE="INSTANCE-PROFILE"
AWS_ACCESS_KEY_ID=$(curl -s http://169.254.169.254/latest/meta-data/iam/security-credentials/${INSTANCE_PROFILE} | grep AccessKeyId | cut -d':' -f2 | sed 's/[^0-9A-Z]*//g')
AWS_SECRET_ACCESS_KEY_ID=$(curl -s http://169.254.169.254/latest/meta-data/iam/security-credentials/${INSTANCE_PROFILE} | grep SecretAccessKey | cut -d':' -f2 | sed 's/[^0-9A-Za-z/+=]*//g')
AWS_SESSION_TOKEN=$(curl -s http://169.254.169.254/latest/meta-data/iam/security-credentials/${INSTANCE_PROFILE} grep Token| cut -d':' -f2 | sed 's/[^0-9A-Za-z/+=]*//g')
elif [[ $AWS_EXECUTION_ENV == "CloudShell" ]]; then
PROFILE_OPT=""
else
PROFILE="default"
if [[ $PROFILE ]]; then
PROFILE_OPT="--profile $PROFILE"
fi
# Backing up $PROFILE_OPT needed to renew assume_role
PROFILE_OPT_BAK=$PROFILE_OPT
# Set default region by aws config, fall back to us-east-1
REGION_CONFIG=$(aws configure get region)
if [[ $REGION_OPT ]]; then
REGION="$REGION_OPT"
elif [[ $REGION_CONFIG ]]; then
REGION="$REGION_CONFIG"
else
REGION="us-east-1"
fi
elif [[ $AWS_ACCESS_KEY_ID && $AWS_SECRET_ACCESS_KEY || $AWS_SESSION_TOKEN || $AWS_PROFILE ]];then
PROFILE="$AWS_PROFILE"
PROFILE_OPT=""
# AWS_CONTAINER_CREDENTIALS_RELATIVE_URI is set by default https://docs.aws.amazon.com/sdkref/latest/guide/feature-container-credentials.html
elif [[ -n $AWS_CONTAINER_CREDENTIALS_RELATIVE_URI ]] && [[ -z $INSTANCE_PROFILE ]]
then
PROFILE="INSTANCE-PROFILE"
set_metadata_credentials "169.254.170.2${AWS_CONTAINER_CREDENTIALS_RELATIVE_URI}"
elif [[ $AWS_WEB_IDENTITY_TOKEN_FILE ]]; then
PROFILE=""
PROFILE_OPT=""
elif [[ $INSTANCE_PROFILE ]]
then
PROFILE="INSTANCE-PROFILE"
set_metadata_credentials "http://169.254.169.254/latest/meta-data/iam/security-credentials/${INSTANCE_PROFILE}"
elif [[ $AWS_EXECUTION_ENV == "CloudShell" ]]
then
PROFILE_OPT=""
else
PROFILE="default"
PROFILE_OPT="--profile $PROFILE"
fi
# Backing up $PROFILE_OPT needed to renew assume_role
PROFILE_OPT_BAK=$PROFILE_OPT
# Set default region by aws config, fall back to us-east-1
REGION_CONFIG=$(aws configure get region)
if [[ $REGION_OPT ]]; then
REGION="$REGION_OPT"
elif [[ $REGION_CONFIG ]]; then
REGION="$REGION_CONFIG"
else
REGION="us-east-1"
fi
}
# Function to get all regions
get_regions() {
# Get list of regions based on include/whoami
if ! REGIONS=$($AWSCLI ec2 describe-regions \
--query 'Regions[].RegionName' \
--output text ${PROFILE_OPT} \
--region "${REGION_FOR_STS}" \
--region-names ${FILTERREGION//[,]/ } 2>&1)
then
echo "$OPTRED Access Denied trying to describe regions! Review permissions as described here: https://github.com/prowler-cloud/prowler/#requirements-and-installation $OPTNORMAL"
EXITCODE=1
exit $EXITCODE
fi
}
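# Illustration (not executed): the pattern substitution above turns the
# comma-separated region filter into the space-separated list that
# --region-names expects, e.g.:
#   FILTERREGION="eu-west-1,us-east-1"
#   echo ${FILTERREGION//[,]/ }   # -> eu-west-1 us-east-1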
# Function to set metadata credentials
set_metadata_credentials() {
INSTANCE_METADATA_URI="${1}"
INSTANCE_METADATA=$(curl -s "${INSTANCE_METADATA_URI}")
AWS_ACCESS_KEY_ID=$(jq -r '.AccessKeyId' <<< "${INSTANCE_METADATA}")
export AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY=$(jq -r '.SecretAccessKey' <<< "${INSTANCE_METADATA}")
export AWS_SECRET_ACCESS_KEY
AWS_SESSION_TOKEN=$(jq -r '.Token' <<< "${INSTANCE_METADATA}")
export AWS_SESSION_TOKEN
ASSUME_AWS_SESSION_EXPIRATION=$(jq -r '.Expiration | sub("\\+00:00";"Z") | fromdateiso8601' <<< "${INSTANCE_METADATA}")
export AWS_SESSION_EXPIRATION=$ASSUME_AWS_SESSION_EXPIRATION
}
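The credentials document fetched from either metadata endpoint is JSON along the following lines (values truncated, shape assumed from the standard instance metadata response), which is why each field can be pulled out with a single jq -r filter:

{
  "Code": "Success",
  "AccessKeyId": "ASIAEXAMPLE",
  "SecretAccessKey": "wJalrEXAMPLE",
  "Token": "IQoJb3JpZ2luEXAMPLE",
  "Expiration": "2022-07-21T17:44:26Z"
}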

View File

@@ -12,12 +12,25 @@
# specific language governing permissions and limitations under the License.
# AWS-CLI detector variable
if [ ! -z $(which aws) ]; then
AWSCLI=$(which aws)
elif [ ! -z $(type -p aws) ]; then
AWSCLI=$(type -p aws)
else
echo -e "\n$RED ERROR!$NORMAL AWS-CLI (aws command) not found. Make sure it is installed correctly and in your \$PATH\n"
EXITCODE=1
exit $EXITCODE
fi
aws_cli_detector() {
if [ ! -z $(which aws) ]; then
AWSCLI=$(which aws)
elif [ ! -z $(type -p aws) ]; then
AWSCLI=$(type -p aws)
else
echo -e "\n$RED ERROR!$NORMAL AWS-CLI (aws command) not found. Make sure it is installed correctly and in your \$PATH\n"
EXITCODE=1
exit $EXITCODE
fi
}
set_aws_default_output() {
# Ensures command output will always be set to JSON.
# If a default value was already set, ORIGINAL_OUTPUT stores it so it can be reset at cleanup
if [[ -z "${AWS_DEFAULT_OUTPUT}" ]]; then
export AWS_DEFAULT_OUTPUT="json"
else
ORIGINAL_OUTPUT=$AWS_DEFAULT_OUTPUT
export AWS_DEFAULT_OUTPUT="json"
fi
}
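The comment above implies a matching cleanup step that restores the user's original setting; a minimal sketch of such a helper (hypothetical, not part of this changeset):

restore_aws_default_output() {
  # Put back whatever output format the user had before Prowler ran
  if [[ -n "${ORIGINAL_OUTPUT}" ]]; then
    export AWS_DEFAULT_OUTPUT="${ORIGINAL_OUTPUT}"
  else
    unset AWS_DEFAULT_OUTPUT
  fi
}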

View File

@@ -19,7 +19,7 @@ prowlerBanner() {
echo -e " | |_) | | | (_) \ V V /| | __/ |"
echo -e " | .__/|_| \___/ \_/\_/ |_|\___|_|v$PROWLER_VERSION"
echo -e " |_|$NORMAL$BLUE the handy cloud security tool$NORMAL\n"
echo -e "$YELLOW Date: $(date)"
echo -e "$YELLOW Date: $(date)$NORMAL"
printColorsCode
fi
}

View File

@@ -11,62 +11,40 @@
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
IFS=',' read -ra MODES <<< "${MODE}"
for MODE in "${MODES[@]}"; do
if [[ "$MODE" != "mono" && "$MODE" != "text" && "$MODE" != "csv" && "$MODE" != "json" && "$MODE" != "json-asff" && "$MODE" != "junit-xml" && "$MODE" != "html" ]]; then
echo -e "${OPTRED}ERROR!$OPTNORMAL Invalid output mode. Choose text, mono, csv, json, json-asff, junit-xml or html. ./prowler -h for help"
EXITCODE=1
exit $EXITCODE
# Colors
# NOTE: Your editor may NOT show the 0x1b / escape character left of the '['
set_colors() {
if [[ "${MODE}" =~ "mono" ]]
then
NORMAL=""
WARNING=""
NOTICE=""
OK=""
BAD=""
CYAN=""
BLUE=""
BROWN=""
MAGENTA=""
RED=""
YELLOW=""
else
NORMAL=""
WARNING="" # Warning (brown)
NOTICE="" # Notice (yellow)
OK="" # Ok (green)
BAD="" # Bad (red)
CYAN=""
BLUE=""
BROWN=""
MAGENTA=""
RED=""
YELLOW=""
fi
if [[ "$MODE" == "mono" || "$MODE" == "csv" || "$MODE" == "json" || "$MODE" == "json-asff" ]]; then
MONOCHROME=1
fi
done
if [[ $MONOCHROME -eq 1 ]]; then
# Colors
NORMAL=''
WARNING='' # Bad (red)
SECTION='' # Section (yellow)
NOTICE='' # Notice (yellow)
OK='' # Ok (green)
BAD='' # Bad (red)
CYAN=''
BLUE=''
BROWN=''
DARKGRAY=''
GRAY=''
GREEN=''
MAGENTA=''
PURPLE=''
RED=''
YELLOW=''
WHITE=''
else
# Colors
# NOTE: Your editor may NOT show the 0x1b / escape character left of the '['
NORMAL=""
WARNING="" # Warning (brown)
SECTION="" # Section (yellow)
NOTICE="" # Notice (yellow)
OK="" # Ok (green)
BAD="" # Bad (red)
CYAN=""
BLUE=""
BROWN=""
DARKGRAY=""
GRAY=""
GREEN=""
MAGENTA=""
PURPLE=""
RED=""
YELLOW=""
WHITE=""
fi
}
printColorsCode(){
if [[ $MONOCHROME -eq 0 ]]; then
if [[ ! "$MODE" =~ "mono" ]]
then
echo -e "\n$NORMAL Color code for results: "
echo -e " - $NOTICE INFO (Information)$NORMAL"
echo -e " - $OK PASS (Recommended value)$NORMAL"

View File

@@ -37,14 +37,3 @@ saveReport(){
textInfo "IAM Credential Report saved in $TEMP_REPORT_FILE"
fi
}
# Delete temporary report file
cleanTemp(){
if [[ $KEEPCREDREPORT -ne 1 ]]; then
rm -fr $TEMP_REPORT_FILE
fi
cleanSTSAssumeFile
}
# Delete the temporary report file if we get interrupted/terminated
trap cleanTemp EXIT

View File

@@ -14,39 +14,39 @@
custom_checks(){
# check if the path is an S3 URI
if grep -q -E "^s3://([^/]+)/?(.*?([^/]+)/?)?$" <<< "$EXTERNAL_CHECKS_PATH"; then
if grep -q "check*" <<< "$("${AWSCLI}" s3 ls "${EXTERNAL_CHECKS_PATH}" $PROFILE_OPT)"; then
if grep -q "check*" <<< "$("${AWSCLI}" s3 ls "${EXTERNAL_CHECKS_PATH}" "${PROFILE_OPT}")"; then
# download s3 object
echo -e "$NOTICE Downloading custom checks from S3 URI $EXTERNAL_CHECKS_PATH...$NORMAL"
S3_CHECKS_TEMP_FOLDER="$PROWLER_DIR/s3-custom-checks"
echo -e "${NOTICE} Downloading custom checks from S3 URI ${EXTERNAL_CHECKS_PATH}...${NORMAL}"
S3_CHECKS_TEMP_FOLDER="${PROWLER_DIR}/s3-custom-checks"
mkdir "${S3_CHECKS_TEMP_FOLDER}"
$AWSCLI s3 sync "$EXTERNAL_CHECKS_PATH" "${S3_CHECKS_TEMP_FOLDER}" $PROFILE_OPT > /dev/null
${AWSCLI} s3 sync "${EXTERNAL_CHECKS_PATH}" "${S3_CHECKS_TEMP_FOLDER}" "${PROFILE_OPT}" > /dev/null
# verify if there are checks
for checks in "${S3_CHECKS_TEMP_FOLDER}"/check*; do
. "$checks"
echo -e "$OK Check $(basename "$checks") was included!$NORMAL"
for CHECKS in "${S3_CHECKS_TEMP_FOLDER}"/check*; do
. "${CHECKS}"
echo -e "${OK} Check $(basename "${CHECKS}") was included!${NORMAL}"
done
echo -e "$OK Success! Custom checks were downloaded and included, starting Prowler...$NORMAL"
echo -e "${OK} Success! Custom checks were downloaded and included, starting Prowler...${NORMAL}"
# remove temporary dir
rm -rf "${S3_CHECKS_TEMP_FOLDER}"
else
echo "$BAD FAIL! Access Denied trying to download custom checks or $EXTERNAL_CHECKS_PATH does not contain any checks, please make sure it is correct and/or you have permissions to get the S3 objects.$NORMAL"
echo "${BAD} FAIL! Access Denied trying to download custom checks or ${EXTERNAL_CHECKS_PATH} does not contain any checks, please make sure it is correct and/or you have permissions to get the S3 objects.${NORMAL}"
EXITCODE=1
# remove temporary dir
rm -rf "${S3_CHECKS_TEMP_FOLDER}"
exit $EXITCODE
exit "${EXITCODE}"
fi
else
# verify if input directory exists with checks
if ls "${EXTERNAL_CHECKS_PATH}"/check* > /dev/null 2>&1; then
for checks in "${EXTERNAL_CHECKS_PATH}"/check*; do
. "$checks"
echo -e "$OK Check $(basename "$checks") was included!$NORMAL"
for CHECKS in "${EXTERNAL_CHECKS_PATH}"/check*; do
. "${CHECKS}"
echo -e "${OK} Check $(basename "${CHECKS}") was included!${NORMAL}"
done
echo -e "$OK Success! Custom checks were included, starting Prowler...$NORMAL"
echo -e "${OK} Success! Custom checks were included, starting Prowler...${NORMAL}"
else
echo "$BAD FAIL! $EXTERNAL_CHECKS_PATH does not exist or not contain checks, please input a valid custom checks path.$NORMAL"
echo "${BAD} FAIL! ${EXTERNAL_CHECKS_PATH} does not exist or not contain checks, please input a valid custom checks path.${NORMAL}"
EXITCODE=1
exit $EXITCODE
exit "${EXITCODE}"
fi
fi
}

39
include/db_connector Executable file
View File

@@ -0,0 +1,39 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2018) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
# DB connectors
# Supported connectors list is stored in SUPPORTED_DB_PROVIDERS
# Once the connector logic is added, it must also be added to the list
# Example: adding oracle to the list
# SUPPORTED_DB_PROVIDERS="postgresql|oracle"
# oracle_connector() { ... }
SUPPORTED_DB_PROVIDERS="postgresql"
export SUPPORTED_DB_PROVIDERS
# Connectors for different databases
#POSTGRESQL connector
postgresql_connector () {
CSV_REGISTRY="${1}"
psql -q -U "${POSTGRES_USER}" -h "${POSTGRES_HOST}" -d "${POSTGRES_DB}" -c "copy ${POSTGRES_TABLE} from stdin with null as E'\'\'' delimiter ','" <<< "${CSV_REGISTRY}"
}
db_exit_abnormally() {
DB_PROVIDER="${1}"
MESSAGE=${2}
echo "${OPTRED}ERROR!$OPTNORMAL Database connector for '${DB_PROVIDER}' failed. ${MESSAGE}"
exit 2
}
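Following the pattern documented in the header, a new connector only needs a <provider>_connector function plus an entry in SUPPORTED_DB_PROVIDERS; a hypothetical sqlite connector as a sketch (SQLITE_DB and SQLITE_TABLE are assumed variables, not part of this changeset):

# SUPPORTED_DB_PROVIDERS="postgresql|sqlite"
sqlite_connector () {
  CSV_REGISTRY="${1}"
  # Import the CSV registry from stdin into the configured table
  sqlite3 "${SQLITE_DB}" ".mode csv" ".import /dev/stdin ${SQLITE_TABLE}" <<< "${CSV_REGISTRY}"
}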

56
include/default_variables Normal file
View File

@@ -0,0 +1,56 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2018) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
# Output Colors
OPTRED=""
export OPTRED
OPTNORMAL=""
export OPTNORMAL
# AWS Region
REGION=""
export REGION
FILTERREGION=""
export FILTERREGION
# Max AWS CLI items
MAXITEMS=100
export MAXITEMS
# Default output mode
MODE="text"
export MODE
# Security Hub
SEND_TO_SECURITY_HUB=0
export SEND_TO_SECURITY_HUB
# Date & Time
TZ=UTC
export TZ
PROWLER_START_TIME=$( date -u +"%Y-%m-%dT%H:%M:%S%z" )
export PROWLER_START_TIME
OUTPUT_DATE=$(date -u +"%Y%m%d%H%M%S")
export OUTPUT_DATE
# Default Prowler Options
QUIET=0
export QUIET
SEP=','
export SEP
KEEPCREDREPORT=0
export KEEPCREDREPORT
EXITCODE=0
export EXITCODE
FAILED_CHECK_FAILED_SCAN=1
export FAILED_CHECK_FAILED_SCAN

139
include/execute_check Normal file
View File

@@ -0,0 +1,139 @@
#!/usr/bin/env bash
# Copyright 2018 Toni de la Fuente
# Prowler is a tool that provides automated auditing and hardening guidance for an
# AWS account. It is based on AWS-CLI commands. It follows some guidelines
# present in the CIS Amazon Web Services Foundations Benchmark at:
# https://d0.awsstatic.com/whitepapers/compliance/AWS_CIS_Foundations_Benchmark.pdf
# Contact the author at https://blyx.com/contact
# and open issues or ask questions at https://github.com/prowler-cloud/prowler
# Code is licensed as Apache License 2.0 as specified in
# each file. You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Function to execute the check
execute_check() {
if [[ -n "${ACCOUNT_TO_ASSUME}" || -n "${ROLE_TO_ASSUME}" || "${PROFILE}" == "INSTANCE-PROFILE" ]]; then
# echo ******* I am here again to check on my role *******
# The following logic checks the time remaining in the session and renews
# the credentials when less than 600 seconds (10 minutes) remain.
CURRENT_TIMESTAMP=$(date -u "+%s")
SESSION_TIME_REMAINING=$(expr "${AWS_SESSION_EXPIRATION}" - "${CURRENT_TIMESTAMP}")
MINIMUM_REMAINING_TIME_ALLOWED=600
if (( "${MINIMUM_REMAINING_TIME_ALLOWED}" > "${SESSION_TIME_REMAINING}" )); then
# echo LESS THAN 10 MIN LEFT: RE-ASSUMING...
unset AWS_ACCESS_KEY_ID
unset AWS_SECRET_ACCESS_KEY
unset AWS_SESSION_TOKEN
if [[ "${PROFILE}" == "INSTANCE-PROFILE" ]]; then
if [[ -n "${AWS_CONTAINER_CREDENTIALS_RELATIVE_URI}" ]]; then
INSTANCE_METADATA_URI="169.254.170.2${AWS_CONTAINER_CREDENTIALS_RELATIVE_URI}"
else
INSTANCE_METADATA_URI="http://169.254.169.254/latest/meta-data/iam/security-credentials/${INSTANCE_PROFILE}"
fi
set_metadata_credentials "${INSTANCE_METADATA_URI}"
else
assume_role
fi
fi
fi
CHECK_ID="$1"
# See if this is an alternate name for a check
# for example, we might have been passed 1.01 which is another name for 1.1
local alternate_name_var=CHECK_ALTERNATE_$1
local alternate_name=${!alternate_name_var}
# See if this check defines an ASFF Type, if so, use this, falling back to a sane default
# For a list of Types Taxonomy, see: https://docs.aws.amazon.com/securityhub/latest/userguide/securityhub-findings-format-type-taxonomy.html
local asff_type_var=CHECK_ASFF_TYPE_$1
CHECK_ASFF_TYPE="${!asff_type_var:-Software and Configuration Checks}"
local asff_compliance_type_var=CHECK_ASFF_COMPLIANCE_TYPE_$1
CHECK_ASFF_COMPLIANCE_TYPE="${!asff_compliance_type_var:-Software and Configuration Checks}"
# See if this check defines an ASFF Resource Type, if so, use this, falling back to a sane default
# For a list of Resource Types, see: https://docs.aws.amazon.com/securityhub/latest/userguide/securityhub-findings-format.html#asff-resources
local asff_resource_type_var=CHECK_ASFF_RESOURCE_TYPE_$1
CHECK_ASFF_RESOURCE_TYPE="${!asff_resource_type_var:-AwsAccount}"
local severity_var=CHECK_SEVERITY_$1
CHECK_SEVERITY="${!severity_var}"
local servicename_var=CHECK_SERVICENAME_$1
CHECK_SERVICENAME="${!servicename_var}"
local risk_var=CHECK_RISK_$1
CHECK_RISK="${!risk_var}"
local remediation_var=CHECK_REMEDIATION_$1
CHECK_REMEDIATION="${!remediation_var}"
local doc_var=CHECK_DOC_$1
CHECK_DOC="${!doc_var}"
local caf_epic_var=CHECK_CAF_EPIC_$1
CHECK_CAF_EPIC="${!caf_epic_var}"
SECURITYHUB_NEW_FINDINGS_IDS=()
# Generate the credential report only when group1-related checks are going to run,
# so that those checks can safely assume it is available
# set the custom ignores list for this check
ignores="$(awk "/${1}/{print}" <(echo "${ALLOWLIST}"))"
if [ ${alternate_name} ];then
if [[ ${alternate_name} == check1* || ${alternate_name} == extra71 || ${alternate_name} == extra774 || ${alternate_name} == extra7123 ]];then
if [ ! -s $TEMP_REPORT_FILE ];then
genCredReport
saveReport
fi
fi
show_check_title ${alternate_name}
if [[ " ${MODE} " =~ "junit-xml" ]]; then
prepare_junit_check_output "$1"
fi
# Execute the check
IGNORES="${ignores}" CHECK_NAME="$1" ${alternate_name}
if [[ " ${MODE} " =~ "junit-xml" ]]; then
finalise_junit_check_output "$1"
fi
if [[ "$SEND_TO_SECURITY_HUB" -eq 1 ]]; then
resolveSecurityHubPreviousFails "$1"
fi
else
# Check to see if this is a real check
local check_id_var=CHECK_ID_$1
local check_id=${!check_id_var}
if [ ${check_id} ]; then
if [[ ${check_id} == 1* || ${check_id} == 7.1 || ${check_id} == 7.74 || ${check_id} == 7.123 ]];then
if [ ! -s $TEMP_REPORT_FILE ];then
genCredReport
saveReport
fi
fi
show_check_title "$1"
if [[ " ${MODE} " =~ "junit-xml" ]]; then
prepare_junit_check_output "$1"
fi
# Execute the check
IGNORES="${ignores}" CHECK_NAME="$1" $1
if [[ " ${MODE} " =~ "junit-xml" ]]; then
finalise_junit_check_output "$1"
fi
if [[ "$SEND_TO_SECURITY_HUB" -eq 1 ]]; then
resolveSecurityHubPreviousFails "$1"
fi
else
textFail "Check ${CHECK_ID} does not exist. Use a valid check name (i.e. check41 or extra71)";
exit $EXITCODE
fi
fi
}
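The per-check metadata lookups above all rely on bash indirect expansion (${!var}); a compact illustration of the pattern, using a made-up check name:

CHECK_SEVERITY_check99="High"    # as a (hypothetical) check file would register it
severity_var=CHECK_SEVERITY_check99
echo "${!severity_var}"          # prints: High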

View File

@@ -18,7 +18,7 @@ addHtmlHeader() {
PROFILE="ENV"
fi
if [[ -z $HTML_REPORT_INIT ]]; then
cat <<EOF
cat <<EOF >> "${OUTPUT_FILE_NAME}"."${EXTENSION_HTML}"
<!DOCTYPE html>
<html lang="en">
<head>

View File

@@ -20,14 +20,6 @@ JUNIT_FAILURES_COUNT="0"
JUNIT_SKIPPED_COUNT="0"
JUNIT_ERRORS_COUNT="0"
is_junit_output_enabled() {
if [[ " ${MODES[@]} " =~ " junit-xml " ]]; then
true
else
false
fi
}
xml_escape() {
sed 's/&/\&amp;/g; s/</\&lt;/g; s/>/\&gt;/g; s/\"/\&quot;/g; s/'"'"'/\&#39;/g' <<< "$1"
}

137
include/loader Normal file
View File

@@ -0,0 +1,137 @@
#!/usr/bin/env bash
# Copyright 2018 Toni de la Fuente
# Prowler is a tool that provides automated auditing and hardening guidance for an
# AWS account. It is based on AWS-CLI commands. It follows some guidelines
# present in the CIS Amazon Web Services Foundations Benchmark at:
# https://d0.awsstatic.com/whitepapers/compliance/AWS_CIS_Foundations_Benchmark.pdf
# Contact the author at https://blyx.com/contact
# and open issues or ask questions at https://github.com/prowler-cloud/prowler
# Code is licensed as Apache License 2.0 as specified in
# each file. You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Load all of the groups of checks inside groups folder named as "groupNumber*"
load_groups() {
for group in "${PROWLER_DIR}"/groups/group[0-9]*; do
# shellcheck source=/dev/null
. "${group}"
done
}
# Load a single check
load_check(){
CHECK=${1}
# shellcheck source=/dev/null
. "$PROWLER_DIR/checks/${CHECK}" 2>/dev/null
}
get_checks() {
# Parses the check file into CHECK_IDs.
if [[ -n "$CHECK_FILE" ]]
then
if [[ -f $CHECK_FILE ]]
then
# Parses the file, converting it to a comma-separated list. Ignores all # comments and removes extra blank spaces
# REVIEW THIS OPTION -C
CHECKS_FROM_FILE="$(awk '!/^[[:space:]]*#/{print }' <(cat ${CHECK_FILE} | sed 's/[[:space:]]*#.*$//g;/^$/d' | sed 'H;1h;$!d;x;y/\n/,/' | tr -d ' '))"
IFS=',' read -ra TOTAL_CHECKS <<< "${CHECKS_FROM_FILE}"
else
# If the file doesn't exist, exits Prowler
echo "$CHECK_FILE does not exist"
EXITCODE=1
exit $EXITCODE
fi
# If '-g <group_id>' has been specified, only select the checks within the specified group(s)
# reading groups from GROUP_ID_READ
elif [[ "${GROUP_ID_READ}" ]]
then
for GROUP in "${GROUP_ID[@]}"
do
if [[ "${GROUP}" == "${GROUP_ID_READ}" ]]
then
IS_GROUP=1
fi
done
if [[ IS_GROUP -eq 0 ]]
then
textFail "Group ${GROUP_ID_READ} does not exist. Valid check groups are: ${GROUP_ID[*]}"
exit $EXITCODE
fi
# Iterate over every group, splitting the comma-separated list
for GROUP_IDENTIFIER in ${GROUP_ID_READ//,/ }
do
# Iterate over every GroupID to find the checks belonging to it
for I in "${!GROUP_ID[@]}"
do
if [[ "${EXTRAS}" -eq 1 && "${GROUP_ID[I]}" == "extras" ]]
then
continue
else
if [[ "${GROUP_ID[I]}" == "${GROUP_IDENTIFIER}" ]]
then
# shellcheck disable=SC2068
for CHECK_IDENTIFIER in ${GROUP_CHECKS[I]//,/ }
do
# Include every check that is not already present and not excluded
if [[ ! "${CHECK_LIST_BY_GROUP[*]}" =~ ${CHECK_IDENTIFIER} ]] && ! grep -E -w -q "${EXCLUDE_CHECK_ID//,/|}" <<< "${CHECK_IDENTIFIER}"
then
CHECK_LIST_BY_GROUP+=("${CHECK_IDENTIFIER}")
fi
done
fi
fi
done
done
TOTAL_CHECKS=("${CHECK_LIST_BY_GROUP[@]}")
# If -c option is set with checks
elif [ -n "${CHECK_ID}" ]
then
IFS=',' read -ra TOTAL_CHECKS <<< "${CHECK_ID}"
# If no option input to include/exclude checks
else
# If -e is passed, we don't want the extra checks
if [[ "${EXTRAS}" -eq 1 ]]
then
if [[ -n "$EXCLUDE_CHECK_ID" ]]
then
EXCLUDE_CHECK_ID="${EXCLUDE_CHECK_ID},${GROUP_CHECKS[7]}"
else
EXCLUDE_CHECK_ID="${GROUP_CHECKS[7]}"
fi
fi
# Always ignore the following checks if not supplied with -C
IGNORED_CHECKS="sample,extra9999"
EXCLUDE_CHECK_ID="${EXCLUDE_CHECK_ID/%/,}${IGNORED_CHECKS}"
for CHECK in "${PROWLER_DIR}"/checks/check*; do
# Relative path required to load every check
CHECK_DIR_NAME=$(basename "${CHECK}")
# Name of the check
CHECK_NAME=${CHECK_DIR_NAME//check_/}
# Add the check unless it matches the exclusion list
if ! grep -E -w -q "${EXCLUDE_CHECK_ID//,/|}" <<< "${CHECK_NAME}"
then
TOTAL_CHECKS+=("${CHECK_NAME}")
fi
done
fi
# Iterate over the final list of checks, once all input options have been parsed, to load the selected checks
for LOAD_PATH_CHECK in "${TOTAL_CHECKS[@]}"
do
# If the check is an extra, the file name needs the check_ prefix before the check name
if [[ "${LOAD_PATH_CHECK}" =~ 'extra' ]]
then
LOAD_PATH_CHECK=${LOAD_PATH_CHECK/#/check_}
fi
load_check "${LOAD_PATH_CHECK}"
done
}
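As an illustration of the -C file parsing above, a small input file like the following (check names are placeholders) is accepted:

# checks for the nightly scan
check11
check12   # trailing comments are stripped
extra71

and is reduced to the comma-separated list check11,check12,extra71 before being read into TOTAL_CHECKS.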

View File

@@ -20,9 +20,6 @@
get_orgs_account_details(){
echo " Prowler is getting details from the AWS Organizations Management Account: ${MANAGEMENT_ACCOUNT_ID}..."
# Assume role to recover AWS Organizations metadata
assume_role
# The following code requires organizations:ListTagsForResource
ACCOUNTS_DETAILS=$($AWSCLI $PROFILE_OPT --region "${REGION}" organizations list-accounts --output json 2>&1)
if ! grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "${ACCOUNTS_DETAILS}"

View File

@@ -33,7 +33,7 @@ bsd_how_older_from_today() {
# output date format %Y-%m-%d
gnu_timestamp_to_date() {
# if date comes from cli v2 in format like 2020-04-29T10:13:09.191000-04:00, which is ISO8601
# remove fractions of a second
TIMESTAMP_TO_CONVERT=$(cut -f1 -d"." <<< "${1}")
if ! "${DATE_CMD}" -d "${TIMESTAMP_TO_CONVERT}" +%s > /dev/null 2>&1;
@@ -48,7 +48,7 @@ gnu_timestamp_to_date() {
}
bsd_timestamp_to_date() {
# if date comes from cli v2 in format like 2020-04-29T10:13:09.191000-04:00, which is ISO8601
# remove fractions of a second
TIMESTAMP_TO_CONVERT=$(cut -f1 -d"." <<< "${1}")
OUTPUT_DATE=$("${DATE_CMD}" -jf %Y-%m-%d "${TIMESTAMP_TO_CONVERT}" +%F 2>/dev/null)
@@ -111,12 +111,12 @@ gnu_convert_date_to_timestamp() {
# if [ "$OSTYPE" == "linux-musl" ]; then
# date -D "%Y-%m-%dT%H:%M:%SZ" -d "$1" +%s
# else
date -u -d "$1" +%s
"$DATE_CMD" -u -d "$1" +%s
# fi
}
bsd_convert_date_to_timestamp() {
echo $(date -u -j -f %Y-%m-%dT%H:%M:%S "$1" +%s)
"$DATE_CMD" -u -j -f %Y-%m-%dT%H:%M:%S "$1" +%s
#date -j -f "%Y-%m-%dT%H:%M:%S" "$1" "+%s"
}
@@ -145,65 +145,21 @@ bsd_replace_sed(){
sed -i '' $1 $2
}
# Functions to manage dates depending on OS
if [ "$OSTYPE" == "linux-gnu" ] || [ "$OSTYPE" == "linux-musl" ]; then
TEMP_REPORT_FILE=$(mktemp -t -p /tmp prowler.cred_report-XXXXXX)
# function to compare in days, usage how_older_from_today date
# date format %Y-%m-%d
how_older_from_today() {
gnu_how_older_from_today "$1"
}
timestamp_to_date() {
gnu_timestamp_to_date "$1"
}
decode_report() {
gnu_decode_report
}
how_many_days_from_today() {
gnu_how_many_days_from_today "$1"
}
get_date_previous_than_months() {
gnu_get_date_previous_than_months "$1"
}
get_time_in_milliseconds() {
gnu_get_time_in_milliseconds
}
get_iso8601_timestamp() {
gnu_get_iso8601_timestamp
}
test_tcp_connectivity() {
gnu_test_tcp_connectivity "$1" "$2" "$3"
}
convert_date_to_timestamp() {
gnu_convert_date_to_timestamp "$1"
}
replace_sed() {
gnu_replace_sed $1 $2
}
elif [[ "$OSTYPE" == "darwin"* ]] || [[ "$OSTYPE" == "freebsd"* ]]; then
# BSD/OSX commands compatibility
TEMP_REPORT_FILE=$(mktemp -t prowler.cred_report-XXXXXX)
# It is possible that the user has installed GNU coreutils on OS X. By default, this will make GNU commands
# available with a 'g' prefix, e.g. 'gdate'. Test if this is present, and use it if so, as it supports more features.
# The user also may have replaced the default Mac OS X BSD tools with the GNU coreutils equivalents.
# Only GNU date/base64 allows --version as a valid argument, so use the validity of this argument
# as a means to detect that coreutils is installed and is overriding the default tools
GDATE=$(which gdate)
if [ -n "${GDATE}" ]; then
DATE_CMD="gdate"
fi
GBASE64=$(which gbase64)
if [ -n "${GBASE64}" ]; then
BASE64_CMD="gbase64"
fi
if "$DATE_CMD" --version >/dev/null 2>&1 ; then
os_detector() {
# Functions to manage dates depending on OS
if [ "$OSTYPE" == "linux-gnu" ] || [ "$OSTYPE" == "linux-musl" ]; then
TEMP_REPORT_FILE=$(mktemp -t -p /tmp prowler.cred_report-XXXXXX)
# function to compare in days, usage how_older_from_today date
# date format %Y-%m-%d
how_older_from_today() {
gnu_how_older_from_today "$1"
}
timestamp_to_date() {
gnu_timestamp_to_date "$1"
}
decode_report() {
gnu_decode_report
}
how_many_days_from_today() {
gnu_how_many_days_from_today "$1"
}
@@ -216,83 +172,129 @@ elif [[ "$OSTYPE" == "darwin"* ]] || [[ "$OSTYPE" == "freebsd"* ]]; then
get_iso8601_timestamp() {
gnu_get_iso8601_timestamp
}
test_tcp_connectivity() {
gnu_test_tcp_connectivity "$1" "$2" "$3"
}
convert_date_to_timestamp() {
gnu_convert_date_to_timestamp "$1"
}
else
replace_sed() {
gnu_replace_sed $1 $2
}
elif [[ "$OSTYPE" == "darwin"* ]] || [[ "$OSTYPE" == "freebsd"* ]]; then
# BSD/OSX commands compatibility
TEMP_REPORT_FILE=$(mktemp -t prowler.cred_report-XXXXXX)
# It is possible that the user has installed GNU coreutils on OS X. By default, this will make GNU commands
# available with a 'g' prefix, e.g. 'gdate'. Test if this is present, and use it if so, as it supports more features.
# The user also may have replaced the default Mac OS X BSD tools with the GNU coreutils equivalents.
# Only GNU date/base64 allows --version as a valid argument, so use the validity of this argument
# as a means to detect that coreutils is installed and is overriding the default tools
GDATE=$(which gdate)
if [ -n "${GDATE}" ]; then
DATE_CMD="gdate"
fi
GBASE64=$(which gbase64)
if [ -n "${GBASE64}" ]; then
BASE64_CMD="gbase64"
fi
if "$DATE_CMD" --version >/dev/null 2>&1 ; then
how_older_from_today() {
gnu_how_older_from_today "$1"
}
timestamp_to_date() {
gnu_timestamp_to_date "$1"
}
how_many_days_from_today() {
gnu_how_many_days_from_today "$1"
}
get_date_previous_than_months() {
gnu_get_date_previous_than_months "$1"
}
get_time_in_milliseconds() {
gnu_get_time_in_milliseconds
}
get_iso8601_timestamp() {
gnu_get_iso8601_timestamp
}
convert_date_to_timestamp() {
gnu_convert_date_to_timestamp "$1"
}
else
how_older_from_today() {
bsd_how_older_from_today "$1"
}
timestamp_to_date() {
bsd_timestamp_to_date "$1"
}
how_many_days_from_today() {
bsd_how_many_days_from_today "$1"
}
get_date_previous_than_months() {
bsd_get_date_previous_than_months "$1"
}
get_time_in_milliseconds() {
bsd_get_time_in_milliseconds
}
get_iso8601_timestamp() {
bsd_get_iso8601_timestamp
}
convert_date_to_timestamp() {
bsd_convert_date_to_timestamp "$1"
}
fi
if "$BASE64_CMD" --version >/dev/null 2>&1 ; then
decode_report() {
gnu_decode_report
}
else
decode_report() {
bsd_decode_report
}
fi
test_tcp_connectivity() {
bsd_test_tcp_connectivity "$1" "$2" "$3"
}
replace_sed() {
bsd_replace_sed $1 $2
}
elif [[ "$OSTYPE" == "cygwin" ]]; then
# POSIX compatibility layer and Linux environment emulation for Windows
TEMP_REPORT_FILE=$(mktemp -t -p /tmp prowler.cred_report-XXXXXX)
how_older_from_today() {
bsd_how_older_from_today "$1"
gnu_how_older_from_today "$1"
}
timestamp_to_date() {
bsd_timestamp_to_date "$1"
gnu_timestamp_to_date "$1"
}
how_many_days_from_today() {
bsd_how_many_days_from_today "$1"
}
get_date_previous_than_months() {
bsd_get_date_previous_than_months "$1"
}
get_time_in_milliseconds() {
bsd_get_time_in_milliseconds
}
get_iso8601_timestamp() {
bsd_get_iso8601_timestamp
}
convert_date_to_timestamp() {
bsd_convert_date_to_timestamp "$1"
}
fi
if "$BASE64_CMD" --version >/dev/null 2>&1 ; then
decode_report() {
gnu_decode_report
}
else
decode_report() {
bsd_decode_report
how_many_days_from_today() {
gnu_how_many_days_from_today "$1"
}
get_date_previous_than_months() {
gnu_get_date_previous_than_months "$1"
}
get_time_in_milliseconds() {
gnu_get_time_in_milliseconds
}
get_iso8601_timestamp() {
gnu_get_iso8601_timestamp
}
test_tcp_connectivity() {
gnu_test_tcp_connectivity "$1" "$2" "$3"
}
convert_date_to_timestamp() {
gnu_convert_date_to_timestamp "$1"
}
replace_sed() {
gnu_replace_sed $1 $2
}
else
echo "Unknown Operating System! Valid \$OSTYPE: linux-gnu, linux-musl, darwin* or cygwin"
echo "Found: $OSTYPE"
EXITCODE=1
exit $EXITCODE
fi
test_tcp_connectivity() {
bsd_test_tcp_connectivity "$1" "$2" "$3"
}
replace_sed() {
bsd_replace_sed $1 $2
}
elif [[ "$OSTYPE" == "cygwin" ]]; then
# POSIX compatibility layer and Linux environment emulation for Windows
TEMP_REPORT_FILE=$(mktemp -t -p /tmp prowler.cred_report-XXXXXX)
how_older_from_today() {
gnu_how_older_from_today "$1"
}
timestamp_to_date() {
gnu_timestamp_to_date "$1"
}
decode_report() {
gnu_decode_report
}
how_many_days_from_today() {
gnu_how_many_days_from_today "$1"
}
get_date_previous_than_months() {
gnu_get_date_previous_than_months "$1"
}
get_time_in_milliseconds() {
gnu_get_time_in_milliseconds
}
get_iso8601_timestamp() {
gnu_get_iso8601_timestamp
}
test_tcp_connectivity() {
gnu_test_tcp_connectivity "$1" "$2" "$3"
}
convert_date_to_timestamp() {
gnu_convert_date_to_timestamp "$1"
}
replace_sed() {
gnu_replace_sed $1 $2
}
else
echo "Unknown Operating System! Valid \$OSTYPE: linux-gnu, linux-musl, darwin* or cygwin"
echo "Found: $OSTYPE"
EXITCODE=1
exit $EXITCODE
fi
}

View File

@@ -11,44 +11,12 @@
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
# Output formatting functions
EXTENSION_CSV="csv"
EXTENSION_JSON="json"
EXTENSION_ASFF="asff.json"
EXTENSION_TEXT="txt"
EXTENSION_HTML="html"
OUTPUT_DATE=$(date -u +"%Y%m%d%H%M%S")
OUTPUT_DIR="${PROWLER_DIR}/output" # default output if none
if [[ $OUTPUT_DIR_CUSTOM ]]; then
# output mode has to be set to other than text
if [[ ! " ${MODES[@]} " =~ " text " || ${check_id} == 7.1 || ${check_id} == 7.74 ]]; then
if [[ ! -d $OUTPUT_DIR_CUSTOM ]]; then
echo "$OPTRED ERROR!$OPTNORMAL directory \"$OUTPUT_DIR_CUSTOM\" does not exist."
exit 1
else
OUTPUT_DIR=$OUTPUT_DIR_CUSTOM
fi
else
echo "$OPTRED ERROR!$OPTNORMAL - Mode (-M) has to be set as well. Use -h for help."
exit 1
fi
fi
if [ -z ${OUTPUT_FILE_NAME+x} ]; then
OUTPUT_FILE_NAME="${OUTPUT_DIR}/prowler-output-${ACCOUNT_NUM}-${OUTPUT_DATE}"
else
OUTPUT_FILE_NAME="${OUTPUT_DIR}/$OUTPUT_FILE_NAME"
fi
HTML_LOGO_URL="https://github.com/prowler-cloud/prowler/"
HTML_LOGO_IMG="https://github.com/prowler-cloud/prowler/raw/master/util/html/prowler-logo-new.png"
TIMESTAMP=$(get_iso8601_timestamp)
PROWLER_PARAMETERS=$@
# Available parameters for output formats (implemented in CSV since v2.4):
# Output format
# $PROFILE profile used to run Prowler (--profile in AWS CLI)
# $ACCOUNT_NUM AWS Account ID
# $REPREGION AWS region scanned
# $REGION_FROM_CHECK AWS region scanned
# $TITLE_ID Numeric identifier of each check (1.2, 2.3, etc), originally based on CIS checks.
# $CHECK_RESULT values can be PASS, FAIL, INFO or WARNING if allowlisted
# $ITEM_SCORED corresponds to CHECK_SCORED, values can be Scored/Not Scored. This is CIS only, will be deprecated in Prowler.
@@ -70,198 +38,237 @@ PROWLER_PARAMETERS=$@
# $ACCOUNT_DETAILS_ORG
# $ACCOUNT_DETAILS_TAGS
# Ensure that output directory always exists when -M is used
if [[ $MODE ]];then
mkdir -p "${OUTPUT_DIR}"
if [[ "${MODES[@]}" =~ "html" ]]; then
addHtmlHeader > ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
HTML_REPORT_INIT="1"
fi
if [[ "${MODES[@]}" =~ "csv" ]]; then
printCsvHeader
fi
fi
EXTENSION_CSV="csv"
EXTENSION_JSON="json"
EXTENSION_ASFF="asff.json"
EXTENSION_TEXT="txt"
EXTENSION_HTML="html"
HTML_LOGO_URL="https://github.com/prowler-cloud/prowler/"
HTML_LOGO_IMG="https://github.com/prowler-cloud/prowler/raw/master/util/html/prowler-logo-new.png"
PROWLER_PARAMETERS=$@
# textInfo "HTML report will be saved: ${OUTPUT_FILE_NAME}.$EXTENSION_HTML"
# textInfo "JSON ASFF report will be saved: ${OUTPUT_FILE_NAME}.$EXTENSION_ASFF"
# textInfo "CSV report will be saved: ${OUTPUT_FILE_NAME}.$EXTENSION_CSV"
# textInfo "JSON report will be saved: ${OUTPUT_FILE_NAME}.$EXTENSION_JSON"
if [[ $PROFILE == "" ]];then
PROFILE="ENV"
fi
# Set exit code for CodeBuild/Jenkins-like integrations
set_exitcode () {
if [ "${FAILED_CHECK_FAILED_SCAN}" == 1 ] && [ -z "${FAILED_CHECK_FAILED_SCAN_LIST}" ]
then
EXITCODE=3
export EXITCODE
fi
if [[ "${FAILED_CHECK_FAILED_SCAN_LIST}" =~ ${CHECK_NAME} ]]
then
EXITCODE=3
export EXITCODE
fi
}
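# Usage sketch (an assumption, not from this diff): in a CI job the dedicated
# exit code lets the pipeline tell "checks failed" apart from operational errors.
./prowler -q -M csv
PROWLER_RC=$?
if [ "$PROWLER_RC" -eq 3 ]; then
  echo "one or more tracked checks failed" && exit 1
elif [ "$PROWLER_RC" -ne 0 ]; then
  echo "prowler finished with an error" && exit "$PROWLER_RC"
fi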
# Add a header to certain output files, based on the modes read from -M
output_files_init() {
OIFS="${IFS}"
IFS=','
for MODE_TYPE in ${MODE}
do
if [ "${MODE_TYPE}" == 'html' ]
then
addHtmlHeader
HTML_REPORT_INIT="1"
export HTML_REPORT_INIT
fi
if [ "${MODE_TYPE}" == 'csv' ]
then
printCsvHeader
fi
done
IFS="${OIFS}"
}
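# Standalone illustration of the IFS split above (sample value assumed): a
# comma-separated -M value is walked one mode at a time, then IFS is restored.
MODE="csv,json,html"
OIFS="${IFS}"
IFS=','
for MODE_TYPE in ${MODE}; do
  echo "initialising output: ${MODE_TYPE}"
done
IFS="${OIFS}"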
# Close HTML output file
output_files_end() {
if [[ "${MODE}" =~ "html" ]]; then
addHtmlFooter >> "${OUTPUT_FILE_NAME}.${EXTENSION_HTML}"
fi
}
# Check if resource checked is allowlisted
allowlist_check() {
# Checked value is the whole log message that comes as argument
CHECKED_VALUE=${1}
# Default value of result is Fail
CHECK_RESULT='FAIL'
## ignore allowlists for current check
while read -r excluded_item; do
# IGNORE_CHECK_NAME is the check with resources allowlisted
# RESOURCE_VALUE is what comes after 'CHECK_NAME:'
IGNORE_CHECK_NAME=$(awk -F ":" '{print $1}' <<< "${excluded_item}")
RESOURCE_VALUE=$(awk -F "${CHECK_NAME}:" '{print $2}' <<< "${excluded_item}")
if [[ "${IGNORE_CHECK_NAME}" == "${CHECK_NAME}" ]]
then
if [[ "${CHECKED_VALUE}" =~ ${RESOURCE_VALUE} ]]
then
CHECK_RESULT="WARNING"
break
fi
fi
done <<< "$IGNORES"
echo "${CHECK_RESULT}"
}
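# Example with hypothetical values: an entry in $IGNORES of the form
# "check_name:resource_regex" downgrades a matching finding to WARNING.
IGNORES="check27:my-logging-bucket"
CHECK_NAME="check27"
RESULT=$(allowlist_check "Bucket my-logging-bucket allows public ACL")
echo "${RESULT}"   # prints WARNING instead of FAIL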
is_quiet() {
IS_QUIET="${1}"
if [ "${IS_QUIET}" -eq 1 ]
then
return 0
else
return 1
fi
}
general_output() {
CHECK_RESULT="${1}"
CHECK_RESULT_EXTENDED="${2}"
CHECK_RESOURCE_ID="${3}"
REGION_FROM_CHECK="${4}"
COLOR_CODE="$OK"
# Check if the color is different from normal (pass)
if [ "${CHECK_RESULT}" == 'INFO' ]
then
COLOR_CODE="${NOTICE}"
elif [ "${CHECK_RESULT}" == 'FAIL' ]
then
COLOR_CODE="${BAD}"
elif [ "${CHECK_RESULT}" == 'WARNING' ]
then
COLOR_CODE="${WARNING}"
fi
# Check if region is passed
if [ -z "${REGION_FROM_CHECK}" ]
then
REGION_FROM_CHECK=$REGION
fi
CSV_LINE="${PROFILE//,/--}${SEP}${ACCOUNT_NUM//,/--}${SEP}${REGION_FROM_CHECK//,/--}${SEP}${TITLE_ID//,/--}${SEP}${CHECK_RESULT//,/--}${SEP}${ITEM_SCORED//,/--}${SEP}${ITEM_CIS_LEVEL//,/--}${SEP}${TITLE_TEXT//,/--}${SEP}${CHECK_RESULT_EXTENDED//,/--}${SEP}${CHECK_ASFF_COMPLIANCE_TYPE//,/--}${SEP}${CHECK_SEVERITY//,/--}${SEP}${CHECK_SERVICENAME//,/--}${SEP}${CHECK_ASFF_RESOURCE_TYPE//,/--}${SEP}${CHECK_ASFF_TYPE//,/--}${SEP}${CHECK_RISK//,/--}${SEP}${CHECK_REMEDIATION//,/--}${SEP}${CHECK_DOC//,/--}${SEP}${CHECK_CAF_EPIC//,/--}${SEP}${CHECK_RESOURCE_ID//,/--}${SEP}${PROWLER_START_TIME//,/--}${SEP}${ACCOUNT_DETAILS_EMAIL//,/--}${SEP}${ACCOUNT_DETAILS_NAME//,/--}${SEP}${ACCOUNT_DETAILS_ARN//,/--}${SEP}${ACCOUNT_DETAILS_ORG//,/--}${SEP}${ACCOUNT_DETAILS_TAGS//,/--}"
# Iterating over input modes
OIFS="${IFS}"
IFS=','
for MODE_TYPE in ${MODE}
do
IFS="${OIFS}"
if [ "${MODE_TYPE}" == 'html' ]
then
generateHtmlOutput "${CHECK_RESULT_EXTENDED}" "${CHECK_RESULT}"
elif [ "${MODE_TYPE}" == 'csv' ]
then
echo "${CSV_LINE}" >> "${OUTPUT_FILE_NAME}"."${EXTENSION_CSV}"
elif [ "${MODE_TYPE}" == 'json' ]
then
generateJsonOutput "${CHECK_RESULT_EXTENDED}" "${CHECK_RESULT}" "$CHECK_RESOURCE_ID" >> "${OUTPUT_FILE_NAME}"."${EXTENSION_JSON}"
elif [ "${MODE_TYPE}" == 'json-asff' ]
then
JSON_ASFF_OUTPUT=$(generateJsonAsffOutput "${CHECK_RESULT_EXTENDED}" "${CHECK_RESULT}" "$CHECK_RESOURCE_ID")
echo "${JSON_ASFF_OUTPUT}" >> "${OUTPUT_FILE_NAME}"."${EXTENSION_ASFF}"
if [[ "${SEND_TO_SECURITY_HUB}" -eq 1 ]]; then
sendToSecurityHub "${JSON_ASFF_OUTPUT}" "${REGION_FROM_CHECK}"
fi
elif [[ "${MODE_TYPE}" == "junit-xml" ]]
then
if [ "${CHECK_RESULT}" == 'PASS' ]
then
output_junit_success "${CHECK_RESULT_EXTENDED}"
elif [ "${CHECK_RESULT}" == 'INFO' ]
then
output_junit_info "${CHECK_RESULT_EXTENDED}"
elif [ "${CHECK_RESULT}" == 'FAIL' ]
then
output_junit_failure "${CHECK_RESULT_EXTENDED}"
elif [ "${CHECK_RESULT}" == 'WARNING' ]
then
output_junit_skipped "${CHECK_RESULT_EXTENDED}"
fi
elif [ "${MODE_TYPE}" == 'mono' ]
then
echo " $COLOR_CODE ${CHECK_RESULT}! $NORMAL ${CHECK_RESULT_EXTENDED}">> "${OUTPUT_FILE_NAME}"."${EXTENSION_TEXT}"
fi
done
# Print in screen
echo " $COLOR_CODE ${CHECK_RESULT}!$NORMAL ${CHECK_RESULT_EXTENDED}"
# Check the database provider
if [[ ${DATABASE_PROVIDER} == 'postgresql' ]]
then
postgresql_connector "${CSV_LINE}"
fi
}
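# Note on the CSV_LINE expansion above: "${VAR//,/--}" replaces every comma in
# VAR with "--" so free-text fields cannot break the comma-separated layout.
# A quick illustration with an assumed sample title:
TITLE_TEXT="Ensure S3 buckets, all of them, block public access"
echo "${TITLE_TEXT//,/--}"   # Ensure S3 buckets-- all of them-- block public access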
textPass(){
CHECK_RESULT="PASS"
CHECK_RESULT_EXTENDED="$1"
CHECK_RESOURCE_ID="$3"
CHECK_RESULT_EXTENDED="${1}"
CHECK_RESOURCE_ID="${3}"
CHECK_REGION="${2}"
if [[ "$QUIET" == 1 ]]; then
PASS_COUNTER=$((PASS_COUNTER+1))
if is_quiet "${QUIET}"
then
return
fi
PASS_COUNTER=$((PASS_COUNTER+1))
if [[ $2 ]]; then
REPREGION=$2
else
REPREGION=$REGION
fi
if [[ "${MODES[@]}" =~ "csv" ]]; then
# Writing CSV file, replacing commas
echo "${PROFILE/,/--}${SEP}${ACCOUNT_NUM/,/--}${SEP}${REPREGION/,/--}${SEP}${TITLE_ID/,/--}${SEP}${CHECK_RESULT/,/--}${SEP}${ITEM_SCORED/,/--}${SEP}${ITEM_CIS_LEVEL/,/--}${SEP}${TITLE_TEXT/,/--}${SEP}${CHECK_RESULT_EXTENDED/,/--}${SEP}${CHECK_ASFF_COMPLIANCE_TYPE/,/--}${SEP}${CHECK_SEVERITY/,/--}${SEP}${CHECK_SERVICENAME/,/--}${SEP}${CHECK_ASFF_RESOURCE_TYPE/,/--}${SEP}${CHECK_ASFF_TYPE/,/--}${SEP}${CHECK_RISK/,/--}${SEP}${CHECK_REMEDIATION/,/--}${SEP}${CHECK_DOC/,/--}${SEP}${CHECK_CAF_EPIC/,/--}${SEP}${CHECK_RESOURCE_ID/,/--}${SEP}${PROWLER_START_TIME/,/--}${SEP}${ACCOUNT_DETAILS_EMAIL/,/--}${SEP}${ACCOUNT_DETAILS_NAME/,/--}${SEP}${ACCOUNT_DETAILS_ARN/,/--}${SEP}${ACCOUNT_DETAILS_ORG/,/--}${SEP}${ACCOUNT_DETAILS_TAGS/,/--}" >> ${OUTPUT_FILE_NAME}.$EXTENSION_CSV
fi
if [[ "${MODES[@]}" =~ "json" ]]; then
generateJsonOutput "$1" "Pass" "$CHECK_RESOURCE_ID" >> ${OUTPUT_FILE_NAME}.$EXTENSION_JSON
fi
if [[ "${MODES[@]}" =~ "json-asff" ]]; then
JSON_ASFF_OUTPUT=$(generateJsonAsffOutput "$1" "PASSED" "$CHECK_RESOURCE_ID")
echo "${JSON_ASFF_OUTPUT}" >> $OUTPUT_FILE_NAME.$EXTENSION_ASFF
if [[ "${SEND_TO_SECURITY_HUB}" -eq 1 ]]; then
sendToSecurityHub "${JSON_ASFF_OUTPUT}" "${REPREGION}"
fi
fi
if is_junit_output_enabled; then
output_junit_success "$1"
fi
if [[ "${MODES[@]}" =~ "mono" ]]; then
echo " $OK PASS!$NORMAL $1" >> ${OUTPUT_FILE_NAME}.$EXTENSION_TEXT
fi
# if [[ "${MODES[@]}" =~ "text" || "${MODES[@]}" =~ "mono" ]]; then
# always show console output, no matter what output mode is selected
echo " $OK PASS!$NORMAL $1"
# fi
if [[ "${MODES[@]}" =~ "html" ]]; then
generateHtmlOutput "$1" "PASS"
fi
general_output "${CHECK_RESULT}" "${CHECK_RESULT_EXTENDED}" "${CHECK_RESOURCE_ID}" "${CHECK_REGION}"
}
textInfo(){
CHECK_RESULT="INFO"
CHECK_RESULT_EXTENDED="$1"
CHECK_RESOURCE_ID="$3"
CHECK_RESULT_EXTENDED="${1}"
CHECK_RESOURCE_ID="${3}"
CHECK_REGION="${2}"
if [[ "$QUIET" == 1 ]]; then
if is_quiet "${QUIET}"
then
return
fi
if [[ $2 ]]; then
REPREGION=$2
else
REPREGION=$REGION
fi
if [[ "${MODES[@]}" =~ "csv" ]]; then
# Writing csv file replacing commas
echo "${PROFILE/,/--}${SEP}${ACCOUNT_NUM/,/--}${SEP}${REPREGION/,/--}${SEP}${TITLE_ID/,/--}${SEP}${CHECK_RESULT/,/--}${SEP}${ITEM_SCORED/,/--}${SEP}${ITEM_CIS_LEVEL/,/--}${SEP}${TITLE_TEXT/,/--}${SEP}${CHECK_RESULT_EXTENDED/,/--}${SEP}${CHECK_ASFF_COMPLIANCE_TYPE/,/--}${SEP}${CHECK_SEVERITY/,/--}${SEP}${CHECK_SERVICENAME/,/--}${SEP}${CHECK_ASFF_RESOURCE_TYPE/,/--}${SEP}${CHECK_ASFF_TYPE/,/--}${SEP}${CHECK_RISK/,/--}${SEP}${CHECK_REMEDIATION/,/--}${SEP}${CHECK_DOC/,/--}${SEP}${CHECK_CAF_EPIC/,/--}${SEP}${CHECK_RESOURCE_ID/,/--}${SEP}${PROWLER_START_TIME/,/--}${SEP}${ACCOUNT_DETAILS_EMAIL/,/--}${SEP}${ACCOUNT_DETAILS_NAME/,/--}${SEP}${ACCOUNT_DETAILS_ARN/,/--}${SEP}${ACCOUNT_DETAILS_ORG/,/--}${SEP}${ACCOUNT_DETAILS_TAGS/,/--}" >> ${OUTPUT_FILE_NAME}.$EXTENSION_CSV
fi
if [[ "${MODES[@]}" =~ "json" ]]; then
generateJsonOutput "$1" "Info" "$CHECK_RESOURCE_ID" >> ${OUTPUT_FILE_NAME}.${EXTENSION_JSON}
fi
if is_junit_output_enabled; then
output_junit_info "$1"
fi
if [[ "${MODES[@]}" =~ "mono" ]]; then
echo " $NOTICE INFO! $1 $NORMAL" >> ${OUTPUT_FILE_NAME}.$EXTENSION_TEXT
fi
# if [[ "${MODES[@]}" =~ "text" ]]; then
echo " $NOTICE INFO! $1 $NORMAL"
# fi
if [[ "${MODES[@]}" =~ "html" ]]; then
generateHtmlOutput "$1" "INFO" "$CHECK_RESOURCE_ID"
fi
general_output "${CHECK_RESULT}" "${CHECK_RESULT_EXTENDED}" "${CHECK_RESOURCE_ID}" "${CHECK_REGION}"
}
textFail(){
## ignore allowlists for current check
level="FAIL"
colorcode="$BAD"
while read -r excluded_item; do
# ignore_check_name is the check with resources allowlisted
ignore_check_name=$(awk -F ":" '{print $1}' <<< "${excluded_item}")
# Resource value is what comes after 'CHECK_NAME:'
resource_value=$(awk -F "$CHECK_NAME:" '{print $2}' <<< "${excluded_item}")
# Checked value is the whole log message that comes as argument
checked_value=$1
if [[ "${ignore_check_name}" != "${CHECK_NAME}" ]]; then
# not for this check
continue
fi
# To set the WARNING flag, checked_value has to include the value of resource_value
# If this were treated as a mere glob expansion (*[]*), it would not detect regexes like [:alpha:]
if [[ "${checked_value}" =~ ${resource_value} ]]; then
level="WARNING"
colorcode="${WARNING}"
break
fi
done <<< "$IGNORES"
CHECK_RESULT='FAIL'
CHECK_RESULT_EXTENDED="${1}"
CHECK_RESOURCE_ID="${3}"
CHECK_REGION="${2}"
FAIL_COUNTER=$((FAIL_COUNTER+1))
# Check if resources are allowlisted
CHECK_RESULT=$(allowlist_check "${CHECK_RESULT_EXTENDED}")
# only set non-0 exit code on FAIL mode, WARN is ok
if [[ "$level" == "FAIL" ]]; then
FAIL_COUNTER=$((FAIL_COUNTER+1))
if [ "$FAILED_CHECK_FAILED_SCAN" == 1 ] && [ -z "$FAILED_CHECK_FAILED_SCAN_LIST" ] ; then
EXITCODE=3
fi
if [[ "${FAILED_CHECK_FAILED_SCAN_LIST[@]}" =~ "$CHECK_NAME" ]]; then
EXITCODE=3
fi
if [[ "${CHECK_RESULT}" == "FAIL" ]]
then
set_exitcode
fi
CHECK_RESULT=$level
CHECK_RESULT_EXTENDED="$1"
CHECK_RESOURCE_ID="$3"
if [[ $2 ]]; then
REPREGION=$2
else
REPREGION=$REGION
fi
if [[ "${MODES[@]}" =~ "csv" ]]; then
# Writing csv file replacing commas
echo "${PROFILE/,/--}${SEP}${ACCOUNT_NUM/,/--}${SEP}${REPREGION/,/--}${SEP}${TITLE_ID/,/--}${SEP}${CHECK_RESULT/,/--}${SEP}${ITEM_SCORED/,/--}${SEP}${ITEM_CIS_LEVEL/,/--}${SEP}${TITLE_TEXT/,/--}${SEP}${CHECK_RESULT_EXTENDED/,/--}${SEP}${CHECK_ASFF_COMPLIANCE_TYPE/,/--}${SEP}${CHECK_SEVERITY/,/--}${SEP}${CHECK_SERVICENAME/,/--}${SEP}${CHECK_ASFF_RESOURCE_TYPE/,/--}${SEP}${CHECK_ASFF_TYPE/,/--}${SEP}${CHECK_RISK/,/--}${SEP}${CHECK_REMEDIATION/,/--}${SEP}${CHECK_DOC/,/--}${SEP}${CHECK_CAF_EPIC/,/--}${SEP}${CHECK_RESOURCE_ID/,/--}${SEP}${PROWLER_START_TIME/,/--}${SEP}${ACCOUNT_DETAILS_EMAIL/,/--}${SEP}${ACCOUNT_DETAILS_NAME/,/--}${SEP}${ACCOUNT_DETAILS_ARN/,/--}${SEP}${ACCOUNT_DETAILS_ORG/,/--}${SEP}${ACCOUNT_DETAILS_TAGS/,/--}" >> ${OUTPUT_FILE_NAME}.$EXTENSION_CSV
fi
if [[ "${MODES[@]}" =~ "json" ]]; then
generateJsonOutput "$1" "${level}" "$CHECK_RESOURCE_ID">> ${OUTPUT_FILE_NAME}.${EXTENSION_JSON}
fi
if [[ "${MODES[@]}" =~ "json-asff" ]]; then
JSON_ASFF_OUTPUT=$(generateJsonAsffOutput "$1" "${level}" "$CHECK_RESOURCE_ID")
echo "${JSON_ASFF_OUTPUT}" >> ${OUTPUT_FILE_NAME}.${EXTENSION_ASFF}
if [[ "${SEND_TO_SECURITY_HUB}" -eq 1 ]]; then
sendToSecurityHub "${JSON_ASFF_OUTPUT}" "${REPREGION}"
fi
fi
if is_junit_output_enabled; then
if [[ "${level}" == "FAIL" ]]; then
output_junit_failure "$1"
elif [[ "${level}" == "WARNING" ]]; then
output_junit_skipped "$1"
fi
fi
if [[ "${MODES[@]}" =~ "mono" ]]; then
echo " $colorcode ${level}! $1 $NORMAL" >> ${OUTPUT_FILE_NAME}.$EXTENSION_TEXT
fi
# if [[ "${MODES[@]}" =~ "text" ]]; then
echo " $colorcode ${level}! $1 $NORMAL"
# fi
if [[ "${MODES[@]}" =~ "html" ]]; then
generateHtmlOutput "$1" "${level}" "$CHECK_RESOURCE_ID"
fi
general_output "${CHECK_RESULT}" "${CHECK_RESULT_EXTENDED}" "${CHECK_RESOURCE_ID}" "${CHECK_REGION}"
}
textTitle(){
TITLE_ID="${1}"
TITLE_TEXT="${2}"
CHECK_ASFF_COMP_TYPE="${6}"
CHECKS_COUNTER=$((CHECKS_COUNTER+1))
TITLE_ID="$1"
if [[ $NUMERAL ]]; then
# Left-pad the check ID with zeros to simplify sorting, e.g. 1.1 -> 1.01
TITLE_ID=$(awk -F'.' '{ printf "%d.%02d", $1, $2 }' <<< "$TITLE_ID")
fi
TITLE_TEXT=$2
local CHECK_SERVICENAME="$MAGENTA$3$NORMAL"
local CHECK_SEVERITY="$BROWN[$4]$NORMAL"
case "$6" in
case "${CHECK_ASFF_COMP_TYPE}" in
LEVEL1) ITEM_CIS_LEVEL="CIS Level 1";;
LEVEL2) ITEM_CIS_LEVEL="CIS Level 2";;
EXTRA) ITEM_CIS_LEVEL="Extra";;
@@ -269,23 +276,7 @@ textTitle(){
*) ITEM_CIS_LEVEL="Unspecified or Invalid";;
esac
local group_ids
# if [[ -n "$4" ]]; then
group_ids="$CYAN[$5]$NORMAL"
# fi
# if [[ "${MODES[@]}" =~ "csv" ]]; then
# >&2 echo "$TITLE_ID $TITLE_TEXT" >> ${OUTPUT_FILE_NAME}.${EXTENSION_CSV}
# elif [[ "${MODES[@]}" =~ "json" || "${MODES[@]}" =~ "json-asff" ]]; then
# :
# else
# # if [[ "$ITEM_SCORED" == "Scored" ]]; then
# echo -e "$TITLE_ID $CHECK_SERVICENAME $TITLE_TEXT $CHECK_SEVERITY $group_ids "
echo -e "$TITLE_ID $TITLE_TEXT - $CHECK_SERVICENAME $CHECK_SEVERITY"
# # else
# # echo -e "\n$PURPLE $TITLE_ID $TITLE_TEXT $6 $NORMAL $group_ids "
# # fi
# fi
echo -e "$TITLE_ID $TITLE_TEXT - $CHECK_SERVICENAME $CHECK_SEVERITY"
}
generateJsonOutput(){
@@ -302,7 +293,7 @@ generateJsonOutput(){
--arg SCORED "$ITEM_SCORED" \
--arg ITEM_CIS_LEVEL "$ITEM_CIS_LEVEL" \
--arg TITLE_ID "$TITLE_ID" \
--arg REPREGION "$REPREGION" \
--arg REGION_FROM_CHECK "$REGION_FROM_CHECK" \
--arg TYPE "$CHECK_ASFF_COMPLIANCE_TYPE" \
--arg TIMESTAMP "$(get_iso8601_timestamp)" \
--arg SERVICENAME "$CHECK_SERVICENAME" \
@@ -326,7 +317,7 @@ generateJsonOutput(){
"Scored": $SCORED,
"Level": $ITEM_CIS_LEVEL,
"Control ID": $TITLE_ID,
"Region": $REPREGION,
"Region": $REGION_FROM_CHECK,
"Timestamp": $TIMESTAMP,
"Compliance": $TYPE,
"Service": $SERVICENAME,
@@ -348,16 +339,18 @@ generateJsonAsffOutput(){
# Replace any successive non-conforming characters with a single underscore
local message=$1
local status=$2
# Check whether the rule passed in a resource name that Prowler uses to track the AWS resource for allowlisting purposes
if [[ -z $3 ]]; then
local resource_id="NONE_PROVIDED"
else
local resource_id=$3
fi
if [[ "$status" == "FAIL" ]]; then
status="FAILED"
elif [[ "$status" == "PASS" ]]; then
status="PASSED"
fi
jq -M -c \
--arg ACCOUNT_NUM "$ACCOUNT_NUM" \
@@ -371,15 +364,15 @@ generateJsonAsffOutput(){
--arg TYPE "$CHECK_ASFF_COMPLIANCE_TYPE" \
--arg COMPLIANCE_RELATED_REQUIREMENTS "$CHECK_ASFF_COMPLIANCE_TYPE" \
--arg RESOURCE_TYPE "$CHECK_ASFF_RESOURCE_TYPE" \
--arg REPREGION "$REPREGION" \
--arg REGION_FROM_CHECK "$REGION_FROM_CHECK" \
--arg TIMESTAMP "$(get_iso8601_timestamp)" \
--arg PROWLER_VERSION "$PROWLER_VERSION" \
--arg AWS_PARTITION "$AWS_PARTITION" \
--arg CHECK_RESOURCE_ID "$resource_id" \
-n '{
"SchemaVersion": "2018-10-08",
"Id": "prowler-\($TITLE_ID)-\($ACCOUNT_NUM)-\($REPREGION)-\($UNIQUE_ID)",
"ProductArn": "arn:\($AWS_PARTITION):securityhub:\($REPREGION)::product/prowler/prowler",
"Id": "prowler-\($TITLE_ID)-\($ACCOUNT_NUM)-\($REGION_FROM_CHECK)-\($UNIQUE_ID)",
"ProductArn": "arn:\($AWS_PARTITION):securityhub:\($REGION_FROM_CHECK)::product/prowler/prowler",
"RecordState": "ACTIVE",
"ProductFields": {
"ProviderName": "Prowler",
@@ -404,14 +397,14 @@ generateJsonAsffOutput(){
"Type": $RESOURCE_TYPE,
"Id": $CHECK_RESOURCE_ID,
"Partition": $AWS_PARTITION,
"Region": $REPREGION
"Region": $REGION_FROM_CHECK
}
],
"Compliance": {
"Status": $STATUS,
"RelatedRequirements": [ $COMPLIANCE_RELATED_REQUIREMENTS ]
}
}'
}
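# Illustrative finding Id produced by the template above (sample values assumed);
# each check/account/region combination maps to a stable, unique finding:
UNIQUE_ID="a1b2c3"
echo "prowler-1.1-123456789012-eu-west-1-${UNIQUE_ID}"
# -> prowler-1.1-123456789012-eu-west-1-a1b2c3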
@@ -431,14 +424,14 @@ generateHtmlOutput(){
if [[ $status == "WARN" ]];then
local ROW_CLASS='table-warning'
fi
local CHECK_SEVERITY="$(echo $CHECK_SEVERITY | sed 's/[][]//g')"
echo '<tr class="'$ROW_CLASS'">' >> ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
echo ' <td>'$status'</td>' >> ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
echo ' <td>'$CHECK_SEVERITY'</td>' >> ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
echo ' <td>'$ACCOUNT_NUM'</td>' >> ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
echo ' <td>'$REPREGION'</td>' >> ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
echo ' <td>'$REGION_FROM_CHECK'</td>' >> ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
echo ' <td>'$CHECK_ASFF_COMPLIANCE_TYPE'</td>' >> ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
echo ' <td>'$CHECK_SERVICENAME'</td>' >> ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
echo ' <td>'$TITLE_ID'</td>' >> ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
@@ -452,4 +445,4 @@ generateHtmlOutput(){
echo ' <td>'$CHECK_RESOURCE_ID'</td>' >> ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
echo '</tr>' >> ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
echo '' >> ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
}
}


@@ -11,21 +11,8 @@
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
if [[ $OUTPUT_BUCKET ]]; then
# output mode has to be set to something other than text
if [[ "${MODES[*]}" =~ "text" ]]; then
echo "$OPTRED ERROR!$OPTNORMAL - Mode (-M) can't be text when using custom output bucket. Use -h for help."
exit 1
else
# make sure there is no trailing / to avoid // in the S3 path
if [[ $OUTPUT_BUCKET == *"/" ]]; then
OUTPUT_BUCKET=${OUTPUT_BUCKET::-1}
fi
fi
fi
copyToS3() {
# Prowler will copy each format to its own folder in S3, that is for better handling
# and processing by Quicksight or others.
# Also, check if -F was provided
if [ -n "${OUTPUT_FILE_NAME+x}" ]; then
@@ -34,9 +21,12 @@ copyToS3() {
OUTPUT_PATH="$OUTPUT_DIR/prowler-output-${ACCOUNT_NUM}-${OUTPUT_DATE}"
fi
for output_format in "${MODES[@]}";
OIFS="${IFS}"
IFS=','
for MODE_TYPE in ${MODE}
do
case ${output_format} in
IFS="${OIFS}"
case ${MODE_TYPE} in
csv)
s3cp "${OUTPUT_PATH}" "${EXTENSION_CSV}"
;;


@@ -1,32 +0,0 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2019) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
# detector of python and boto3
pythonDetector(){
PYTHON_BIN=$(which python)
PYTHON_PIP_BOTO3=$(pip list|grep boto3)
if [ -z "${PYTHON_BIN}" ]; then
echo -e "\n$RED ERROR!$NORMAL python not found. Make sure it is installed correctly and in your \$PATH\n"
EXITCODE=1
exit $EXITCODE
else
PYTHON_INSTALLED=1
if [ -z "${PYTHON_PIP_BOTO3}" ]; then
echo -e "\n$RED ERROR!$NORMAL python library boto3 not found. Make sure it is installed correctly\n"
EXITCODE=1
exit $EXITCODE
else
PYTHON_PIP_BOTO3_INSTALLED=1
fi
fi
}

include/quick_inventory (new file)

@@ -0,0 +1,149 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2021) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
###############################################################################
# Probably the easiest and quickest way to get a resource inventory in AWS.
# Lists (almost) all resources in all regions based on:
# https://docs.aws.amazon.com/resourcegroupstagging/latest/APIReference/API_GetResources.html
# this list is not accurate: https://docs.aws.amazon.com/ARG/latest/userguide/supported-resources.html
# don't fully trust the "Tag-based Groups" column of that list;
# as far as I can see it doesn't include
# CodeBuild build projects like arn:aws:codebuild:eu-west-1:123456789012:build/Abcdefg,
# despite what the supported-resources documentation page says
# Requires IAM permissions:
# resource-groups:Get*
# resource-groups:GroupResources
# Quick inventory part
quick_inventory(){
# Create a temporary file to store all resources arn
TEMP_INVENTORY_FILE=$(mktemp -t prowler-inventory-${ACCOUNT_NUM}.XXXXXXXXXX)
INVENTORY_FILE="$OUTPUT_DIR/prowler-inventory-${ACCOUNT_NUM}-${OUTPUT_DATE}.csv"
echo "-=- Running the inventory for AWS Account ${BLUE}${ACCOUNT_NUM}${NORMAL} - ${BLUE}BETA FEATURE${NORMAL} -=-"
# get all resources per region, keeping only their ARNs
echo -ne "\n # Counting IAM resources in default region $REGION... "
# since this API doesn't return IAM resources properly, we get them directly first:
# get IAM_ROLES
$AWSCLI iam list-roles --query "Roles[].Arn" \
$PROFILE_OPT \
--region $REGION \
--output json | jq -r '.[]' >> "${TEMP_INVENTORY_FILE}"
# get IAM_USERS
$AWSCLI iam list-users --query "Users[].Arn" \
$PROFILE_OPT \
--region $REGION \
--output json | jq -r '.[]' >> "${TEMP_INVENTORY_FILE}"
# get IAM_GROUPS
$AWSCLI iam list-groups --query "Groups[].Arn" \
$PROFILE_OPT \
--region $REGION \
--output json | jq -r '.[]' >> "${TEMP_INVENTORY_FILE}"
# get IAM_MANAGED_POLICIES
$AWSCLI iam list-policies --query "Policies[].Arn" \
$PROFILE_OPT \
--region $REGION \
--output json | jq -r '.[]' | grep :${ACCOUNT_NUM}: >> "${TEMP_INVENTORY_FILE}"
# get SAML PROVIDERS
$AWSCLI iam list-saml-providers --query "SAMLProviderList[].Arn" \
$PROFILE_OPT \
--region $REGION \
--output json | jq -r '.[]' >> "${TEMP_INVENTORY_FILE}"
echo -e "${OK}DONE${NORMAL}"
# get other resources using resourcegroupstaggingapi get-resources
echo -e "\n # Counting rest of resources by region..."
for region in $REGIONS; do
#(
echo -ne " - $region: "
AWS_RESOURCES=$($AWSCLI resourcegroupstaggingapi get-resources \
$PROFILE_OPT \
--region $region \
--query ResourceTagMappingList[].ResourceARN \
--output json 2>&1)
# error handling
if grep -q -E 'service control policy' <<< "${AWS_RESOURCES}"; then
echo -e "${BAD}blocked by a SCP${NORMAL}"
continue
fi
if grep -q -E 'AccessDenied|UnauthorizedOperation|AuthorizationError' <<< "$AWS_RESOURCES"; then
echo -e "${BAD}Access Denied trying to get AWS Resources${NORMAL}"
continue
fi
jq -r '.[]' <<< "${AWS_RESOURCES}" >> "${TEMP_INVENTORY_FILE}-${region}"
TOTAL_RESOURCES_FOUND_REGION=$(awk 'END { print NR }' "${TEMP_INVENTORY_FILE}-${region}")
if [ "${region}" == "us-east-1" ]; then
TOTAL_IAM_RESOURCES_FOUND=$(grep -c :"iam": "${TEMP_INVENTORY_FILE}-${region}")
# include the separately counted IAM resources in the us-east-1 total
TOTAL_RESOURCES_FOUND_REGION=$((TOTAL_RESOURCES_FOUND_REGION + TOTAL_IAM_RESOURCES_FOUND))
fi
echo -e "${OK}${TOTAL_RESOURCES_FOUND_REGION}${NORMAL} resources!"
cat "${TEMP_INVENTORY_FILE}-${region}" >> ${TEMP_INVENTORY_FILE}
cleanInventoryTemporaryFilesByRegion
#) &
done
#wait
# Generate file in CSV format
# send header first
echo "aws_partition,aws_service,aws_region,aws_account_id,subservice_resource,arn" > "${INVENTORY_FILE}"
while IFS=: read -r arn aws_partition aws_service aws_region aws_account_id subservice_resource; do
echo "$aws_partition,$aws_service,$aws_region,$aws_account_id,$subservice_resource,$aws_partition:$aws_service:$aws_region:$aws_account_id:$subservice_resource"
done < "${TEMP_INVENTORY_FILE}" >> "${INVENTORY_FILE}"
# Show resources found per service
echo -e "\n-=- List of type of resources per service for AWS Account ${OK}${ACCOUNT_NUM}${NORMAL} -=-\n"
LIST_OF_DETECTED_SERVICES=$(grep -v subservice_resource < "${INVENTORY_FILE}" | awk -F',' '{ print $2 }' | sort -u )
for aws_service in $LIST_OF_DETECTED_SERVICES;do
echo "-=- ${YELLOW}$aws_service${NORMAL} service -=-"
if [[ $aws_service == 's3' ]];then
TOTAL_S3_BUCKETS=$(grep ",$aws_service," < "${INVENTORY_FILE}" | awk -F',' '{ print $5 }' | wc -l | xargs)
echo "$TOTAL_S3_BUCKETS buckets"
else
TOTAL_RESOURCE_SERVICE=$(grep ",$aws_service," < "${INVENTORY_FILE}" | awk -F',' '{ print $5 }' | awk -F'/' '{ print $1 }' | awk -F':' '{ print $1 }' | sort | uniq -c)
echo "$TOTAL_RESOURCE_SERVICE"
fi
done
TOTAL_RESOURCES_FOUND=$(awk 'END { print NR - 1 }' ${INVENTORY_FILE})
echo -e "\nTotal resources found: ${OK}$TOTAL_RESOURCES_FOUND${NORMAL}\n"
echo -e "More details in file: ${OK}${INVENTORY_FILE}${NORMAL}\n"
# Clean temp file
cleanInventoryTemporaryFiles
}
cleanInventoryTemporaryFiles() {
rm -fr "${TEMP_INVENTORY_FILE}"
}
cleanInventoryTemporaryFilesByRegion() {
rm -fr "${TEMP_INVENTORY_FILE}-${region}"
}
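# A minimal manual equivalent of the per-region listing above (same IAM
# permissions assumed), useful for spot-checking a single region:
aws resourcegroupstaggingapi get-resources \
  --region eu-west-1 \
  --query 'ResourceTagMappingList[].ResourceARN' \
  --output text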


@@ -54,8 +54,8 @@ scoring(){
echo -e "$BLUE------------------------------------------------------------------ $NORMAL"
echo -e " * the higher the better (0 to 100)$NORMAL"
echo -e " Prowler scoring uses any check, including CIS not scored checks$NORMAL"
fi
if [[ "${MODES[@]}" =~ "html" ]]; then
fi
if [[ "${MODE}" =~ "html" ]]; then
replace_sed 's/PROWLER_SCORE/'$PROWLER_SCORE'/g' ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
replace_sed 's/PASS_COUNTER/'$PASS_COUNTER'/g' ${OUTPUT_FILE_NAME}.$EXTENSION_HTML
replace_sed 's/TOTAL_RESOURCES/'$TOTAL_RESOURCES'/g' ${OUTPUT_FILE_NAME}.$EXTENSION_HTML


@@ -13,41 +13,40 @@
# detector and configuration of detect-secrets
secretsDetector(){
PYTHON_PIP_DETECTSECRETS=$(which detect-secrets)
if [ -z "${PYTHON_PIP_DETECTSECRETS}" ]; then
if ! command -v "detect-secrets" &>/dev/null
then
echo -e "\n$RED ERROR!$NORMAL python library detect-secrets not found. Make sure it is installed correctly and in your \$PATH\n"
EXITCODE=241
exit $EXITCODE
else
SECRETS_TEMP_FOLDER="$PROWLER_DIR/secrets-$ACCOUNT_NUM"
if [[ ! -d $SECRETS_TEMP_FOLDER ]]; then
mkdir $SECRETS_TEMP_FOLDER
if [[ ! -d "${SECRETS_TEMP_FOLDER}" ]]; then
mkdir "${SECRETS_TEMP_FOLDER}"
fi
PYTHON_PIP_DETECTSECRETS_INSTALLED=1
# Sets the entropy limit for high entropy base64 strings from
# environment variable BASE64_LIMIT.
# Sets the entropy limit for high entropy base64 strings from environment variable BASE64_LIMIT.
# Value must be between 0.0 and 8.0; default is 4.5.
# Sets the entropy limit for high entropy hex strings from
# environment variable HEX_LIMIT.
# Sets the entropy limit for high entropy hex strings from environment variable HEX_LIMIT.
# Value must be between 0.0 and 8.0; default is 3.0.
case $1 in
file )
# this is to scan a file
detect-secrets scan --hex-limit ${HEX_LIMIT:-3.0} --base64-limit ${BASE64_LIMIT:-4.5} $2 | \
jq -r '.results[]|.[] | [.line_number, .type]|@csv' | wc -l
#jq -r '.results[] | .[] | "\(.line_number)\t\(.type)"'
detect-secrets scan --hex-limit "${HEX_LIMIT:-3.0}" --base64-limit "${BASE64_LIMIT:-4.5}" "${2}" \
| jq -r '.results[]|.[] | [.line_number, .type]|@csv' \
| wc -l
# jq -r '.results[] | .[] | "\(.line_number)\t\(.type)"'
# this command must return values in two columns:
# line in file and type of secrets found
;;
string )
# this is to scan a given string
detect-secrets scan --hex-limit ${HEX_LIMIT:-3.0} --base64-limit ${BASE64_LIMIT:-4.5} --string $2 | \
grep True| wc -l
detect-secrets scan --hex-limit "${HEX_LIMIT:-3.0}" --base64-limit "${BASE64_LIMIT:-4.5}" --string "${2}" \
| grep True | wc -l
;;
folder )
# this is to scan a given folder with all lambda files
detect-secrets scan --hex-limit ${HEX_LIMIT:-3.0} --base64-limit ${BASE64_LIMIT:-4.5} --all-files $2 | \
jq -r '.results[]|.[] | [.line_number, .type]|@csv' | wc -l
detect-secrets scan --hex-limit "${HEX_LIMIT:-3.0}" --base64-limit "${BASE64_LIMIT:-4.5}" --all-files "${2}" \
| jq -r '.results[]|.[] | [.line_number, .type]|@csv' | wc -l
;;
esac
fi
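# Usage sketch (an assumption): the entropy limits are plain environment
# variables, so a stricter secrets scan needs no code change; run whichever
# checks call secretsDetector with the limits set inline:
BASE64_LIMIT=5.0 HEX_LIMIT=3.5 ./prowler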


@@ -16,7 +16,7 @@
checkSecurityHubCompatibility(){
local regx
if [[ "${MODE}" != "json-asff" ]]; then
if [[ ! "${MODE}" =~ "json-asff" ]]; then
echo -e "\n$RED ERROR!$NORMAL Output can only be sent to Security Hub when the output mode is json-asff, i.e. -M json-asff -S\n"
EXITCODE=1
exit $EXITCODE
@@ -71,7 +71,7 @@ resolveSecurityHubPreviousFails(){
if [[ $SECURITY_HUB_PREVIOUS_FINDINGS != "[]" ]]; then
FINDINGS_COUNT=$(echo $SECURITY_HUB_PREVIOUS_FINDINGS | jq '. | length')
for i in $(seq 0 50 $FINDINGS_COUNT);
do
do
BATCH_FINDINGS=$(echo $SECURITY_HUB_PREVIOUS_FINDINGS | jq -c '.['"$i:$i+50"']')
BATCH_FINDINGS_COUNT=$(echo $BATCH_FINDINGS | jq '. | length')
if [ "$BATCH_FINDINGS_COUNT" -gt 0 ]; then
@@ -93,11 +93,10 @@ sendToSecurityHub(){
local finding_id=$(echo ${findings} | jq -r .Id )
SECURITYHUB_NEW_FINDINGS_IDS+=( "$finding_id" )
BATCH_IMPORT_RESULT=$($AWSCLI securityhub --region "$region" $PROFILE_OPT batch-import-findings --findings "${findings}")
BATCH_IMPORT_RESULT=$($AWSCLI securityhub --region "$region" $PROFILE_OPT batch-import-findings --findings "${findings}" 2>&1)
# Check for success if imported
if [[ -z "${BATCH_IMPORT_RESULT}" ]] || ! jq -e '.SuccessCount == 1' <<< "${BATCH_IMPORT_RESULT}" > /dev/null 2>&1; then
echo -e "\n$RED ERROR!$NORMAL Failed to send check output to AWS Security Hub\n"
echo "${BAD} ERROR! Failed to send check output to AWS Security Hub.${NORMAL}"
fi
}

include/show_titles (new file)

@@ -0,0 +1,60 @@
#!/usr/bin/env bash
# Copyright 2018 Toni de la Fuente
# Prowler is a tool that provides automated auditing and hardening guidance for an
# AWS account. It is based on AWS-CLI commands. It follows some guidelines
# present in the CIS Amazon Web Services Foundations Benchmark at:
# https://d0.awsstatic.com/whitepapers/compliance/AWS_CIS_Foundations_Benchmark.pdf
# Contact the author at https://blyx.com/contact
# and open issues or ask questions at https://github.com/prowler-cloud/prowler
# Code is licensed as Apache License 2.0 as specified in
# each file. You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Show the title of a group, by its numerical id
show_group_title() {
textTitle "${GROUP_NUMBER[$1]}" "${GROUP_TITLE[$1]}"
}
# Show all group titles
show_all_group_titles() {
local group_index
for group_index in "${!GROUP_TITLE[@]}"; do
show_group_title "$group_index"
done
exit 0
}
# Show the titles of either all checks or only those in the specified group
show_all_check_titles() {
for CHECK_ID in ${TOTAL_CHECKS[*]}
do
show_check_title "${CHECK_ID}"
done
exit 0
}
# Function to show the title of the check, and optionally which group(s) it belongs to
# indirect expansion is used instead of arrays to keep bash 3 (macOS) and bash 4 (Linux) compatibility
show_check_title() {
local check_id=CHECK_ID_$1
local check_title=CHECK_TITLE_$1
local check_scored=CHECK_SCORED_$1
local check_cis_level=CHECK_CIS_LEVEL_$1
local check_asff_compliance_type=CHECK_ASFF_COMPLIANCE_TYPE_$1
local check_severity=CHECK_SEVERITY_$1
local check_servicename=CHECK_SERVICENAME_$1
local group_ids
local group_index
# Show ASFF_COMPLIANCE_TYPE when the group in use is ens; this is used to show the ENS compliance control ID and can be used for other compliance groups as well.
if [[ ${GROUP_ID_READ} == "ens" ]];then
textTitle "${!check_id}" "${!check_title}" "${!check_scored}" "${!check_cis_level}" "$group_ids" "(${!check_asff_compliance_type})"
else
textTitle "${!check_id}" "${!check_title}" "${!check_servicename}" "${!check_severity}" "$group_ids" "${!check_cis_level}"
fi
}
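# Self-contained illustration of the ${!var} indirect expansion used above
# (the check id and title are assumed sample values):
CHECK_TITLE_check11="Avoid the use of the root account"
check_title=CHECK_TITLE_check11
echo "${!check_title}"   # prints the value of the variable named by check_title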
