mirror of
https://github.com/prowler-cloud/prowler.git
synced 2026-01-25 02:08:11 +00:00
Add the ability to generate JUnit XML reports with a -J flag
If the -J flag is passed, generate JUnit XML reports for each check, in line with how Java tools generate JUnit reports:

- Check section numbers equate to 'root packages', checks are second-level packages, each check equates to a testsuite (mirroring Java, where each test class is a testsuite) and each pass/fail of a check equates to a testcase.
- Time the execution of each check and include it in the report.
- Include properties (Prowler version, check level, etc.) in line with standard JUnit files.
- XML escape all strings for safety.
- Detect if a user has GNU coreutils installed on Mac OS X, but not as their default, and switch to gdate for date commands if so, as it has more features, including printing dates in milliseconds.
- Add prowler-output, junit-reports and VSCode files to .gitignore.
- Update README to include JUnit info, address markdownlint warnings.
- Remove unused arguments to jq in generateJsonAsffOutput.

Fixes #537
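For illustration only, a generated report for one check might look roughly like this (the element layout and property names follow the new include/junit_integration script below; the check id, titles and values are made-up examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="1.1" timestamp="2021-01-01T00:00:00Z">
  <properties>
    <property name="prowler.version" value="2.3.0"/>
    <property name="aws.profile" value="default"/>
    <property name="aws.accountNumber" value="123456789012"/>
    <property name="check.id" value="1.1"/>
    <property name="check.scored" value="Scored"/>
    <property name="check.level" value="Level 1"/>
    <property name="check.asff.type" value="Software and Configuration Checks"/>
    <property name="check.asff.resourceType" value="AwsAccount"/>
  </properties>
  <testcase name="[check11] Example check title (1)" classname="1.1" time="0.417">
    <system-out>Example PASS message for the first resource</system-out>
  </testcase>
  <testcase name="[check11] Example check title (2)" classname="1.1" time="0.082">
    <failure message="Example FAIL message for the second resource"/>
  </testcase>
</testsuite>
```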
.gitignore (vendored), 9 changed lines
@@ -19,3 +19,12 @@ tags
# MacOs DS_Store
*.DS_Store

# Prowler output
prowler-output-*

# JUnit Reports
junit-reports/

# VSCode files
.vscode/
README.md, 75 changed lines
@@ -45,7 +45,6 @@ Read more about [CIS Amazon Web Services Foundations Benchmark v1.2.0 - 05-23-20
- HIPAA [hipaa] Read more [here](#hipaa-checks)
- Trust Boundaries [trustboundaries] Read more [here](#trustboundaries-checks)

With Prowler you can:

- get a colorful or monochrome report
@@ -68,6 +67,7 @@ This script has been written in bash using AWS-CLI and it works in Linux and OSX
AWS-CLI can also be installed using "brew", "apt", "yum" or manually from <https://aws.amazon.com/cli/>, but `ansi2html` and `detect-secrets` have to be installed using `pip`. You will need to install `jq` to get more accuracy in some checks.

- Make sure jq is installed (example below with "apt" but use a valid package manager for your OS):

```sh
sudo apt install jq
```
@@ -84,7 +84,9 @@ This script has been written in bash using AWS-CLI and it works in Linux and OSX
```sh
aws configure
```

or

```sh
export AWS_ACCESS_KEY_ID="ASXXXXXXX"
export AWS_SECRET_ACCESS_KEY="XXXXXXXXX"
@@ -110,7 +112,7 @@ This script has been written in bash using AWS-CLI and it works in Linux and OSX
Use `-l` to list all available checks and groups of checks (sections)

If you want to avoid installing dependences run it using Docker:
If you want to avoid installing dependencies run it using Docker:

```sh
docker run -ti --rm --name prowler --env AWS_ACCESS_KEY_ID --env AWS_SECRET_ACCESS_KEY --env AWS_SESSION_TOKEN toniblyx/prowler:latest
@@ -127,16 +129,21 @@ This script has been written in bash using AWS-CLI and it works in Linux and OSX
```sh
./prowler -c check310
```

With Docker:

```sh
docker run -ti --rm --name prowler --env AWS_ACCESS_KEY_ID --env AWS_SECRET_ACCESS_KEY --env AWS_SESSION_TOKEN toniblyx/prowler:latest "-c check310"
```

or multiple checks separated by comma:

```sh
./prowler -c check310,check722
```

or all checks but some of them:

```sh
./prowler -E check42,check43
```
@@ -152,7 +159,9 @@ This script has been written in bash using AWS-CLI and it works in Linux and OSX
```sh
./prowler -g group1 # for iam related checks
```

or exclude some checks in the group:

```sh
./prowler -g group4 -E check42,check43
```
@@ -166,11 +175,15 @@ This script has been written in bash using AWS-CLI and it works in Linux and OSX
```sh
./prowler -M csv
```

or with multiple formats at the same time:

```sh
./prowler -M csv,json,json-asff
```

or just a group of checks in multiple formats:

```sh
./prowler -g gdpr -M csv,json,json-asff
```
@@ -190,6 +203,12 @@ This script has been written in bash using AWS-CLI and it works in Linux and OSX
./prowler | ansi2html -la > report.html
```

To generate JUnit report files add `-J`. This can be combined with any output format. Files are written to a directory named `junit-reports` inside the prowler root directory:

```sh
./prowler -J
```
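For example (illustrative), to produce JUnit reports alongside a CSV report while only logging failures:

```sh
./prowler -J -M csv -q
```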
>Note about output formats to use with `-M`: "text" is the default one with colors, "mono" is like the default one but monochrome, "csv" is comma separated values, "json" is plain basic json (without commas between lines) and "json-asff" is also json but in the Amazon Security Finding Format, which you can ship to Security Hub using `-S`.

or save your report in an S3 bucket (this only works for text or mono; for csv, json or json-asff it has to be copied afterwards):
@@ -213,7 +232,7 @@ This script has been written in bash using AWS-CLI and it works in Linux and OSX
1. For help use:

```
```sh
./prowler -h

USAGE:
@@ -243,6 +262,7 @@ This script has been written in bash using AWS-CLI and it works in Linux and OSX
-V show version number & exit
-s show scoring report
-S send check output to AWS Security Hub - only valid when the output mode is json-asff (i.e. "-M json-asff -S")
-J generate JUnit reports, readable by Jenkins or other CI tools. Files are written to ./junit-reports
-x specify external directory with custom checks (i.e. /my/own/checks, files must start by check)
-q suppress info messages and passing test output
-A account id for the account where to assume a role, requires -R and -T
@@ -261,11 +281,11 @@ This script has been written in bash using AWS-CLI and it works in Linux and OSX
Prowler uses the AWS CLI underneath so it uses the same authentication methods. However, there are a few ways to run Prowler against multiple accounts using the IAM Assume Role feature, depending on each use case. You can just set up your custom profile inside `~/.aws/config` with all the needed information about the role to assume, then call it with `./prowler -p your-custom-profile`. Additionally you can use `-A 123456789012` and `-R RemoteRoleToAssume` and Prowler will get those temporary credentials using `aws sts assume-role`, set them up as environment variables and run against that given account.

```
```sh
./prowler -A 123456789012 -R ProwlerRole
```

```
```sh
./prowler -A 123456789012 -R ProwlerRole -I 123456
```
@@ -275,11 +295,11 @@ Prowler uses the AWS CLI underneath so it uses the same authentication methods.
For example, if you want to get only the fails in CSV format from all checks regarding RDS without banner from the AWS Account 123456789012 assuming the role RemoteRoleToAssume and set a fixed session duration of 1h:

```
```sh
./prowler -A 123456789012 -R RemoteRoleToAssume -T 3600 -b -M csv -q -g rds
```

```
```sh
./prowler -A 123456789012 -R RemoteRoleToAssume -T 3600 -I 123456 -b -M csv -q -g rds
```
@@ -304,17 +324,18 @@ Flag `-x /my/own/checks` will include any check in that particular directory. To
In order to remove noise and get only FAIL findings there is a `-q` flag that makes Prowler show and log only FAILs. It can be combined with any other option.

```
```sh
./prowler -q -M csv -b
```

## Security Hub integration

Since version v2.3, Prowler supports natively sending findings to [AWS Security Hub](https://aws.amazon.com/security-hub). This integration allows Prowler to import its findings to AWS Security Hub. With Security Hub, you now have a single place that aggregates, organizes, and prioritizes your security alerts, or findings, from multiple AWS services, such as Amazon GuardDuty, Amazon Inspector, Amazon Macie, AWS Identity and Access Management (IAM) Access Analyzer, and AWS Firewall Manager, as well as from AWS Partner solutions and now from Prowler. It is as simple as running the commanbd below:
Since version v2.3, Prowler supports natively sending findings to [AWS Security Hub](https://aws.amazon.com/security-hub). This integration allows Prowler to import its findings to AWS Security Hub. With Security Hub, you now have a single place that aggregates, organizes, and prioritizes your security alerts, or findings, from multiple AWS services, such as Amazon GuardDuty, Amazon Inspector, Amazon Macie, AWS Identity and Access Management (IAM) Access Analyzer, and AWS Firewall Manager, as well as from AWS Partner solutions and now from Prowler. It is as simple as running the command below:

```sh
./prowler -M json-asff -S
```

```
./prowler -M json-asff -S
```

There are two requirements:

1. Security Hub must be enabled for the active region from where you are calling Prowler (if no region is used with `-r` then `us-east-1` is used). It can be enabled by calling `aws securityhub enable-security-hub`
@@ -323,7 +344,6 @@ There are two requirements:
>Note: to have updated findings in Security Hub you have to run Prowler periodically, e.g. once a day or every few hours.

## How to fix every FAIL

Check your report and fix the issues following all specific guidelines per check in <https://d0.awsstatic.com/whitepapers/compliance/AWS_CIS_Foundations_Benchmark.pdf>
@@ -344,7 +364,7 @@ Check your report and fix the issues following all specific guidelines per check
If you are using an STS token for AWS-CLI and your session has expired, you will probably get this error:

```
```sh
A client error (ExpiredToken) occurred when calling the GenerateCredentialReport operation: The security token included in the request is expired
```
@@ -354,16 +374,19 @@ To fix it, please renew your token by authenticating again to the AWS API, see n
To run Prowler using a profile that requires MFA you just need to get the session token beforehand. Just make sure you use this command:

```
```sh
aws --profile <YOUR_AWS_PROFILE> sts get-session-token --duration 129600 --serial-number <ARN_OF_MFA> --token-code <MFA_TOKEN_CODE> --output text
```
Once you get your token you can export it as an environment variable:
```

Once you get your token you can export it as an environment variable:

```sh
export AWS_PROFILE=YOUR_AWS_PROFILE
export AWS_SESSION_TOKEN=YOUR_NEW_TOKEN
export AWS_SECRET_ACCESS_KEY=YOUR_SECRET
export AWS_ACCESS_KEY_ID=YOUR_KEY
```

or manually set up your `~/.aws/credentials` file properly.

There are some helpful tools to save time in this process, like [aws-mfa-script](https://github.com/asagage/aws-mfa-script) or [aws-cli-mfa](https://github.com/sweharris/aws-cli-mfa).
@@ -383,11 +406,13 @@ There are some helpfull tools to save time in this process like [aws-mfa-script]
[Prowler-Additions-Policy](iam/prowler-additions-policy.json)

Some new and specific checks require Prowler to inherit more permissions than SecurityAudit and ViewOnlyAccess to work properly. In addition to the AWS managed policies, "SecurityAudit" and "ViewOnlyAccess", the user/role you use for checks may need to be granted a custom policy with a few more read-only permissions (mostly to support additional services). Here is an example policy with the additional rights, "Prowler-Additions-Policy" (see the bootstrap script below to set it up):

- [iam/prowler-additions-policy.json](iam/prowler-additions-policy.json)

[Prowler-Security-Hub Policy](iam/prowler-security-hub.json)

Allows Prowler to import its findings to [AWS Security Hub](https://aws.amazon.com/security-hub). More information in [Security Hub integration](#security-hub-integration):

- [iam/prowler-security-hub.json](iam/prowler-security-hub.json)

### Bootstrap Script
@@ -418,7 +443,7 @@ Some of these checks look for publicly facing resources may not actually be full
To list all existing checks please run the command below:

```
```sh
./prowler -l
```
@@ -474,6 +499,7 @@ With this group of checks, Prowler shows results of controls related to the "Sec
More information on the original PR is [here](https://github.com/toniblyx/prowler/issues/227).

### Note on Business Associate Addendum's (BAA)

Under the HIPAA regulations, cloud service providers (CSPs) such as AWS are considered business associates. The Business Associate Addendum (BAA) is an AWS contract that is required under HIPAA rules to ensure that AWS appropriately safeguards protected health information (PHI). The BAA also serves to clarify and limit, as appropriate, the permissible uses and disclosures of PHI by AWS, based on the relationship between AWS and our customers, and the activities or services being performed by AWS. Customers may use any AWS service in an account designated as a HIPAA account, but they should only process, store, and transmit protected health information (PHI) in the HIPAA-eligible services defined in the Business Associate Addendum (BAA). For the latest list of HIPAA-eligible AWS services, see [HIPAA Eligible Services Reference](https://aws.amazon.com/compliance/hipaa-eligible-services-reference/).

More information on AWS & HIPAA can be found [here](https://aws.amazon.com/compliance/hipaa-compliance/)
@@ -489,7 +515,9 @@ The `hipaa` group of checks uses existing and extra checks. To get a HIPAA repor
```

## Trust Boundaries Checks

### Definition and Terms

The term "trust boundary" originates from the threat modelling process; its most popular contributor, Adam Shostack, author of "Threat Modeling: Designing for Security", defines it as follows ([reference](https://adam.shostack.org/uncover.html)):

> Trust boundaries are perhaps the most subjective of all: these represent the border between trusted and untrusted elements. Trust is complex. You might trust your mechanic with your car, your dentist with your teeth, and your banker with your money, but you probably don't trust your dentist to change your spark plugs.
@@ -499,16 +527,22 @@ AWS is made to be flexible for service links within and between different AWS ac
This group of checks helps analyse a particular AWS account (the subject) for existing links to other AWS accounts across various AWS services, in order to identify untrusted links.

### Run

To give it a quick shot just call:

```sh
./prowler -g trustboundaries
```

### Scenarios

Currently this check group supports two different scenarios:

1. Single account environment: no action required, the configuration happens automatically for you.
2. Multi account environment: in case your environment has multiple trusted and known AWS accounts you may want to append them manually to [groups/group16_trustboundaries](groups/group16_trustboundaries) as a space separated list in the `GROUP_TRUSTBOUNDARIES_TRUSTED_ACCOUNT_IDS` variable (see the example below), then just run prowler.
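For example (the account ids below are placeholders, and the exact quoting is only a sketch), the variable inside [groups/group16_trustboundaries](groups/group16_trustboundaries) would look roughly like this:

```sh
GROUP_TRUSTBOUNDARIES_TRUSTED_ACCOUNT_IDS="123456789012 210987654321"
```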
### Coverage

Current coverage of Amazon Web Services (AWS), taken from [here](https://docs.aws.amazon.com/whitepapers/latest/aws-overview/introduction.html):

| Topic | Service | Trust Boundary |
|---------------------------------|------------|---------------------------------------------------------------------------|
@@ -518,6 +552,7 @@ Current coverage of Amazon Web Service (AWS) taken from [here](https://docs.aws.
All ideas or recommendations to extend this group are very welcome [here](https://github.com/toniblyx/prowler/issues/new/choose).

### Detailed Explanation of the Concept

The diagrams depict two common scenarios, single account and multi account environments.
Every circle represents one AWS account.
The dashed line represents the trust boundary that separates trusted and untrusted AWS accounts.
include/junit_integration, 89 lines (new file)
@@ -0,0 +1,89 @@
#!/usr/bin/env bash

# Prowler - the handy cloud security tool (copyright 2018) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.

# Generates JUnit XML reports which can be read by Jenkins or other CI tools

JUNIT_OUTPUT_DIRECTORY="junit-reports"

xml_escape() {
  sed 's/&/\&amp;/g; s/</\&lt;/g; s/>/\&gt;/g; s/"/\&quot;/g; s/'"'"'/\&apos;/g' <<< "$1"
}
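# Example (illustrative): xml_escape 'Users & "Admins" <all>'
# would print: Users &amp; &quot;Admins&quot; &lt;all&gt;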
prepare_junit_output() {
  # Remove any JUnit output from previous runs
  rm -rf "$JUNIT_OUTPUT_DIRECTORY"
  mkdir "$JUNIT_OUTPUT_DIRECTORY"
  echo ""
  echo "$NOTICE Writing JUnit XML reports to $PROWLER_DIR/$JUNIT_OUTPUT_DIRECTORY $NORMAL"
}

prepare_junit_check_output() {
  # JUnit test cases must be named uniquely, but each Prowler check can output many times due to multiple resources,
  # therefore append an index value to the test case name to provide uniqueness, reset it to 1 before starting this check
  JUNIT_CHECK_INDEX=1
  # To match JUnit behaviour in Java, and ensure that an aborted execution does not leave a partially written and therefore invalid XML file,
  # output a JUnit XML file per check
  JUNIT_OUTPUT_FILE="$JUNIT_OUTPUT_DIRECTORY/$1.xml"
  printf '%s\n' \
    "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" \
    "<testsuite name=\"$(xml_escape "$(get_junit_classname)")\" timestamp=\"$(get_iso8601_timestamp)\">" \
    " <properties>" \
    " <property name=\"prowler.version\" value=\"$(xml_escape "$PROWLER_VERSION")\"/>" \
    " <property name=\"aws.profile\" value=\"$(xml_escape "$PROFILE")\"/>" \
    " <property name=\"aws.accountNumber\" value=\"$(xml_escape "$ACCOUNT_NUM")\"/>" \
    " <property name=\"check.id\" value=\"$(xml_escape "$TITLE_ID")\"/>" \
    " <property name=\"check.scored\" value=\"$(xml_escape "$ITEM_SCORED")\"/>" \
    " <property name=\"check.level\" value=\"$(xml_escape "$ITEM_LEVEL")\"/>" \
    " <property name=\"check.asff.type\" value=\"$(xml_escape "$ASFF_TYPE")\"/>" \
    " <property name=\"check.asff.resourceType\" value=\"$(xml_escape "$ASFF_RESOURCE_TYPE")\"/>" \
    " </properties>" \
    > "$JUNIT_OUTPUT_FILE"
  JUNIT_CHECK_START_TIME=$(get_time_in_milliseconds)
}

finalise_junit_check_output() {
  echo '</testsuite>' >> "$JUNIT_OUTPUT_FILE"
}

output_junit_success() {
  output_junit_test_case "$1" "<system-out>$(xml_escape "$1")</system-out>"
}

output_junit_info() {
  # Nothing to output for JUnit for this level of message, but reset the check timer for timing the next check
  JUNIT_CHECK_START_TIME=$(get_time_in_milliseconds)
}

output_junit_failure() {
  output_junit_test_case "$1" "<failure message=\"$(xml_escape "$1")\"/>"
}

get_junit_classname() {
  # <section>.<check_id> naturally follows a Java package structure, so it is suitable as a package name
  echo "$TITLE_ID"
}

output_junit_test_case() {
  local time_now
  local test_case_duration
  time_now=$(get_time_in_milliseconds)
  # JUnit test case time values are in seconds, so divide by 1000 using e-3 to convert from milliseconds without losing accuracy due to non-floating point arithmetic
  test_case_duration=$(printf "%.3f" "$(("$time_now" - "$JUNIT_CHECK_START_TIME"))e-3")
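  # e.g. (illustrative) a check that took 1617 ms yields a time of "1.617"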
  printf '%s\n' \
    " <testcase name=\"$(xml_escape "$TITLE_TEXT") ($JUNIT_CHECK_INDEX)\" classname=\"$(xml_escape "$(get_junit_classname)")\" time=\"$test_case_duration\">" \
    " $2" \
    " </testcase>" >> "$JUNIT_OUTPUT_FILE"
  # Reset the check timer for timing the next check
  JUNIT_CHECK_START_TIME=$(get_time_in_milliseconds)
  ((JUNIT_CHECK_INDEX+=1))
}
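For orientation, a rough sketch (not literal code) of how the main prowler script drives these functions when -J is set, based on the hunks further below; check11 is only an example check name:

```sh
prepare_junit_output                  # once per run: recreate ./junit-reports
prepare_junit_check_output check11    # per check: open check11.xml, write <testsuite> and <properties>
check11                               # run the check; textPass/textFail add <testcase> entries via output_junit_success/output_junit_failure
finalise_junit_check_output check11   # close the </testsuite> element
```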
@@ -11,17 +11,19 @@
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.

DATE_CMD="date"

gnu_how_older_from_today() {
  DATE_TO_COMPARE=$1
  TODAY_IN_DAYS=$(date -d "$(date +%Y-%m-%d)" +%s)
  DATE_FROM_IN_DAYS=$(date -d $DATE_TO_COMPARE +%s)
  TODAY_IN_DAYS=$("$DATE_CMD" -d "$("$DATE_CMD" +%Y-%m-%d)" +%s)
  DATE_FROM_IN_DAYS=$("$DATE_CMD" -d $DATE_TO_COMPARE +%s)
  DAYS_SINCE=$((($TODAY_IN_DAYS - $DATE_FROM_IN_DAYS )/60/60/24))
  echo $DAYS_SINCE
}
bsd_how_older_from_today() {
  DATE_TO_COMPARE=$1
  TODAY_IN_DAYS=$(date +%s)
  DATE_FROM_IN_DAYS=$(date -jf %Y-%m-%d $DATE_TO_COMPARE +%s)
  TODAY_IN_DAYS=$("$DATE_CMD" +%s)
  DATE_FROM_IN_DAYS=$("$DATE_CMD" -jf %Y-%m-%d $DATE_TO_COMPARE +%s)
  DAYS_SINCE=$((($TODAY_IN_DAYS - $DATE_FROM_IN_DAYS )/60/60/24))
  echo $DAYS_SINCE
}
@@ -31,13 +33,13 @@ bsd_how_older_from_today() {
gnu_timestamp_to_date() {
  # remove fractions of a second
  TIMESTAMP_TO_CONVERT=$(echo $1 | cut -f1 -d".")
  OUTPUT_DATE=$(date -d @$TIMESTAMP_TO_CONVERT +'%Y-%m-%d')
  OUTPUT_DATE=$("$DATE_CMD" -d @$TIMESTAMP_TO_CONVERT +'%Y-%m-%d')
  echo $OUTPUT_DATE
}
bsd_timestamp_to_date() {
  # remove fractions of a second
  TIMESTAMP_TO_CONVERT=$(echo $1 | cut -f1 -d".")
  OUTPUT_DATE=$(date -r $TIMESTAMP_TO_CONVERT +'%Y-%m-%d')
  OUTPUT_DATE=$("$DATE_CMD" -r $TIMESTAMP_TO_CONVERT +'%Y-%m-%d')
  echo $OUTPUT_DATE
}
@@ -50,15 +52,15 @@ bsd_decode_report() {
gnu_how_many_days_from_today() {
  DATE_TO_COMPARE=$1
  TODAY_IN_DAYS=$(date -d "$(date +%Y-%m-%d)" +%s)
  DATE_IN_DAYS=$(date -d $DATE_TO_COMPARE +%s)
  TODAY_IN_DAYS=$("$DATE_CMD" -d "$("$DATE_CMD" +%Y-%m-%d)" +%s)
  DATE_IN_DAYS=$("$DATE_CMD" -d $DATE_TO_COMPARE +%s)
  DAYS_TO=$((( $DATE_IN_DAYS - $TODAY_IN_DAYS )/60/60/24))
  echo $DAYS_TO
}
bsd_how_many_days_from_today() {
  DATE_TO_COMPARE=$1
  TODAY_IN_DAYS=$(date +%s)
  DATE_IN_DAYS=$(date -jf %Y-%m-%d $DATE_TO_COMPARE +%s)
  TODAY_IN_DAYS=$("$DATE_CMD" +%s)
  DATE_IN_DAYS=$("$DATE_CMD" -jf %Y-%m-%d $DATE_TO_COMPARE +%s)
  DAYS_TO=$((( $DATE_IN_DAYS - $TODAY_IN_DAYS )/60/60/24))
  echo $DAYS_TO
}
@@ -66,17 +68,32 @@ bsd_how_many_days_from_today() {
gnu_get_date_previous_than_months() {
  MONTHS_TO_COMPARE=$1
  MONTHS_TO_COMPARE_IN_SECONDS=$(( 60 * 60 * 24 * 31 * $MONTHS_TO_COMPARE ))
  CURRENTSECS=$(date +%s)
  CURRENTSECS=$("$DATE_CMD" +%s)
  STARTDATEINSECS=$(( $CURRENTSECS - $MONTHS_TO_COMPARE_IN_SECONDS ))
  DATE_BEFORE_MONTHS_TO_COMPARE=$(date -d @$STARTDATEINSECS '+%Y-%m-%d')
  DATE_BEFORE_MONTHS_TO_COMPARE=$("$DATE_CMD" -d @$STARTDATEINSECS '+%Y-%m-%d')
  echo $DATE_BEFORE_MONTHS_TO_COMPARE
}
bsd_get_date_previous_than_months() {
  MONTHS_TO_COMPARE=$1
  DATE_BEFORE_MONTHS_TO_COMPARE=$(date -v -$(echo $MONTHS_TO_COMPARE)m '+%Y-%m-%d')
  DATE_BEFORE_MONTHS_TO_COMPARE=$("$DATE_CMD" -v -$(echo $MONTHS_TO_COMPARE)m '+%Y-%m-%d')
  echo $DATE_BEFORE_MONTHS_TO_COMPARE
}

gnu_get_time_in_milliseconds() {
  "$DATE_CMD" +%s%3N
}
bsd_get_time_in_milliseconds() {
  # BSD date does not support outputting milliseconds, so pad with zeros
  "$DATE_CMD" +%s000
}

gnu_get_iso8601_timestamp() {
  "$DATE_CMD" -u +"%Y-%m-%dT%H:%M:%SZ"
}
bsd_get_iso8601_timestamp() {
  "$DATE_CMD" -u +"%Y-%m-%dT%H:%M:%SZ"
}

gnu_test_tcp_connectivity() {
  HOST=$1
  PORT=$2
@@ -114,16 +131,28 @@ if [ "$OSTYPE" == "linux-gnu" ] || [ "$OSTYPE" == "linux-musl" ]; then
  get_date_previous_than_months() {
    gnu_get_date_previous_than_months "$1"
  }
  get_time_in_milliseconds() {
    gnu_get_time_in_milliseconds
  }
  get_iso8601_timestamp() {
    gnu_get_iso8601_timestamp
  }
  test_tcp_connectivity() {
    gnu_test_tcp_connectivity "$1" "$2" "$3"
  }
elif [[ "$OSTYPE" == "darwin"* ]]; then
  # BSD/OSX commands compatibility
  TEMP_REPORT_FILE=$(mktemp -t prowler.cred_report-XXXXXX)
  # It is possible that the user has installed GNU coreutils, replacing the default Mac OS X BSD tools with
  # GNU coreutils equivalents. Only GNU date allows --version as a valid argument, so use the validity of this argument
  # It is possible that the user has installed GNU coreutils on OS X. By default, this will make GNU commands
  # available with a 'g' prefix, e.g. 'gdate'. Test if this is present, and use it if so, as it supports more features.
  # The user also may have replaced the default Mac OS X BSD tools with the GNU coreutils equivalents.
  # Only GNU date allows --version as a valid argument, so use the validity of this argument
  # as a means to detect that coreutils is installed and is overriding the default tools
  if date --version >/dev/null 2>&1 ; then
  GDATE=$(which gdate)
  if [ -n "${GDATE}" ]; then
    DATE_CMD="gdate"
  fi
  if "$DATE_CMD" --version >/dev/null 2>&1 ; then
    how_older_from_today() {
      gnu_how_older_from_today "$1"
    }
@@ -139,6 +168,12 @@ elif [[ "$OSTYPE" == "darwin"* ]]; then
    get_date_previous_than_months() {
      gnu_get_date_previous_than_months "$1"
    }
    get_time_in_milliseconds() {
      gnu_get_time_in_milliseconds
    }
    get_iso8601_timestamp() {
      gnu_get_iso8601_timestamp
    }
  else
    how_older_from_today() {
      bsd_how_older_from_today "$1"
@@ -155,6 +190,12 @@ elif [[ "$OSTYPE" == "darwin"* ]]; then
    get_date_previous_than_months() {
      bsd_get_date_previous_than_months "$1"
    }
    get_time_in_milliseconds() {
      bsd_get_time_in_milliseconds
    }
    get_iso8601_timestamp() {
      bsd_get_iso8601_timestamp
    }
  fi
  test_tcp_connectivity() {
    bsd_test_tcp_connectivity "$1" "$2" "$3"
@@ -177,6 +218,12 @@ elif [[ "$OSTYPE" == "cygwin" ]]; then
  get_date_previous_than_months() {
    gnu_get_date_previous_than_months "$1"
  }
  get_time_in_milliseconds() {
    gnu_get_time_in_milliseconds
  }
  get_iso8601_timestamp() {
    gnu_get_iso8601_timestamp
  }
  test_tcp_connectivity() {
    gnu_test_tcp_connectivity "$1" "$2" "$3"
  }
@@ -27,6 +27,9 @@ textPass(){
  fi

  PASS_COUNTER=$((PASS_COUNTER+1))
  if [[ "${GENERATE_JUNIT}" -eq 1 ]]; then
    output_junit_success "$1"
  fi
  if [[ "${MODES[@]}" =~ "csv" || "${MODES[@]}" =~ "json" || "${MODES[@]}" =~ "json-asff" ]]; then
    if [[ $2 ]]; then
      REPREGION=$2
@@ -56,6 +59,9 @@ textInfo(){
    return
  fi

  if [[ "${GENERATE_JUNIT}" -eq 1 ]]; then
    output_junit_info "$1"
  fi
  if [[ "${MODES[@]}" =~ "csv" || "${MODES[@]}" =~ "json" || "${MODES[@]}" =~ "json-asff" ]]; then
    if [[ $2 ]]; then
      REPREGION=$2
@@ -76,6 +82,9 @@ textInfo(){
textFail(){
  FAIL_COUNTER=$((FAIL_COUNTER+1))
  EXITCODE=3
  if [[ "${GENERATE_JUNIT}" -eq 1 ]]; then
    output_junit_failure "$1"
  fi
  if [[ "${MODES[@]}" =~ "csv" || "${MODES[@]}" =~ "json" || "${MODES[@]}" =~ "json-asff" ]]; then
    if [[ $2 ]]; then
      REPREGION=$2
@@ -156,7 +165,7 @@ generateJsonOutput(){
    --arg ITEM_LEVEL "$ITEM_LEVEL" \
    --arg TITLE_ID "$TITLE_ID" \
    --arg REPREGION "$REPREGION" \
    --arg TIMESTAMP $(date -u +"%Y-%m-%dT%H:%M:%SZ") \
    --arg TIMESTAMP "$(get_iso8601_timestamp)" \
    -n '{
      "Profile": $PROFILE,
      "Account Number": $ACCOUNT_NUM,
@@ -178,20 +187,17 @@ generateJsonAsffOutput(){
  local status=$2
  local severity=$3
  jq -M -c \
    --arg PROFILE "$PROFILE" \
    --arg ACCOUNT_NUM "$ACCOUNT_NUM" \
    --arg TITLE_TEXT "$TITLE_TEXT" \
    --arg MESSAGE "$(echo -e "${message}" | sed -e 's/^[[:space:]]*//')" \
    --arg UNIQUE_ID "$(LC_ALL=C echo -e "${message}" | tr -cs '[:alnum:]._~-\n' '_')" \
    --arg STATUS "$status" \
    --arg SEVERITY "$severity" \
    --arg SCORED "$ITEM_SCORED" \
    --arg ITEM_LEVEL "$ITEM_LEVEL" \
    --arg TITLE_ID "$TITLE_ID" \
    --arg TYPE "$ASFF_TYPE" \
    --arg RESOURCE_TYPE "$ASFF_RESOURCE_TYPE" \
    --arg REPREGION "$REPREGION" \
    --arg TIMESTAMP $(date -u +"%Y-%m-%dT%H:%M:%SZ") \
    --arg TIMESTAMP "$(get_iso8601_timestamp)" \
    --arg PROWLER_VERSION "$PROWLER_VERSION" \
    -n '{
      "SchemaVersion": "2018-10-08",
prowler, 32 changed lines
@@ -45,6 +45,7 @@ SEP=','
KEEPCREDREPORT=0
EXITCODE=0
SEND_TO_SECURITY_HUB=0
GENERATE_JUNIT=0
SCRIPT_START_TIME=$( date -u +"%Y-%m-%dT%H:%M:%S%z" )
TITLE_ID=""
TITLE_TEXT="CALLER ERROR - UNSET TITLE"
@@ -78,21 +79,22 @@ USAGE:
  -V show version number & exit
  -s show scoring report
  -S send check output to AWS Security Hub - only valid when the output mode is json-asff (i.e. "-M json-asff -S")
  -J generate JUnit reports, readable by Jenkins or other CI tools. Files are written to ./junit-reports
  -x specify external directory with custom checks (i.e. /my/own/checks, files must start by "check")
  -q suppress info messages and passing test output
  -A account id for the account where to assume a role, requires -R and -T
     (i.e.: 123456789012)
  -R role name to assume in the account, requires -A and -T
     (i.e.: ProwlerRole)
  -T session durantion given to that role credentials in seconds, default 1h (3600) recommended 12h, requires -R and -T
  -T session duration given to that role credentials in seconds, default 1h (3600) recommended 12h, requires -R and -T
     (i.e.: 43200)
  -I External ID to be used when assuming roles (no mandatory), requires -A and -R.
  -I External ID to be used when assuming roles (not mandatory), requires -A and -R.
  -h this help
"
  exit
}

while getopts ":hlLkqp:r:c:g:f:m:M:E:enbVsSxI:A:R:T:" OPTION; do
while getopts ":hlLkqp:r:c:g:f:m:M:E:enbVsSJxI:A:R:T:" OPTION; do
  case $OPTION in
    h )
      usage
@@ -152,6 +154,9 @@ while getopts ":hlLkqp:r:c:g:f:m:M:E:enbVsSxI:A:R:T:" OPTION; do
    S )
      SEND_TO_SECURITY_HUB=1
      ;;
    J )
      GENERATE_JUNIT=1
      ;;
    x )
      EXTERNAL_CHECKS_PATH=$OPTARG
      ;;
@@ -206,6 +211,7 @@ trap "{ rm -f /tmp/prowler*.policy.*; }" EXIT
. $PROWLER_DIR/include/assume_role
. $PROWLER_DIR/include/connection_tests
. $PROWLER_DIR/include/securityhub_integration
. $PROWLER_DIR/include/junit_integration

# Get a list of all available AWS Regions
REGIONS=$($AWSCLI ec2 describe-regions --query 'Regions[].RegionName' \
@@ -274,7 +280,14 @@ execute_check() {
      fi
    fi
    show_check_title ${alternate_name}
    if [[ "${GENERATE_JUNIT}" -eq 1 ]]; then
      prepare_junit_check_output "$1"
    fi
    # Execute the check
    ${alternate_name}
    if [[ "${GENERATE_JUNIT}" -eq 1 ]]; then
      finalise_junit_check_output "$1"
    fi
  else
    # Check to see if this is a real check
    local check_id_var=CHECK_ID_$1
@@ -287,7 +300,14 @@ execute_check() {
      fi
    fi
    show_check_title $1
    if [[ "${GENERATE_JUNIT}" -eq 1 ]]; then
      prepare_junit_check_output "$1"
    fi
    # Execute the check
    $1
    if [[ "${GENERATE_JUNIT}" -eq 1 ]]; then
      finalise_junit_check_output "$1"
    fi
  else
    textFail "ERROR! Use a valid check name (i.e. check41 or extra71)";
    exit $EXITCODE
@@ -415,7 +435,7 @@ if [[ $PRINTGROUPSONLY == "1" ]]; then
fi

# Check that jq is installed for JSON outputs
if [[ "$MODE" == "json" || "$MODE" == "json-asff" ]]; then
if [[ ${MODES[@]} =~ "json" || ${MODES[@]} =~ "json-asff" ]]; then
  . $PROWLER_DIR/include/jq_detector
fi
@@ -423,6 +443,10 @@ if [[ "$SEND_TO_SECURITY_HUB" -eq 1 ]]; then
  checkSecurityHubCompatibility
fi

if [[ "${GENERATE_JUNIT}" -eq 1 ]]; then
  prepare_junit_output
fi

# Gather account data / test aws cli connectivity
getWhoami