Prowling AWS
Snooping Around
Hearing that an external, independent organization has been commissioned to spend time actively attacking the cloud estate you have been tasked with helping to secure can be a little daunting – unless, of course, you are involved with a project at the seminal greenfield stage and have yet to learn what goes where and how it all fits together. To add to the complexity, if you are using Amazon Web Services (AWS), AWS Organizations can segregate departmental duties and, therefore, security controls between multiple accounts; commonly this might mean the use of 20 or more accounts. Add to these concerns the fact that, if you blink a little too slowly, it's quite possible you will miss a new AWS feature or service that needs to be understood and, once deployed, secured.
Fret not, however, because a few open source tools can help mitigate the pain before an external auditor or penetration tester receives permission to attack your precious cloud infrastructure. In this article, I show you how to install and run the highly sophisticated tool Prowler [1]. With the use of just a handful of its many features, you can test against the industry-consensus benchmarks from the Center for Internet Security (CIS) [2].
What Are You Lookin' At?
When you run Prowler against the overwhelmingly dominant cloud provider AWS, you get the chance to apply an impressive 49 test criteria from the CIS AWS Foundations Benchmark. For some additional context, sections on the AWS Security Blog [3] are worth digging into further.
To bring more to the party, the sophisticated Prowler also stealthily prowls for issues in compliance with the General Data Protection Regulation (GDPR) of the European Union and the Health Insurance Portability and Accountability Act (HIPAA) of the United States. Prowler refers to these additional checks as "extras." Table 1 shows the type and number of checks that Prowler can run, and the right-hand column offers the group name you should use to get Prowler to test against specific sets of checks; a short example of the corresponding command lines follows the table.
Table 1
Checks and Group Names
Description | No./Type of Checks | Group Name
---|---|---
Identity and access management | 22 checks | group1
Logging | 9 checks | group2
Monitoring | 14 checks | group3
Networking | 4 checks | group4
Critical-priority CIS | CIS Level 1 | cislevel1
Critical- and high-priority CIS | CIS Level 2 | cislevel2
Extras | 39 checks | extras
Forensics | See README file [4] | forensics-ready
GDPR | See website [5] | gdpr
HIPAA | See website [6] | hipaa
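To make the right-hand column more concrete: once Prowler is installed (see below), the group name is passed to the script with its -g option. The two invocations here are only a sketch; check the README file [4] for the exact syntax of the version you clone:

$ ./prowler -g cislevel1    # run only the critical-priority CIS checks
$ ./prowler -g group2       # run only the logging checks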
Porch Climbing
To start getting your hands dirty, install Prowler and see what it can do to help improve the visibility of your security issues. To begin, go to the GitHub page [1] held under author Toni de la Fuente's account; he also has a blogging site [7] that offers a number of useful insights into the vast landscape of security tools available to users these days and where to find them. I recommend a visit, whatever your level of experience.
The next step is cloning the repository with the git command [8] (Listing 1). As you can see at the beginning of the command's output, the prowler/ directory will hold the code.
Listing 1
Installing Prowler
$ git clone https://github.com/toniblyx/prowler.git
Cloning into 'prowler'...
remote: Enumerating objects: 50, done.
remote: Counting objects: 100% (50/50), done.
remote: Compressing objects: 100% (41/41), done.
remote: Total 2955 (delta 6), reused 43 (delta 5), pack-reused 2905
Receiving objects: 100% (2955/2955), 971.57 KiB | 915.00 KiB/s, done.
Resolving deltas: 100% (1934/1934), done.
The README file recommends installing the ansi2html and detect-secrets packages with the pip Python package installer; adding awscli to the same command also pulls in the AWS command-line tool, which you will need shortly:
$ pip install awscli ansi2html detect-secrets
If you don't have pip installed, fret not: Use your package manager. For example, on Debian derivatives, use the apt command:
$ apt install python-pip
On Red Hat Enterprise Linux derivatives and other distributions such as openSUSE or Arch Linux, you can find instructions online [9] if you're not sure of the package names.
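To give a rough idea, and with the caveat that these package names are my assumptions and can differ between releases, the equivalents usually look something like this:

$ sudo dnf install python3-pip       # Fedora and recent RHEL derivatives
$ sudo zypper install python3-pip    # openSUSE
$ sudo pacman -S python-pip          # Arch Linux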
Now you're just about set to run Prowler from a local machine perspective. Before continuing, however, the other part of the process is configuring the correct AWS Identity and Access Management (IAM) permissions.
An Access Key and a Secret Key attached to a user are needed from AWS, with the correct permissions made available to the user via a role. Don't worry, though: The permissions don't give away the crown jewels, but the results do reveal any potential holes in your security posture. Therefore, the results need to be stored somewhere with care, as do all access credentials to AWS.
You might call the List/Read/Describe actions "read-only" if you wanted to summarize succinctly the levels of access required by Prowler. You can either use the SecurityAudit policy, which is provided by AWS directly, or the custom set of permissions in Listing 2, which is attached via a role to the user in IAM and which, according to the GitHub README file, additionally opens up DescribeTrustedAdvisorChecks beyond the actions offered by the SecurityAudit policy.
Listing 2
Permissions for IAM Role
{ "Version": "2012-10-17", "Statement": [{ "Action": [ "acm:describecertificate", "acm:listcertificates", "apigateway:get", "autoscaling:describe*", "cloudformation:describestack*", "cloudformation:getstackpolicy", "cloudformation:gettemplate", "cloudformation:liststack*", "cloudfront:get*", "cloudfront:list*", "cloudtrail:describetrails", "cloudtrail:geteventselectors", "cloudtrail:gettrailstatus", "cloudtrail:listtags", "cloudwatch:describe*", "codecommit:batchgetrepositories", "codecommit:getbranch", "codecommit:getobjectidentifier", "codecommit:getrepository", "codecommit:list*", "codedeploy:batch*", "codedeploy:get*", "codedeploy:list*", "config:deliver*", "config:describe*", "config:get*", "datapipeline:describeobjects", "datapipeline:describepipelines", "datapipeline:evaluateexpression", "datapipeline:getpipelinedefinition", "datapipeline:listpipelines", "datapipeline:queryobjects", "datapipeline:validatepipelinedefinition", "directconnect:describe*", "dynamodb:listtables", "ec2:describe*", "ecr:describe*", "ecs:describe*", "ecs:list*", "elasticache:describe*", "elasticbeanstalk:describe*", "elasticloadbalancing:describe*", "elasticmapreduce:describejobflows", "elasticmapreduce:listclusters", "es:describeelasticsearchdomainconfig", "es:listdomainnames", "firehose:describe*", "firehose:list*", "glacier:listvaults", "guardduty:listdetectors", "iam:generatecredentialreport", "iam:get*", "iam:list*", "kms:describe*", "kms:get*", "kms:list*", "lambda:getpolicy", "lambda:listfunctions", "logs:DescribeLogGroups", "logs:DescribeMetricFilters", "rds:describe*", "rds:downloaddblogfileportion", "rds:listtagsforresource", "redshift:describe*", "route53:getchange", "route53:getcheckeripranges", "route53:getgeolocation", "route53:gethealthcheck", "route53:gethealthcheckcount", "route53:gethealthchecklastfailurereason", "route53:gethostedzone", "route53:gethostedzonecount", "route53:getreusabledelegationset", "route53:listgeolocations", "route53:listhealthchecks", "route53:listhostedzones", "route53:listhostedzonesbyname", "route53:listqueryloggingconfigs", "route53:listresourcerecordsets", "route53:listreusabledelegationsets", "route53:listtagsforresource", "route53:listtagsforresources", "route53domains:getdomaindetail", "route53domains:getoperationdetail", "route53domains:listdomains", "route53domains:listoperations", "route53domains:listtagsfordomain", "s3:getbucket*", "s3:getlifecycleconfiguration", "s3:getobjectacl", "s3:getobjectversionacl", "s3:listallmybuckets", "sdb:domainmetadata", "sdb:listdomains", "ses:getidentitydkimattributes", "ses:getidentityverificationattributes", "ses:listidentities", "ses:listverifiedemailaddresses", "ses:sendemail", "sns:gettopicattributes", "sns:listsubscriptionsbytopic", "sns:listtopics", "sqs:getqueueattributes", "sqs:listqueues", "support:describetrustedadvisorchecks", "tag:getresources", "tag:gettagkeys" ], "Effect": "Allow", "Resource": "*" }] }
Have a close look at the permissions to make sure you're happy with them. As you can see, a lot of list and get actions cover a large swath of AWS's ever-growing catalog of services. In a moment, I'll return to this policy after setting up the AWS configuration.
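If another machine already has working administrative credentials, you could also attach the AWS-managed SecurityAudit policy from the command line rather than the console. The following is only a minimal sketch, in which the role name ProwlerAuditRole is purely a placeholder for whatever role you use:

$ aws iam attach-role-policy \
      --role-name ProwlerAuditRole \
      --policy-arn arn:aws:iam::aws:policy/SecurityAudit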
Gate Jumping
For those who aren't familiar with the process of setting up credentials for AWS, I'll zoom through the steps briefly; the obvious focus will be on Prowler in action.
In the redacted Figure 1, you can see the screen found in the IAM service under the Users | Security credentials tab. Because you're only shown the secret key once, you should click the Create access key button at the bottom and then safely store the details.
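As an aside, if you already administer the account from a machine with working CLI credentials, you can generate a key pair without visiting the console at all; the user name prowler-audit below is simply an assumed example:

$ aws iam create-access-key --user-name prowler-audit

The JSON response includes the AccessKeyId and SecretAccessKey, and because the secret can't be retrieved again later, it needs the same careful storage.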
To make use of the Access Key and Secret Key you've just generated, return to the terminal and enter:
$ aws configure
AWS Access Key ID []:
The aws command became available when you installed the AWS command-line tool with the pip package manager.
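A quick sanity check confirms the tool is on your path and reports the installed version:

$ aws --version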
As you will see from the questions asked by that command, you need to offer a few defaults, such as the preferred AWS region, the output format, and, most importantly, your Access Key and Secret Key, which you can enter with cut and paste. Once you've filled in those details, two files are created in plain text and stored in the ~/.aws directory: config and credentials. Because these keys are plain text, many developers use environment variables to populate their terminal with these details so they're ephemeral and not saved in a visible format; a quick example of that approach follows. Wherever you keep them, you should encrypt them when stored – known as "data at rest" in security terms.
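If you prefer the environment variable route, the AWS CLI (and therefore Prowler) honors the standard variable names; the values here are obviously placeholders:

$ export AWS_ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
$ export AWS_SECRET_ACCESS_KEY=<your-secret-key>
$ export AWS_DEFAULT_REGION=eu-west-1

These settings last only for the current shell session, but bear in mind that commands typed this way can still end up in your shell history.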