I’m researching how engineers assess the security posture of their Cloud deployments and evaluate risk to those deployments, so they can understand and improve the deployment’s risk position.

I identified, evaluated, and used the top free security assessment tools for AWS:

  • ScoutSuite by NCC Group
  • CloudMapper by Duo Labs
  • Prowler by Toni de la Fuente
  • AWS Security Hub – CIS AWS Foundations Benchmark
  • AWS Access Analyzer for IAM, S3, and KMS

Each of these tools (excluding Access Analyzer) does a fine job of collecting and presenting important information about the security of:

  • Identity and Access Management: configuration of the root user, baseline IAM password and access key policies, roles preferred over users, etc.
  • Logging, Monitoring, and Audit: configuration of CloudTrail, Config, and VPC Flow Logs to detect creation of or changes to security-critical resources and unauthorized access attempts, and to support forensics
  • Networking: Security Groups (firewalls) do not allow access to ports 22 (SSH) or 3389 (RDP) from everywhere (0.0.0.0/0), and routing tables are “least access”
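To make the networking check concrete, here is a minimal sketch of the kind of rule these tools evaluate: flag any Security Group ingress rule that exposes SSH or RDP to the whole Internet. The rule dictionaries mirror the shape of the AWS `DescribeSecurityGroups` response, but the data here is invented for illustration; real tools fetch it from the AWS API.

```python
# Sketch of a CIS-style networking check: flag ingress rules that allow
# SSH (22) or RDP (3389) from anywhere (0.0.0.0/0).
# Rule shape loosely follows AWS's DescribeSecurityGroups response;
# the example data is made up.

SENSITIVE_PORTS = {22, 3389}

def open_to_world(rule: dict) -> bool:
    """True if this ingress rule exposes a sensitive port to 0.0.0.0/0."""
    from_port = rule.get("FromPort", 0)
    to_port = rule.get("ToPort", 65535)
    world = any(r.get("CidrIp") == "0.0.0.0/0" for r in rule.get("IpRanges", []))
    return world and any(from_port <= p <= to_port for p in SENSITIVE_PORTS)

rules = [
    {"FromPort": 22, "ToPort": 22, "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},    # bad
    {"FromPort": 443, "ToPort": 443, "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},  # fine
    {"FromPort": 22, "ToPort": 22, "IpRanges": [{"CidrIp": "10.0.0.0/8"}]},   # fine
]

findings = [r for r in rules if open_to_world(r)]
print(len(findings))  # 1: only the world-open SSH rule is flagged
```

Real scanners run dozens of checks like this one per service; the interesting part is rarely the check itself, but deciding how much the finding matters in your environment.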

This covers the CIS AWS Foundations Benchmark v1.2 (PDF). Unsurprisingly, Security Hub’s CIS benchmark is a direct implementation of the CIS benchmark and is scoped to those controls.

AWS Access Analyzer is a new tool that fills a niche fairly specific to AWS: identifying resource policies that grant access to other AWS accounts or to the whole world.
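The core question Access Analyzer answers can be sketched in a few lines: does a resource policy name any principal outside my account? The sketch below only inspects the `Principal` element of an S3-style bucket policy; the real service uses automated reasoning over the full policy language (conditions, NotPrincipal, etc.), and the account IDs here are made up.

```python
import json

# Toy version of the Access Analyzer question: which statements in this
# resource policy allow principals outside my account? Only inspects the
# Principal element; the real service reasons over the whole policy.

MY_ACCOUNT = "111111111111"  # hypothetical account id

def external_principals(policy: dict, my_account: str) -> list:
    findings = []
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        principal = stmt.get("Principal", {})
        if principal == "*":           # public: anyone at all
            findings.append("*")
            continue
        aws = principal.get("AWS", [])
        for arn in ([aws] if isinstance(aws, str) else aws):
            if arn == "*" or my_account not in arn:
                findings.append(arn)   # another account, or the world
    return findings

policy = json.loads("""{
  "Statement": [
    {"Effect": "Allow",
     "Principal": {"AWS": "arn:aws:iam::222222222222:root"},
     "Action": "s3:GetObject",
     "Resource": "arn:aws:s3:::example-bucket/*"}
  ]
}""")

print(external_principals(policy, MY_ACCOUNT))
# ['arn:aws:iam::222222222222:root']
```

Even this toy shows why the problem is AWS-specific: cross-account grants are expressed inside resource policies, so you can’t see your exposure without parsing every policy on every resource.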

ScoutSuite, CloudMapper, and Prowler go further than the CIS benchmark and cover some of Access Analyzer’s territory by:

  1. evaluating more rules to detect potential security configuration issues with IAM principals and policies, access to data stores in S3/RDS/Redshift/etc., network access controls, and known-bad configurations and probable mistakes
  2. assigning a risk level to findings and organizing the reported results by that level, e.g. critical, high, medium, low, info

I’ve used tools like ScoutSuite and CloudMapper to perform security assessments for clients as well as for my team’s own accounts. These tools provide accurate information and a reliable map of the security landscape. However, I find the results require a fair amount of interpretation in order to assess risk.

To be clear, these tools do not promise to precisely and accurately assess your deployment’s risk position — they’re designed to assess the deployment’s security posture. They helpfully provide a head-start on organizing the assessment’s findings according to globally-applicable risk heuristics.

This isn’t surprising considering these tools were born from consulting work. There are two key bits in ScoutSuite’s wiki:

ScoutSuite gathers configuration data for manual inspection and highlights risk areas

and

Scout Suite was designed by security consultants/auditors.

Scout Suite promises to assess the security posture of your cloud environment and highlight risk areas. Interpreting Scout Suite’s findings and assessing risk is up to whoever is reading the report — the security expert in consultation with the team. This is an observation, not a value judgment against consulting or auditors.

If you’ve used one of these tools (or something else) to assess the security of your Cloud environment so that you could reduce your risk, or especially if you’ve tried and failed, I’d love to hear your answers to these questions:

  1. What’s the hardest part about assessing and improving the security and risk position of your Cloud deployments?
  2. When was the last time you tried or wanted to assess your Cloud deployment security or reduce its risk position?
  3. How did that go? What problems did you encounter?

On Wednesday, I will share my own answers to these questions along with anonymized and aggregated answers from the list. I will also offer some ideas about what the fundamental challenges are for assessing cloud security posture with current tools.

The intent of this research is to help you and your team:

  1. use existing tools more effectively
  2. manage expectations for what you’ll be able to do with existing tools
  3. inform the direction of a next-generation tool that aims to be both easier to use and more actionable

Stephen

#NoDrama