Remedies for Common AWS Cloud Security Misconfigurations

Exploiting cloud security misconfigurations is big business, an industry in its own right. Bad actors are snooping, copying, and selling data and ransoming companies 24/7. Organizations and governments, in desperation, offer big rewards to expose and arrest these online opportunists. Mistakes, oversights, and ill-informed cloud service configuration choices are at the center of vulnerabilities that make the news on a weekly basis. To compound the problem, research shows that most environments are operated under a “set it and forget it” policy, with no reviews or audits to ferret out the original mistakes.

As you will see from the examples and information in this article, it is vital to employ experienced AWS cloud professionals to architect, engineer, implement, maintain, and audit your cloud environments. Small misconfigurations lead to mountainous liability.


Data Exposure Risks

As AWS’ customer list of enterprises that upload and distribute data across the globe grows exponentially, its cost-effective, productivity-enhancing services come with risks of misconfiguration vulnerabilities.

Misconfigured Amazon S3 buckets are the most common security threat to AWS cloud environments. When this cloud object storage is inadvertently misconfigured, public read access makes data breaches possible, and public write access exposes code to malware injection or to encryption of data to hold a company ransom.

What is Amazon S3?

Amazon Simple Storage Service (S3) is an object storage service. Fifteen years after its introduction, it offers industry-leading scalability, data availability, security, and performance. Organizations of all sizes in every industry use Amazon S3 to store and protect data for a range of use cases: data lakes, websites, mobile apps, backup and restore, archive, enterprise applications, IoT devices, and big data analytics.

Recent AWS Cloud Breaches Due to Amazon S3 Misconfigurations

October 2021, Twitch, Amazon.com, Inc.’s live-streaming e-sports platform, blamed “an error” in a server configuration change for exposing sensitive information that enabled a hacker to leak data from its source code repository along with creator payout information.

August 2021, SeniorAdvisor, a ratings/review website for senior care services in the US and Canada, was found by ethical hackers at WizCase to have a bucket configuration that exposed over 1 million files (182 GB) of personal contact information from leads and reviews.

July 2021, PeopleGIS, a company providing information management software to municipalities in Massachusetts, was discovered by ethical hackers at WizCase to have misconfigured buckets. More than a terabyte of data, including 1.6 million files with city residents’ sensitive personal data, building plans, city plans, and local property data, was exposed. It is unclear whether the Amazon S3 bucket misconfigurations were due to actions by PeopleGIS or by the municipalities.

March 2021, Premier Diagnostics, a Utah-based COVID testing company, exposed patients’ personal data via improperly configured Amazon S3 buckets. Over 50,000 scanned documents of customers’ personal information were exposed, including images of driver’s licenses, passports, and medical insurance cards.

How to Minimize Data Exposure Risks in Amazon S3

Data Encryption

Encrypt all data in transit to and from Amazon S3 and at rest within it. Use the SSL/TLS protocol for data in transit, and server-side or client-side encryption for data at rest.
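
As a minimal sketch, default server-side encryption can be turned on per bucket with a configuration like the one below (the bucket name in the comment is a hypothetical placeholder):

```python
import json

# Default server-side encryption applied to every new object in a bucket.
# AES256 (SSE-S3) is shown; "aws:kms" with a customer-managed key is also valid.
encryption_config = {
    "Rules": [
        {
            "ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}
        }
    ]
}

# Applied, for example, via the AWS CLI (hypothetical bucket name):
#   aws s3api put-bucket-encryption --bucket my-example-bucket \
#       --server-side-encryption-configuration file://encryption.json
print(json.dumps(encryption_config))
```

With this in place, objects uploaded without an explicit encryption header are still encrypted at rest.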

Enable Bucket Versioning 

Versioning adds a layer of data protection against unintended user actions and application failures. With Amazon S3 bucket versioning enabled, S3 stores every version of an object; if it receives multiple write requests to the same object simultaneously, it stores all of those objects.
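
Versioning is a one-line bucket setting; a sketch (the bucket name in the comment is hypothetical):

```python
# Enabling versioning keeps every version of an object, so overwritten or
# deleted objects remain recoverable as prior versions. Once enabled,
# versioning can be suspended but never fully removed from a bucket.
versioning_config = {"Status": "Enabled"}

# Applied, for example, via the AWS CLI:
#   aws s3api put-bucket-versioning --bucket my-example-bucket \
#       --versioning-configuration Status=Enabled
print(versioning_config["Status"])  # prints Enabled
```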

Enable MFA Delete

Further strengthen security by enabling Multi-Factor Authentication (MFA) Delete in a bucket’s versioning configuration. The MFA Delete setting requires the bucket owner to include the ‘x-amz-mfa’ request header when requesting to permanently delete an object version or change the bucket’s versioning state. Note that requests that include ‘x-amz-mfa’ must use HTTPS. Failure to meet these requirements will cause the request to fail.
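
As a sketch, the ‘x-amz-mfa’ header value is the MFA device serial and the current one-time code separated by a space (the serial and code below are hypothetical placeholders):

```python
# Hypothetical MFA device serial and one-time code.
mfa_serial = "arn:aws:iam::123456789012:mfa/root-account-mfa-device"
mfa_code = "123456"

# Value for the 'x-amz-mfa' request header: "<device-serial> <code>".
x_amz_mfa = f"{mfa_serial} {mfa_code}"

# Versioning state with MFA Delete turned on.
versioning_config = {"Status": "Enabled", "MFADelete": "Enabled"}

# Applied, for example, via the AWS CLI (only the root user can enable it):
#   aws s3api put-bucket-versioning --bucket my-example-bucket \
#       --versioning-configuration Status=Enabled,MFADelete=Enabled \
#       --mfa "arn:aws:iam::123456789012:mfa/root-account-mfa-device 123456"
print(x_amz_mfa)
```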

Use Amazon S3 Object Lock

Prevent an object from being overwritten or deleted for a fixed time period or indefinitely by using S3 Object Lock. It stores objects using a write-once-read-many (WORM) model. By using Object Lock, you can meet regulatory requirements that mandate WORM storage, or simply add another layer of protection against object changes and deletion.
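
A minimal sketch of a default Object Lock retention rule (the retention period is an arbitrary example, and the bucket name in the comment is hypothetical):

```python
import json

# Default retention applied to new objects. COMPLIANCE mode cannot be
# shortened or removed by any user; GOVERNANCE mode can be bypassed by
# users granted a special permission.
object_lock_config = {
    "ObjectLockEnabled": "Enabled",
    "Rule": {
        "DefaultRetention": {"Mode": "COMPLIANCE", "Days": 365}
    },
}

# Applied, for example, via the AWS CLI (Object Lock must be enabled when
# the bucket is created):
#   aws s3api put-object-lock-configuration --bucket my-example-bucket \
#       --object-lock-configuration file://lock.json
print(json.dumps(object_lock_config))
```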

Utilize Multi-Region Application Architecture

Multi-Region Application Architecture enables users to create fault-tolerant applications with failover to backup Regions. Using S3 Cross-Region Replication and Amazon DynamoDB global tables, the solution asynchronously replicates application data across primary and secondary AWS Regions.
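
The S3 side of this can be sketched as a bucket replication configuration (the role and bucket ARNs are hypothetical, and versioning must be enabled on both the source and destination buckets):

```python
import json

# Cross-Region Replication: copy new objects from a source bucket to a
# destination bucket in another Region. The IAM role must be allowed to
# read from the source and replicate into the destination.
replication_config = {
    "Role": "arn:aws:iam::123456789012:role/s3-replication-role",
    "Rules": [
        {
            "ID": "replicate-everything",
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {},  # empty filter = replicate all objects
            "DeleteMarkerReplication": {"Status": "Disabled"},
            "Destination": {"Bucket": "arn:aws:s3:::my-backup-bucket-us-west-2"},
        }
    ],
}

# Applied, for example, via the AWS CLI:
#   aws s3api put-bucket-replication --bucket my-source-bucket \
#       --replication-configuration file://replication.json
print(json.dumps(replication_config))
```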

Lockdown Public Access to Amazon S3

By default, all new buckets, objects, and access points do not allow public access. But Amazon S3 users can modify policies and permissions to allow public access, potentially exposing sensitive data. Unless a dataset specifically requires read or write access over the internet via a URL to your Amazon S3 bucket, public access should not be allowed.

To ensure your Amazon S3 buckets are secure, use Block Public Access settings, which override Amazon S3 permissions, to prevent accidental or intentional unauthorized exposure. These settings also let administrators centralize account-level controls for maximum protection.
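
The four Block Public Access settings, all enabled for maximum protection, look like this (the bucket name in the comment is a hypothetical placeholder):

```python
import json

# Block Public Access settings, applicable at the account or bucket level.
public_access_block = {
    "BlockPublicAcls": True,        # reject new public ACLs on buckets/objects
    "IgnorePublicAcls": True,       # ignore any existing public ACLs
    "BlockPublicPolicy": True,      # reject new public bucket policies
    "RestrictPublicBuckets": True,  # restrict access to buckets with public policies
}

# Applied, for example, via the AWS CLI:
#   aws s3api put-public-access-block --bucket my-example-bucket \
#       --public-access-block-configuration \
#       BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
print(json.dumps(public_access_block))
```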

Identify Bucket Policies that Allow Wildcard IDs

Identify Amazon S3 bucket policies that allow a wildcard identity, such as Principal “*”, or that allow a wildcard action “*” that effectively enables users to perform any action in Amazon S3. Also, audit for Amazon S3 bucket access control lists (ACLs) that grant read, write, or full access to “Everyone” or “Any authenticated AWS user.” To ensure data is non-public, a bucket policy must grant access only to fixed values that do not contain wildcards.
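
A simple audit for this can be scripted; the sketch below checks a policy document for wildcard principals or actions (the sample policy and bucket name are hypothetical):

```python
def find_wildcard_statements(policy: dict) -> list:
    """Return policy statements whose Principal or Action contains a bare "*"."""
    flagged = []
    for stmt in policy.get("Statement", []):
        principal = stmt.get("Principal")
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        wildcard_principal = principal == "*" or (
            isinstance(principal, dict) and "*" in principal.values()
        )
        wildcard_action = "*" in actions or "s3:*" in actions
        if wildcard_principal or wildcard_action:
            flagged.append(stmt)
    return flagged

# Hypothetical policy with a wildcard Principal -- it should be flagged.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::my-example-bucket/*",
        }
    ],
}
print(len(find_wildcard_statements(policy)))  # prints 1
```

In a real audit you would feed this the policy returned by `aws s3api get-bucket-policy` for each bucket.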

See AWS’s detailed instructions on making policies non-public.

Audit IAM Systems

AWS IAM systems are your first line of defense in securing access to sensitive data and your applications. Deploying default AWS IAM settings will jeopardize your organization. Even if you avoid this common pitfall and set up AWS IAM policies that effectively secure a resource, that protection can become outdated over time. For example, if a user changes departments or roles within a department, access rights should change to match the new role’s need to access data and applications. Regularly audit AWS IAM rules to better protect your environment.

Use Tools to Inspect Implementations 

Use tools to reduce human error. More tools will arise over time, but currently, we recommend: 

  • AWS Trusted Advisor, an online tool that offers real-time guidance and support when provisioning resources on AWS, inspects your Amazon S3 implementations to strengthen your efforts to prevent public access.
  • Employ real-time monitoring through the s3-bucket-public-read-prohibited and s3-bucket-public-write-prohibited managed AWS Config Rules.
  • Hire DinoCloud to complete a Well-Architected Framework review and remediate your Amazon S3 implementation.

Enforce Least Privilege Access 

Add an additional layer of protection against unauthorized access to Amazon S3 by enforcing least privilege access. This means granting identities (users, roles, services) only the permissions required to perform their tasks. This principle prevents malware spread, reduces the potential for cyberattacks, aids data classification, helps demonstrate compliance, and promotes user productivity.

Focus on these best practices:

  • Separation of duties – avoid conflicts of interest between people and applications by ensuring that responsibilities, and the privileges granted to accomplish them, do not leave the organization open to fraud, theft, circumvention of security controls, or other risks. Failure to separate functionality can result in toxic combinations where privileges can be abused. For example, a user with permission to create an invoice may also have been given the privilege of paying the invoice.
  • Inactive identities – review access privileges for a lack of login activity, then ideally remove them, but at a minimum closely monitor them, as bad actors can gain access without the owners’ knowledge.
  • Privilege escalation – this occurs when vulnerabilities, often identity and access management (IAM) misconfigurations, are exploited either horizontally, by one user using its privileges to access another user’s account, or, even more damaging, vertically, where a user accesses accounts with higher-level privileges such as an account administrator.
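
As an illustration, a least-privilege IAM policy grants only the specific actions a workload needs on one specific resource, rather than s3:* on “*” (the Sid, actions, and bucket name below are hypothetical examples):

```python
import json

# A least-privilege policy: this workload only reads report objects from
# one bucket, so it gets exactly those permissions and nothing more.
least_privilege_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ReadReportsOnly",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::my-reports-bucket",
                "arn:aws:s3:::my-reports-bucket/*",
            ],
        }
    ],
}
print(json.dumps(least_privilege_policy))
```

Contrast this with a policy allowing "s3:*" on "Resource": "*", which is exactly the toxic breadth the bullets above warn against.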

AWS tools that aid you in implementing least privilege access:

  • AWS IAM Access Analyzer – identifies resources shared with external entities and can generate least-privilege policies based on CloudTrail access activity.
  • IAM last-accessed information – shows services a user or role is permitted to use but has not used recently, flagging permissions that are safe to remove.


Software Supply Chain Risks

High-profile incidents implicating software supply chains, such as the 2020 US SolarWinds and 2021 Log4j breaches, are on the rise. Securing the software supply chain is a more frequently addressed topic among IT teams, but regrettably many do not consider the public cloud to be part of that chain. It is understandable how this oversight happens, as the public cloud is more “infrastructure” than “software.” But cloud security risks can rain (pun intended) on your software supply chain, causing a storm just as volatile as one from software applications.

Software supply chain infiltration is a lucrative business that can be scaled quickly and efficiently. By infiltrating a soft target and then exploiting poor configurations, bad actors can deploy malware in multiple organizations’ environments for later launch. With access to those environments, the hackers can expand their foothold, most often undetected.

Public cloud security breaches do happen. Most often they expose data stored in the cloud without giving hackers direct access to entire IT environments, but as the SolarWinds breach proved, that can happen too. Consider these reasons to treat the public cloud as part of your software supply chain:

  • Clouds are more than just infrastructure: Though infrastructure-as-a-service (IaaS) may be the main offering, most cloud vendors also deliver software-as-a-service (SaaS) applications.
  • Infrastructure security breaches are painful: Even if your use of the cloud is limited to infrastructure services, vulnerabilities in the cloud platform can expose your data or applications to malicious assaults.
  • Public clouds get hacked: As you have no doubt read in IT publications and sometimes mainstream news, it is possible for your public cloud to be attacked.

Without tracking risks and breaches that impact the cloud environments you use, you are opening up your software supply chain to vulnerabilities. 

Securing the Cloud in Your Software Supply Chain

Reduce risks of using the public cloud as part of a software supply chain:

Know your cloud environment

Understand what is used where. Unless you know what you have, you cannot monitor it, much less secure it. This is a challenging task that gets more challenging the larger the organization, but it is vital. To organize this effort, enforce tagging rules for your cloud resources to better track workloads, and periodically conduct audits to map cloud workloads.
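
A tagging audit can be as simple as checking every resource against the tag keys your organization requires; a sketch (the required keys and resource names are hypothetical examples):

```python
# Tag keys this hypothetical organization requires on every cloud resource.
REQUIRED_TAGS = {"Owner", "Environment", "CostCenter"}

def missing_tags(resource_tags: dict) -> set:
    """Return the required tag keys absent from a resource's tags."""
    return REQUIRED_TAGS - resource_tags.keys()

# Hypothetical inventory, e.g. assembled from resource-listing API calls.
inventory = {
    "my-app-bucket": {"Owner": "data-team", "Environment": "prod", "CostCenter": "42"},
    "scratch-bucket": {"Owner": "unknown"},
}

for name, tags in inventory.items():
    gaps = missing_tags(tags)
    if gaps:
        print(f"{name} is missing tags: {sorted(gaps)}")
```

Untagged resources surfaced this way are exactly the workloads most likely to escape monitoring.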

Track your cloud platform(s) security incidents

Once you understand your cloud environment, you are armed with a list of cloud platforms that you must stay well informed about. For each security incident you become aware of, learn as much as is available to know whether any of your workloads are impacted. Read IT publications and general news, and follow the security thread of your cloud provider’s blog. Here is the AWS Security Blog.

Minimize data exposure

Store less data in the public cloud to reduce the impact on your organization and customers if there is a breach. Consider a hybrid cloud architecture.

Spread workloads across multiple cloud accounts

Splitting workloads across different accounts also reduces the impact of any breaches that may occur. 

Do not blindly trust third-party containers

If you deploy cloud applications using containers, you know that scanning for vulnerabilities prior to deployment is standard best practice. However, many organizations blindly trust containers from third parties. This often happens when you deploy sidecar containers to connect resources like third-party logging services. General trust in a company or open source project should not cause you to relax your practice of scanning containers. Scan all containers before deployment.

Don’t Be a Victim of AWS Cloud Security Misconfigurations

DinoCloud’s architects and engineers are experienced AWS Partners. Let’s work together to secure your infrastructure while taking advantage of all the benefits of cloud.
