Exploiting AWS cloud security misconfigurations is big business, an industry in its own right. Bad actors snoop, copy, and sell data and ransom companies around the clock, while organizations and governments offer large rewards to expose and arrest these online opportunists. Mistakes, oversights, and ill-informed cloud service configuration choices are at the center of the vulnerabilities that make the news every week. Compounding the problem, research shows that most environments operate under a "set it and forget it" policy, with no reviews or audits to ferret out the original mistakes.
It is vital to employ experienced AWS cloud professionals to architect, engineer, implement, maintain, and audit your cloud environments. Small misconfigurations lead to mountainous liability.
Let Us Configure Your Cloud
Data Exposure Risks
As AWS's list of enterprise customers that upload and distribute data across the globe grows, its cost-effective, productivity-enhancing services bring with them the risk of misconfiguration vulnerabilities.
Misconfigured Amazon S3 buckets are the most common security threat to AWS cloud environments. When this cloud object storage is poorly configured, inadvertent public read access makes data breaches possible, and public write access exposes code to malware injection or lets attackers encrypt data and hold a company to ransom.
What is Amazon S3?
Amazon Simple Storage Service (S3) is an object storage service. Fifteen years after its introduction, it offers industry-leading scalability, data availability, security, and performance. All organizations in every industry use Amazon S3 to store and protect data for various use cases: data lakes, websites, mobile apps, backup and restore, archive, enterprise applications, IoT devices, and big data analytics.
Recent AWS Cloud Breaches Due to Amazon S3 Misconfigurations
In October 2021, Twitch, Amazon.com, Inc.'s live streaming e-sports platform, blamed "an error" in a server configuration change for exposing sensitive information, enabling a hacker to leak data from its source code repository along with creator payout information.
In August 2021, ethical hackers at WizCase found that SeniorAdvisor, a ratings and review website for senior care services in the US and Canada, had a bucket configuration that exposed over 1 million files (182 GB) of personal contact information from leads and reviews.
WizCase researchers also exposed more than a terabyte of data belonging to PeopleGIS, including 1.6 million files with city residents' sensitive personal data, building plans, city plans, and local property data. It is unclear whether the Amazon S3 bucket misconfigurations were due to actions by PeopleGIS or the municipalities it serves.
In March 2021, Premier Diagnostics, a Utah-based COVID testing company, exposed patients' data via improperly configured Amazon S3 buckets, disclosing over 50,000 scanned documents of customers' personal information, including images of driver's licenses, passports, and medical insurance cards.
How to Minimize Data Exposure Risks in Amazon S3
Encrypt Data in Transit and at Rest
Encrypt all data in transit to and from S3 using the SSL/TLS protocol, and encrypt data stored in S3 using server-side or client-side encryption.
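As a minimal sketch of requesting encryption at rest, the helper below builds PutObject parameters that ask S3 to encrypt the object with an AWS-managed KMS key. The bucket and key names are hypothetical, and the dict is written to be passed to boto3's `put_object` call.

```python
def encrypted_put_params(bucket: str, key: str, body: bytes) -> dict:
    """Build PutObject parameters that request server-side encryption.

    Pass the result to boto3's s3_client.put_object(**params).
    """
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        # Ask S3 to encrypt the object at rest with an AWS-managed KMS key.
        "ServerSideEncryption": "aws:kms",
    }

# Hypothetical bucket and key, for illustration only.
params = encrypted_put_params("example-bucket", "reports/q3.json", b"{}")
```

Transport encryption is handled separately: boto3 and the AWS CLI use HTTPS endpoints by default, and a bucket policy can deny requests where `aws:SecureTransport` is false to enforce it.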
Enable Bucket Versioning
Versioning adds a layer of data protection against unintended user actions and application failures. With bucket versioning enabled, Amazon S3 keeps every version of an object; when it receives multiple write requests for the same object, it stores all of them.
Enable MFA Delete
Strengthen security further by enabling Multi-Factor Authentication (MFA) Delete in a bucket's versioning configuration. With MFA Delete enabled, the bucket owner must include the 'x-amz-mfa' request header in any request to permanently delete an object version or change the bucket's versioning state. Requests that include 'x-amz-mfa' must also use HTTPS; a request that fails to meet these requirements will fail.
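As a sketch, assuming boto3, the parameters below enable versioning and MFA Delete in one call; the bucket name, MFA device ARN, and token are hypothetical placeholders.

```python
def mfa_delete_versioning_params(bucket: str, mfa_serial: str,
                                 mfa_token: str) -> dict:
    """Build PutBucketVersioning parameters enabling versioning and
    MFA Delete.

    Pass the result to boto3's s3_client.put_bucket_versioning(**params).
    Note that only the root account's MFA device can enable MFA Delete.
    """
    return {
        "Bucket": bucket,
        # "SerialNumber TokenCode" -- boto3 sends this as the
        # x-amz-mfa request header.
        "MFA": f"{mfa_serial} {mfa_token}",
        "VersioningConfiguration": {
            "Status": "Enabled",
            "MFADelete": "Enabled",
        },
    }

# Hypothetical account, device, and token values.
versioning_params = mfa_delete_versioning_params(
    "example-bucket",
    "arn:aws:iam::111122223333:mfa/root-account-mfa-device",
    "123456",
)
```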
Use Amazon S3 Object Lock
Prevent an object from being overwritten or deleted for a fixed period, or indefinitely, by using S3 Object Lock, which stores objects using a write-once-read-many (WORM) model. With Object Lock, you can apply retention periods and legal holds, adding an extra layer of protection and helping meet regulatory requirements that mandate WORM storage.
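As an illustrative sketch, the helper below builds a default Object Lock retention rule for a bucket; the retention mode and period are example choices, and the resulting dict is meant for boto3's `put_object_lock_configuration` call.

```python
def object_lock_config(days: int) -> dict:
    """Build a default Object Lock retention rule for a bucket.

    Pass to boto3's s3_client.put_object_lock_configuration(
        Bucket=..., ObjectLockConfiguration=config).
    COMPLIANCE mode prevents all users, including the root account,
    from overwriting or deleting locked object versions until the
    retention period expires.
    """
    return {
        "ObjectLockEnabled": "Enabled",
        "Rule": {
            "DefaultRetention": {
                "Mode": "COMPLIANCE",
                "Days": days,
            }
        },
    }

# Example: retain every new object version for 30 days.
config = object_lock_config(30)
```

Note that Object Lock must be enabled when the bucket is created; it cannot be switched on for an existing bucket through this call alone.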
Utilize Multi-Region Application
Multi-Region Application Architecture enables you to create fault-tolerant applications that fail over to a backup region. Using S3 Cross-Region Replication and Amazon DynamoDB Global Tables, it asynchronously replicates application data across primary and secondary AWS Regions.
Lock Down Public Access to Amazon S3
Use S3 Block Public Access settings to override other Amazon S3 permissions and prevent accidental or intentional public exposure. These settings help administrators centralize account-level controls for maximum protection.
New buckets, objects, and access points do not allow public access by default, but Amazon S3 users can modify policies and permissions to allow it, potentially exposing sensitive data. Unless a dataset specifically requires internet read or write access via a URL to your Amazon S3 bucket, public access should not be allowed.
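As a sketch, assuming boto3, the configuration below turns on all four Block Public Access settings for a bucket; the same structure applies account-wide via the S3 Control API.

```python
def block_public_access_config() -> dict:
    """Build a PublicAccessBlockConfiguration that blocks all public access.

    Pass to boto3's s3_client.put_public_access_block(
        Bucket=..., PublicAccessBlockConfiguration=config).
    """
    return {
        "BlockPublicAcls": True,        # reject new public ACLs on PUT
        "IgnorePublicAcls": True,       # ignore any existing public ACLs
        "BlockPublicPolicy": True,      # reject new public bucket policies
        "RestrictPublicBuckets": True,  # limit public-policy buckets to AWS principals
    }

config = block_public_access_config()
```

Enabling all four at once is the safest default; relax individual settings only for buckets that genuinely must serve public content.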
Identify Bucket Policies that Allow Wildcard IDs
Identify Amazon S3 bucket policies that allow a wildcard identity, such as Principal "*", or a wildcard action "*", which effectively lets anyone perform any action in Amazon S3. Also audit for Amazon S3 bucket access control lists (ACLs) that grant read, write, or full access to "Everyone" or "Any authenticated AWS user." To ensure data is non-public, a bucket policy must grant access only to fixed, non-wildcarded values.
See AWS detailed instructions on making policies non-public.
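The audit above can be partly automated. The sketch below, using only the standard library, flags Allow statements in a bucket policy document that use a wildcard principal or a wildcard action; the policy shown is a hypothetical example, and in practice you would fetch the document with boto3's `get_bucket_policy`.

```python
import json

def find_wildcard_statements(policy_json: str) -> list:
    """Return the Sids of Allow statements that grant access to a
    wildcard principal or permit a wildcard action."""
    policy = json.loads(policy_json)
    statements = policy.get("Statement", [])
    if isinstance(statements, dict):  # a single statement may be unwrapped
        statements = [statements]
    flagged = []
    for stmt in statements:
        if stmt.get("Effect") != "Allow":
            continue
        principal = stmt.get("Principal")
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        wildcard_principal = principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        )
        wildcard_action = any(a in ("*", "s3:*") for a in actions)
        if wildcard_principal or wildcard_action:
            flagged.append(stmt.get("Sid", "<no Sid>"))
    return flagged

# Hypothetical policy with a public-read statement.
policy = json.dumps({
    "Version": "2012-10-17",
    "Statement": [
        {"Sid": "PublicRead", "Effect": "Allow", "Principal": "*",
         "Action": "s3:GetObject",
         "Resource": "arn:aws:s3:::example-bucket/*"},
    ],
})
flagged = find_wildcard_statements(policy)  # ['PublicRead']
```

A real audit should also consider Condition blocks, which can restrict an otherwise-wildcard statement; this sketch deliberately errs on the side of flagging.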
Audit IAM Systems
AWS IAM is your first line of defense in securing access to sensitive data and your applications, and deploying default AWS IAM settings will jeopardize your organization. Even if you avoid this common pitfall and set up AWS IAM policies that effectively secure a resource, that protection can become outdated. For example, if a user changes departments or roles within a department, access rights should change to match the new role's need to access data and applications. Regularly audit AWS IAM rules to better protect your environment.
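One concrete audit step is flagging access keys that have not been used recently. The sketch below assumes you have already gathered last-used timestamps (for example, from boto3's `get_access_key_last_used` per key, or the IAM credential report); the key IDs shown are fabricated examples.

```python
from datetime import datetime, timedelta, timezone

def stale_access_keys(last_used: dict, max_age_days: int = 90) -> list:
    """Flag IAM access keys whose last-used timestamp is older than
    max_age_days. `last_used` maps access key IDs to aware datetimes."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
    return sorted(key for key, used in last_used.items() if used < cutoff)

now = datetime.now(timezone.utc)
# Hypothetical inventory: one active key, one unused for ~200 days.
keys = {
    "AKIAEXAMPLEACTIVE": now - timedelta(days=5),
    "AKIAEXAMPLESTALE": now - timedelta(days=200),
}
stale = stale_access_keys(keys)  # ['AKIAEXAMPLESTALE']
```

Flagged keys should be disabled first and deleted after a grace period, so a forgotten-but-needed integration surfaces before the key is gone for good.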
Use Tools to Inspect Implementations
Use tools to reduce human error. More tools will arise over time, but currently, we recommend:
- AWS Trusted Advisor, an online tool that offers real-time guidance when provisioning resources on AWS, inspects your Amazon S3 implementations to strengthen your efforts to prevent public access.
- Employ real-time monitoring through the s3-bucket-public-read-prohibited and s3-bucket-public-write-prohibited managed AWS Config Rules.
- Hire DinoCloud to complete a Well-Architected Framework review and remediate your Amazon S3 implementation.
Enforce Least Privilege Access
Add a layer of protection against unauthorized access to Amazon S3 by enforcing least privilege access: grant identities (users, roles, services) only the permissions required to perform their tasks. This principle limits the spread of malware, reduces the potential for cyberattacks, aids data classification, helps demonstrate compliance, and promotes user productivity.
Focus on these best practices:
- Separation of duties – avoid conflicts of interest between people and applications by ensuring that responsibilities, and the privileges granted to accomplish them, do not leave the organization open to fraud, theft, circumvention of security controls, or other risks. Failure to separate functions can result in toxic combinations where privileges can be abused; for example, a user with permission to create an invoice should not also have the privilege of paying it.
- Inactive identities – review access privileges for lack of login activity, then ideally remove them, but at a minimum, closely monitor them as bad actors can gain access without the owners’ knowledge.
- Privilege escalation – this occurs when vulnerabilities, often identity and access management (IAM) misconfigurations, are exploited either horizontally, where one user uses its privileges to access another user's account, or, even more damagingly, vertically, where a user accesses accounts with higher-level privileges, such as an account administrator's.
AWS tools, such as IAM Access Analyzer and the last-accessed information in the IAM console, aid you in implementing least privilege access.
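To make least privilege concrete, the sketch below builds a bucket policy that grants a single role read-only access to one key prefix and nothing broader; the bucket, prefix, and role ARN are hypothetical placeholders.

```python
def read_only_prefix_policy(bucket: str, prefix: str, role_arn: str) -> dict:
    """Build a bucket policy granting one role read-only access to a
    single key prefix -- the least privilege the task requires."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "LeastPrivilegeRead",
                "Effect": "Allow",
                # A specific role, not a wildcard principal.
                "Principal": {"AWS": role_arn},
                # Only the specific action the task needs.
                "Action": ["s3:GetObject"],
                "Resource": f"arn:aws:s3:::{bucket}/{prefix}/*",
            }
        ],
    }

# Hypothetical names, for illustration only.
policy = read_only_prefix_policy(
    "example-bucket", "reports",
    "arn:aws:iam::111122223333:role/ReportReader",
)
```

Contrast this with a policy allowing `s3:*` on the whole bucket: the narrow version means a compromised role can, at worst, read one prefix, not rewrite or delete everything.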
Software Supply Chain Risks
High-profile incidents implicating software supply chains, such as the 2020 SolarWinds and 2021 Log4j breaches, are on the rise. Securing the software supply chain is an increasingly common topic among IT teams, but regrettably, many do not consider the public cloud part of that chain. It is understandable how this oversight happens, as the public cloud is more "infrastructure" than "software." But cloud security risks can rain (pun intended) on your software supply chain, causing a storm just as damaging as one that starts in a software application.
Software supply chain infiltration is lucrative and can be scaled quickly and efficiently. By infiltrating a soft target and exploiting poor configurations, bad actors can deploy malware in multiple organizations' environments for later launch. With access to those environments, hackers can expand their reach, most often undetected.
Public cloud security breaches often expose data stored in the cloud without giving hackers direct access to entire IT environments, but as the SolarWinds breach proved, that broader access can happen. Consider these reasons to include the public cloud as part of your software supply chain:
- Clouds are more than just infrastructure: Though Infrastructure-as-a-service (IaaS) may be the primary offering, most cloud vendors also deliver software-as-a-service (SaaS) applications.
- Infrastructure security breaches are painful: Even if your use of the cloud is infrastructure services only, vulnerabilities in the cloud platform can expose your data or applications to malicious assaults.
- Public clouds get hacked: As you have no doubt read in IT publications and sometimes mainstream news, it is possible for your public cloud to be attacked.
Without tracking the risks and breaches that impact the cloud environments you use, you are opening up your software supply chain to vulnerabilities.
Securing the Cloud in Your Software Supply Chain
Reduce risks of using the public cloud as part of a software supply chain:
Know your cloud environment
Unless you know what you have, you cannot monitor it, much less secure it. This is a challenging task that only gets harder the larger the organization, but it is vital. To organize the effort, enforce tagging rules for your cloud resources to better track workloads, and periodically conduct audits to map them.
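A simple tag audit can be scripted. The sketch below assumes a resource-to-tags inventory already gathered (for example, via the AWS Resource Groups Tagging API); the required tag keys and resource IDs are illustrative choices, not a standard.

```python
# Example tagging rules -- choose keys that fit your organization.
REQUIRED_TAGS = {"owner", "environment", "cost-center"}

def untagged_resources(resources: dict) -> list:
    """Return IDs of resources missing any required tag key.
    `resources` maps resource IDs to their tag dicts."""
    return sorted(
        rid for rid, tags in resources.items()
        if not REQUIRED_TAGS <= set(tags)  # subset test against tag keys
    )

# Hypothetical inventory: one fully tagged instance, one untagged volume.
inventory = {
    "i-0abc": {"owner": "data-team", "environment": "prod",
               "cost-center": "42"},
    "vol-0def": {"owner": "data-team"},
}
missing = untagged_resources(inventory)  # ['vol-0def']
```

Running a check like this on a schedule turns the "periodic audit" above into a routine report rather than a one-off project.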
Track your cloud platform(s) security incidents
Once you understand your cloud environment, you will have a list of cloud platforms you must stay well informed about. For each security incident you become aware of, learn as much as possible to determine whether any of your workloads are impacted. Read IT publications and general news, and follow your cloud provider's security blog, such as the AWS Security Blog.
Minimize data exposure
Store less data in the public cloud to reduce the impact on your organization and customers if a breach occurs. Consider a hybrid cloud architecture.
Spread workloads across multiple cloud accounts
Splitting workloads across different accounts also reduces the impact of any breaches that may occur.
Do not blindly trust third-party containers
If you deploy cloud applications using containers, you know that scanning for vulnerabilities before deployment is standard best practice. However, many organizations blindly trust containers from third parties. This often happens when deploying sidecar containers that connect to resources like third-party logging services. A reliable company or open-source project behind a container should not cause you to relax your scanning practices. Scan all containers before deployment.
Don't Be a Victim of AWS Cloud Security Misconfigurations
DinoCloud's architects and engineers are experienced AWS Partners. Let's work together to secure your infrastructure while taking advantage of all the benefits of the cloud.