Data security in AWS – what does it actually mean?
10.05.2021 - Read in 6 min.
Nowadays, security plays a major role in many areas of our lives. Research consistently shows that entrepreneurs are increasingly inclined to act quickly in order to establish a presence in the online world. Even if they’re not there yet, they know they need to get there. Their main motivation: securing their business for the future. These tendencies are confirmed by players such as AWS, Google Cloud and Azure, which let entrepreneurs choose from predominantly ready-made cloud solutions.
Do you know who takes care of business security in the cloud, and how? Learn all there is to know about data security in AWS.
As you look for information about cloud computing, you might stumble upon the opinion that “investing in this kind of architecture involves a great deal of, in fact infinite, security” – but what does this statement actually mean? What is security in the context of the AWS cloud?
Let’s assume that cloud security means that your data, services and infrastructure in the cloud are protected and, at the same time, continuously create business value.
Are services offered by AWS safe?
At the moment AWS offers over 175 services, a number that will soon increase… so who’s in charge of all of that?
Who and how is responsible for the security of my business in AWS cloud?
Fortunately, we don’t have to work this out alone – security is provided jointly by us and AWS, in accordance with the shared responsibility model.
AWS is responsible for the security and protection of its intercontinental infrastructure (the wiring, buildings, hard drives, appropriate conditions in server rooms, fire protection, and the entire field related to physical access to data).
AWS is also responsible for the availability of the tools and services it offers to its clients (virtualization of physical machines, numerous tools allowing users to better care for their security).
Shared Responsibility Model
So what is our responsibility?
We are mainly responsible for appropriately managing authorisations to access our services, accounts, networks and updating tools, applications and operating systems that we use. We’re also responsible for configuring our networks (both public and private), making sure that the right rules for access are in place (monitoring, responding in case of security threats), encrypting our data (our data are stored on many hard drives scattered over many different locations).
How can we make sure that our data are safe? Where should we start? Get to know three basic services.
AWS Identity & Access Management (IAM)
This module is responsible for authentication – it answers the question of who the user is – and for authorisation – deciding whether the user is permitted to perform a given action.
Every AWS tool and service uses IAM, which provides a mechanism for defining very fine-grained levels of access (it’s one of four free services in AWS, although the remaining three are only theoretically free, as they initiate actions on our account that may generate costs later).
Elements of the authentication and authorisation process in the context of data security in AWS
- AWS account root user (the main, administrator’s account) is used by a physical person to log in; it has full, unrestricted access to all services and resources in AWS.
- IAM User – user accounts represent a person or a service that uses them to interact with resources on the AWS platform. A user’s authentication data (login, password, access key, secret key) are permanent: a password change might be enforced from time to time, but the credentials remain linked to that particular account. This is what distinguishes users from roles.
- IAM Roles – a role doesn’t define permissions but a method of authentication: an operator/user/machine assumes it and receives temporary credentials. An IAM Role may be assumed by any permitted user in order to gain temporary access to a particular operation. In AWS every action is an API call, which means that every such request must be authenticated and authorised.
- IAM User and IAM Roles are means of authentication, not authorisation! So which entity is responsible for controlling our permissions?
- IAM Policies – documents in JSON format; their structure allows access to be defined at a very detailed level. Every request sent to the AWS API is verified to answer questions such as: is this principal allowed to perform the intended action? Is it allowed to perform that action on this particular resource? Is it allowed to perform the action from a particular source, within a defined period of time, and so on?
There are plenty of ways of granting and denying access. IAM Policies may be assigned to roles, users and groups of users – the last of these entities lets us manage our collection of permissions quickly.
- IAM Groups – a user can belong to many groups, a group can consist of many users and, importantly, groups may not be nested.
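To make the structure of these JSON documents concrete, here is a minimal sketch of a read-only IAM policy expressed as a Python dictionary (the bucket name and IP range are hypothetical examples). It grants S3 read access on one resource and uses a Condition block to restrict the source of the request – the same “who / on what / from where” questions described above.

```python
import json

# A sample IAM policy document (hypothetical bucket name) that allows
# read-only access to a single S3 bucket, but only from one IP range.
read_only_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowS3ReadFromOffice",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-reports-bucket",
                "arn:aws:s3:::example-reports-bucket/*",
            ],
            # The Condition narrows *where* the request may come from.
            "Condition": {"IpAddress": {"aws:SourceIp": "203.0.113.0/24"}},
        }
    ],
}

print(json.dumps(read_only_policy, indent=2))
```

Such a document can then be attached to a user, a role, or – most conveniently – a group.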
AWS Key Management Service
Responsible for encrypting and decrypting our data; most AWS services, such as EC2, S3 and RDS databases, support these protections. How does this solution work and how does it protect our data?
AWS KMS uses a Master Key (the main key, used to encrypt Data Keys) and Data Keys (keys used to encrypt portions of data) – a model known in cryptography as hierarchical key architecture. This model is implemented by AWS.
Many services offered by AWS automatically support the use of data encryption by means of KMS.
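The hierarchical (envelope) model can be sketched in a few lines of Python. This is a toy illustration of the key hierarchy only – the XOR “cipher” below stands in for real AES, and in real KMS the master key never leaves the service:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher (NOT real cryptography): XOR the data with a
    SHA-256-derived keystream. It merely stands in for AES here."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# 1. The master key stays inside KMS; here it is just random bytes.
master_key = secrets.token_bytes(32)

# 2. A fresh data key is generated per object...
data_key = secrets.token_bytes(32)
# ...and stored only in wrapped (encrypted) form, under the master key.
wrapped_data_key = keystream_xor(master_key, data_key)

# 3. The actual payload is encrypted with the plaintext data key.
ciphertext = keystream_xor(data_key, b"customer record #42")

# Decryption reverses the hierarchy: unwrap the data key first.
recovered_key = keystream_xor(master_key, wrapped_data_key)
plaintext = keystream_xor(recovered_key, ciphertext)
print(plaintext)  # b'customer record #42'
```

The point of the hierarchy is that compromising one data key exposes only one object, while the master key can be rotated without re-encrypting every payload.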
What is decrypting actually about?
When using IAM Roles, we send a request for data to an S3 Bucket, assuming the data are encrypted with a Master Key. For data stored in encrypted form we need additional IAM Policy permissions allowing the author of the request to use the key that encrypted the data. Without this permission, even though we might have access to the Bucket and the objects stored in it, the API will deny us access.
While this request is being processed, the S3 service attempts to decrypt the data. This is not directly visible from the user’s perspective (the attempt will show up in CloudTrail or the audit log).
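The “additional permission” mentioned above is a KMS statement in the caller’s IAM Policy. A minimal sketch, again as a Python dictionary (the key ARN and account ID are made up), might look like this:

```python
# Hypothetical key ARN – in practice, the ARN of the KMS key that
# encrypts the bucket's objects.
kms_key_arn = (
    "arn:aws:kms:eu-west-1:123456789012:"
    "key/11111111-2222-3333-4444-555555555555"
)

# Without kms:Decrypt on this key, S3 read requests for encrypted
# objects fail even if the caller can otherwise read the bucket.
kms_access_statement = {
    "Sid": "AllowUseOfBucketKey",
    "Effect": "Allow",
    "Action": ["kms:Decrypt", "kms:GenerateDataKey"],
    "Resource": kms_key_arn,
}
```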
If somebody doesn’t trust AWS with data encryption, they can always encrypt their data locally in their application, send it in encrypted form and decrypt it client-side as well.
Amazon Virtual Private Cloud (VPC)
This is the last mechanism aimed at protecting our data. It is used to create logical, separate network spaces within the AWS platform, in which we manage our resources. Each space is a separate entity with its own definable IPv4 or IPv6 address range.
By default it is separated from any other network, including the Internet. In a VPC we fully control the network configuration: we can divide our space into subnetworks – public ones, where we can launch web servers, and private ones, where we can launch resources such as a database or an application server inaccessible to anyone from the outside.
How is this related to security in AWS?
We can create public and private subnetworks and configure a subnetwork in such a way that no one can connect to it from the outside (one-way communication). We may also create Security Groups and Network Access Control Lists.
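Planning that split into public and private subnetworks is ordinary CIDR arithmetic, which can be sketched with Python’s standard ipaddress module (the 10.0.0.0/16 range is just an example):

```python
import ipaddress

# A hypothetical VPC address space; AWS VPCs take an IPv4 CIDR block
# between /16 and /28.
vpc = ipaddress.ip_network("10.0.0.0/16")

# Carve the space into /24 subnetworks and assign the first two:
subnets = list(vpc.subnets(new_prefix=24))
public_subnet = subnets[0]   # e.g. web servers behind an Internet Gateway
private_subnet = subnets[1]  # e.g. a database, no route to the Internet

print(public_subnet)   # 10.0.0.0/24
print(private_subnet)  # 10.0.1.0/24
```

Each /24 here holds 256 addresses; the public/private distinction itself comes from route tables and gateways, not from the numbering.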
The mechanisms above allow us to protect our data, services and infrastructure, but don’t miss the documentation shared by AWS, where you will find even more information:
- 5 pillars of AWS Well-Architected Framework (Security)
- AWS Security Competency Partners
- other services are available at aws.amazon.com/products/security
Read the article: The 5 Pillars of the AWS Well-Architected Framework: II – Security.
Now we can answer the question: should I worry that a stronger wind will blow my cloud away?
I don’t think so – most paths and problems have already been explored and solved. Based on previous experience, Amazon shares best practices, recommendations and suggestions with us – the people co-responsible for data and infrastructure security.
RST Software Masters – an AWS Select Consulting Partner with many years of experience in building microservices. We combine our expertise in business process automation and robotisation with over 20 years of experience in creating highly scalable cloud software.