Grzegorz Wierzchanowski
Technical Architect
Reviewed by a tech expert

Cutting costs of microservices in the cloud using containerisation and ECS – part 2/2


After reading the previous part ("Cutting costs of microservices in the cloud using containerisation and ECS – part 1/2"), you already know what options you have to configure your infrastructure in AWS, and which of them you should choose in your project. Now, you will learn ways to decrease the costs of environments built for developing your product.

If you want to start your microservices adventure, check out the “Microservices or Monolith” free e-book.


How to cut costs of temporary environments?

So, how can AWS help us decrease the maintenance costs of temporary environments? Here are a few tricks that allowed us to cut the costs fourfold:

Auto Scaling to zero

The ability to automatically scale services up is not that useful in test environments, as we simply don't need to increase our resources there. However, we can scale them down to zero. Why would we want to do that? For an obvious reason: after working hours, when people have gone home, our EC2 instances are still running and still generating costs.

ECS lets you set a trigger that changes the auto scaling settings at a given time. By configuring it to shut the entire cluster down at 5:00 p.m., we make sure AWS doesn't generate unnecessary costs overnight. Likewise, we set another trigger for 7:00 a.m. so that at least one EC2 instance is available to the teams again.

We can do that by modifying the settings of the Auto Scaling Group in EC2.


You can find the appropriate option in the AWS console, under EC2.


After selecting a group, we can schedule specific actions for it.


When creating a trigger, we can define its start time, its recurrence (for example, it can be limited to weekends only), and its end time.
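As a rough sketch, the same two triggers could also be created with the AWS SDK for Python (boto3) instead of the console; the group name, hours, and cron expressions below are placeholders you would adapt to your own cluster.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Hypothetical name of the Auto Scaling Group backing the test ECS cluster.
GROUP_NAME = "dev-ecs-cluster-asg"

# Scale the group down to zero at 5:00 p.m. UTC, Monday to Friday.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName=GROUP_NAME,
    ScheduledActionName="stop-after-hours",
    Recurrence="0 17 * * 1-5",  # cron expression, evaluated in UTC by default
    MinSize=0,
    MaxSize=0,
    DesiredCapacity=0,
)

# Bring one instance back at 7:00 a.m. UTC before the teams start work.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName=GROUP_NAME,
    ScheduledActionName="start-before-work",
    Recurrence="0 7 * * 1-5",
    MinSize=1,
    MaxSize=1,
    DesiredCapacity=1,
)
```

The API route is simply convenient when you manage many groups or want the schedule kept in version control; the console screens above do exactly the same thing.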

With a containerised infrastructure on ECS, we can run the entire environment inside a single cluster. Thanks to that, we can control all our services through one Auto Scaling Group, or disable any individual service regardless of cluster scaling, because each service's scaling is defined separately in ECS. After 7:00 a.m., everything will be configured just the way it was at 5:00 p.m. on the previous day.
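For completeness, here is a minimal, hypothetical example of disabling a single service inside the cluster by setting its desired task count to zero; the cluster and service names are placeholders.

```python
import boto3

ecs = boto3.client("ecs")

# Hypothetical cluster and service names.
ecs.update_service(
    cluster="dev-cluster",
    service="orders-service",
    desiredCount=0,  # ECS stops all tasks; restore the previous count to bring the service back
)
```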

EC2 Spot instances

Using the Launch Configuration, we can define which EC2 instances will be created for our group (Auto Scaling Group). One way to save money here is to use Spot instances, which are much cheaper but carry the risk of the instance being reclaimed when the Spot market price rises above your bid. Fortunately, in DEV and RC environments, constant availability is not a priority, so you can easily set Spot as your preferred choice. With this one simple step, you can lower EC2 instance costs by up to 90% for all microservices in your test environments.


During configuration, simply select the right option and define the maximum price you are willing to pay for leasing processing power.
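A minimal sketch of such a Launch Configuration created with boto3 could look like the snippet below; the AMI ID, instance type, and price are placeholder values, and the SpotPrice parameter is what makes the group request Spot instances.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Placeholder values: use the ECS-optimised AMI and instance type that fit your cluster.
autoscaling.create_launch_configuration(
    LaunchConfigurationName="dev-ecs-spot-lc",
    ImageId="ami-0123456789abcdef0",
    InstanceType="t3.medium",
    IamInstanceProfile="ecsInstanceRole",
    SpotPrice="0.015",  # maximum hourly price (USD) you are willing to bid
)
```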

RDS – one instance for multiple databases

Obviously, a microservices architecture is designed to keep all databases separate: if one service fails because of database-related issues, other services must not be affected. However, this general rule can be bent in a developer-only environment. RDS is another service that can be very costly, and creating multiple databases in one RDS instance lets you save precious dollars. All services should read their database access settings from external configuration (AWS Parameter Store or environment variables), so pointing them at a single instance and creating the appropriate databases and users is simple.
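For illustration, a service could resolve its database settings from Parameter Store at start-up roughly like this; the parameter path convention (/dev/&lt;service&gt;/db/&lt;setting&gt;) and the service name are assumptions made for the example.

```python
import boto3

ssm = boto3.client("ssm")

def db_setting(name: str) -> str:
    """Fetch one database setting for this service from Parameter Store."""
    # Hypothetical naming convention: /dev/<service>/db/<setting>
    response = ssm.get_parameter(
        Name=f"/dev/orders-service/db/{name}",
        WithDecryption=True,  # decrypt SecureString values such as passwords
    )
    return response["Parameter"]["Value"]

# Build the connection string against the single shared RDS instance.
db_url = (
    f"postgresql://{db_setting('user')}:{db_setting('password')}"
    f"@{db_setting('host')}:{db_setting('port')}/{db_setting('name')}"
)
```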

Add-on: Instance Scheduler

And what if our RDS instance must serve many services, each one operating on sizeable datasets? We must ensure that the instance has sufficient resources or simply start scaling it up, and that can still cost a lot of money. Enter Instance Scheduler, a ready-made solution to this problem. You can use Instance Scheduler effectively even if you decide not to consolidate your databases into a single RDS instance.

This tool can be quickly deployed in our environment using simple instructions and the automation provided by CloudFormation.


Instance Scheduler is a collection of several AWS services that work together to trigger appropriate actions in RDS and EC2 instances.

After launching the tool, we add a tag (with the key defined during installation) to the RDS or EC2 instances that should be suspended and resumed. The tool can manage EC2 instances too, but unfortunately not those controlled by an Auto Scaling Group: the group will bring the instances right back after the Scheduler suspends them. Next, in a DynamoDB table, we set the time window during which our databases must be fully available. Of course, the data is safe; stopping an instance does not delete anything, so nothing will disappear from the database by the next day.
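As an example of the tagging step, the snippet below adds the scheduling tag to a shared RDS instance with boto3; the ARN and schedule name are placeholders, and the tag key must match whatever was chosen during the Instance Scheduler installation (commonly "Schedule").

```python
import boto3

rds = boto3.client("rds")

# Placeholder ARN of the shared dev database; the schedule name must exist
# in the Instance Scheduler configuration table in DynamoDB.
rds.add_tags_to_resource(
    ResourceName="arn:aws:rds:eu-west-1:123456789012:db:dev-shared-postgres",
    Tags=[{"Key": "Schedule", "Value": "office-hours"}],
)
```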

Using ECS and following these tips can save you hundreds of dollars each month. Teams will also appreciate how simple it is to roll out changes in containers and to instantly scale services for testing purposes.

If you want to learn about the rules that govern the world of microservices, read the “Microservices or Monolith” free e-book.
