The ever-changing landscape of modern applications

Mohamed Abukar
5 min read · Aug 18, 2021

Containerising applications has become the norm in the IT/DevOps industry. It allows developers to package their application code and all of its dependencies into a lightweight, portable environment, so that applications can be deployed virtually anywhere.

Before going into containers, we must first understand why they were designed.

Let’s say a Python application has been developed; the next logical step is to deploy it. This can be complex as well as time-consuming, because a running app requires a myriad of configurations, ranging from the allocation of resources to setting up environment variables.

Another consideration is portability. If a developer needs to change the deployment environment or move to another cloud provider, they will need to repeat many steps of the deployment process. Once again, this can be a tedious process.

Containers solve these issues efficiently by packaging the application’s codebase together with all of its required dependencies, including tools, libraries, config files and runtimes. This process of containerisation produces a lightweight and (depending on how it’s done) secure package known as a container image.

This image can then be used to deploy a containerised application in any environment required, thus giving the developers the much-needed freedom to design and develop apps without having to think about other constraints. Some common container runtimes are Docker, containerd and CRI-O.
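To make the packaging step concrete, here is a minimal sketch of a Dockerfile for the hypothetical Python application mentioned earlier. The file names (`app.py`, `requirements.txt`), port and environment variable are assumptions for illustration, not taken from the article:

```dockerfile
# Start from a small official Python base image
FROM python:3.9-slim

WORKDIR /app

# Copy and install dependencies first, so this layer is
# cached between builds when only the source code changes
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the application source
COPY . .

# Configuration that would otherwise be set up by hand
# is baked into the image definition
ENV APP_ENV=production
EXPOSE 8000

CMD ["python", "app.py"]
```

Building this with `docker build -t my-app:1.0 .` produces a container image that can be run unchanged on a laptop, a server or any cloud provider with a container runtime.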

Cargo ships in reference to containers, credit: GETTY IMAGES

Many large IT organisations became frustrated with four-to-six-month release cycles and sluggish deployments. Organisations like Netflix began to embrace microservices, helping them overtake competitors who had similar plans. Applications began to be built as microservices, and companies transformed towards a more DevOps and Agile culture; larger teams were split into smaller ones to enable this transformation.

DevOps cycle releases

Then issues arose whilst building microservices. Among them was containerising applications so that they could be deployed in any environment, including a developer’s own machine. As mentioned before, containers solved this issue, which then led to another: managing all of these containers. This is where the story of Kubernetes begins.

So, Kubernetes is a container orchestration platform or in simple terms, an application management platform.
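As a small illustration of what “orchestration” looks like in practice, here is a sketch of a Kubernetes Deployment manifest that asks the platform to keep three copies of the container image from earlier running at all times. All names (`my-app`, `my-app:1.0`, port 8000) are hypothetical:

```yaml
# Sketch of a Deployment: Kubernetes will create and manage
# the requested number of replicas, replacing any that fail.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
spec:
  replicas: 3          # desired number of running copies
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
      - name: my-app
        image: my-app:1.0   # the container image built earlier
        ports:
        - containerPort: 8000
```

Rather than starting containers by hand, you declare the desired state and Kubernetes continuously works to make reality match it.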

Fun fact: the name Kubernetes stems from the ancient Greek word for “helmsman” (one who steers a ship), which explains the K8s logo. Another common name for Kubernetes is “K8s”, so called because there are eight letters between the “K” and the “s” in “Kubernetes”.

Note: From now on, K8s may be used to refer to Kubernetes sporadically.

A ship’s helm, depicting the Kubernetes control mechanism. credit: GETTY IMAGES

Why did Kubernetes overtake its competitor OpenStack?

Although K8s and OpenStack have generated similar levels of interest, they are slightly different from each other. In the end, Kubernetes succeeded where OpenStack fell behind. The reason is Kubernetes’ focus on delivering business value and on the bigger picture: enterprise IT applications. OpenStack, on the other hand, orchestrated infrastructure, and no matter how good your infrastructure management is, if you are not solving business problems, what use is it?

Okay, so how does Kubernetes help me or my company?

As mentioned before, Kubernetes enables a microservice approach to building modern applications, and larger teams can be split into smaller ones. This leads to a much faster time to market: applications are continuously updated and new versions are released sooner.

Costs of IT infrastructure become more optimised. K8s can help your organisation cut infrastructure costs significantly, especially when operating at a huge scale. Prior to K8s, IT and systems administrators commonly over-provisioned their infrastructure to handle unexpected spikes. K8s schedules and packs containers together intelligently, taking the available resources into consideration.

Nowadays, the success of an application doesn’t merely depend on its features, but also on its scalability: if an app cannot scale properly, it will not perform at its best. Kubernetes is critical here, as it scales apps and improves their performance. Let’s say we have a CPU-intensive service whose load is quite dynamic, changing with business conditions. The solution required is something that automatically scales the app up when load increases and back down when it reduces. Kubernetes offers just that capability: it adds replicas of the app when CPU usage goes above a certain threshold and removes them afterwards, optimising infrastructure utilisation. This auto-scaling feature, known as the Horizontal Pod Autoscaler (HPA), is not limited to resource metrics such as CPU; custom metrics can also be used to trigger auto-scaling.
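The CPU-based scaling behaviour described above can be sketched as an HPA manifest. The target names and thresholds here are illustrative assumptions; note that utilisation-based scaling requires CPU resource requests to be set on the target Deployment’s containers:

```yaml
# Sketch of a Horizontal Pod Autoscaler: adds or removes
# replicas of a Deployment to keep average CPU near 70%.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: my-app-hpa
spec:
  scaleTargetRef:        # which workload to scale (hypothetical name)
    apiVersion: apps/v1
    kind: Deployment
    name: my-app
  minReplicas: 2         # never scale below this
  maxReplicas: 10        # cap to control cost
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70   # assumed threshold
```

When average CPU across the pods rises above the target, Kubernetes adds replicas (up to `maxReplicas`); when load drops, it scales back down, so capacity tracks demand instead of being permanently over-provisioned.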

This article mentions only a few of the main benefits of Kubernetes; its many other benefits could be discussed for days.

Here is a list of Kubernetes user case studies showing how K8s has transformed their organisations:

And for more learning on Kubernetes:

What is next?

I’m currently on a journey to learn more about the ever-growing ecosystem of containers, container orchestration and the DevOps culture. Expect to see more articles and open-source contributions. Stay tuned for more!

Will be sharing this learning journey in these places:


Mohamed Abukar

AWS Community Builder | Red Hat Accelerator | Platform Engineer @ Trainline