Lead Image © zelfit, 123RF.com

Kubernetes for small and medium-sized enterprises

Advantage Small

Article from ADMIN 69/2022
We look at the benefits of Kubernetes outside of large corporate environments.

Hype is not unusual in IT and is initially met with an innate sense of distrust by admins, who tend not to want to deal with all of that new-fangled stuff. This was the case with cloud computing – which many thought was a flash in the pan – as well as during the rise to fame of the fleet orchestrator Kubernetes, which many observers categorically ruled out.

In 2022, it's clear that Kubernetes and containers are here to stay. In many places, the benefits of the solution are so huge that admins quickly turned from container skeptics to container enthusiasts, especially in large companies and corporate environments. According to a survey by SUSE [1], more than 60 percent of companies whose IT budgets exceed EUR10 million per year have cloud-native applications, whereas cloud-native is a distant also-ran among smaller enterprises.

It's easy to get the impression that containers offer little or no added value for smaller companies. But is that really true? Do small and medium-sized enterprises (SMEs) really have nothing to gain from Kubernetes if their IT fleet is smaller (i.e., if they don't have a server fleet of thousands of devices)? In this article, I explore this question and show how even smaller companies can benefit from Kubernetes.

Defining the Terms

To identify the benefits of modern technologies, even for smaller enterprises, it is useful to define the terms clearly. Although terms such as container, Kubernetes, cloud-native, and microservices are mixed up in public discourse and sometimes used as synonyms, one thing is clear: Containers and Kubernetes are not the same thing, and the use of containers does not mandate the use of modern applications that follow the microservices architecture approach.

Therefore, it makes sense to look at containers and fleet management separately to identify the sweet spots for SMEs. Initially, I look exclusively at containers. By definition, a container is simply the filesystem of a (minimal) Linux system in combination with an application installed in it. Depending on the container technology used, you have the option of connecting volumes to a container as persistent storage, which ensures that conventional applications such as databases can be operated in a meaningful way in containers.
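
To make this concrete, the following short sketch uses the Python Docker SDK (docker-py) to start a MariaDB container with a named volume as persistent storage. It assumes a local container runtime is available; the image tag, volume name, and password are only examples.

# Minimal sketch: run a MariaDB container with a persistent volume, using the
# Python Docker SDK (docker-py). Image tag, volume name, and credentials are
# illustrative only -- adjust them to your environment.
import docker

client = docker.from_env()  # talk to the local container runtime

# Create (or reuse) a named volume that holds the database files
client.volumes.create(name="mariadb-data")

# Start the container; the volume is mounted at MariaDB's data directory,
# so the data survives even if the container itself is removed
db = client.containers.run(
    "mariadb:10.6",
    name="mariadb",
    detach=True,
    environment={"MARIADB_ROOT_PASSWORD": "example"},
    volumes={"mariadb-data": {"bind": "/var/lib/mysql", "mode": "rw"}},
)
print(db.status)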

Classic Setup

To discuss the practical benefits of containers in everyday life, it helps to take a look at the classic IT setup of the noughties, which is still in use in many places today. Rest assured, it has very little in common with the modern container world.

In these typical setups, each system is assigned a fixed task. Because not even virtualization plays a role in many cases, a static link exists between a system and the application that runs on it. The underpinnings for such a system are almost always one of the classic Linux distributions: Think AlmaLinux or Ubuntu.

Although many companies claim to have achieved a high degree of automation, these kinds of systems are often tedious, manually maintained one-off installations. Because every modern distribution comes with a package manager (e.g., RPM or DPKG, Figure 1), companies still make extensive use of them in many places. The required userland software is installed on the system as a package, and the admin configures it either manually or with some kind of automation. So far, so familiar.

Figure 1: In classic environments, managing software is the package manager's job, which often causes massive dependency issues because packages from different sources collide.

Every admin with some professional experience has heard the term "dependency hell," which refers to a problem that often occurs on Linux systems when the admin cannot, or does not want to, make do with the package pool offered by their distribution of choice and includes external software repositories. In many cases this situation is unavoidable.

If you want up-to-date software on an older distribution, your only route is typically the software vendor's own repository, and if the vendor then decides that the package collection included in, say, CentOS 7 is not enough and that the Extra Packages for Enterprise Linux (EPEL) repository is also required, it's the administrator who has to make it work. When updating to a newer version of the Linux distribution, it is not uncommon for such systems to blow up in the administrator's face. The parties involved – CentOS, EPEL, and the software vendor – then usually sit back and deny any responsibility, which is little consolation if you are left without working systems.

Linux distributors recognized this problem several years ago and actively countered it with containers. The big advantage of a container is that it comes with its entire userland in tow. A ready-made container image can be run without any worries on any system that has a runtime for containers – even if it has no additional software, except for the basic components and the container environment.

Because this approach also offers huge advantages to Linux distributors and software vendors, containers have established themselves as the de facto standard for delivery. Containers give software vendors in particular the ability to deliver their solutions to customers exactly as tested in their own labs. Instead of building a package for every version of every distribution, split into RPM and DEB variants, they build a single container image that runs on any Linux distribution with a container runtime (Figure 2).

Figure 2: In container-based environments, applications and their entire userland run in the container. Dependencies no longer play a role, which is not always the case with the package manager for your choice of distribution.
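
As a rough illustration of this workflow, the following sketch again uses the Python Docker SDK to build a single image from a build directory; the directory name, image tag, and registry are hypothetical examples.

# Rough sketch: build one container image for all target distributions with the
# Python Docker SDK. The directory ./myapp (hypothetical) is assumed to contain
# a Dockerfile and the application; the registry name is an example.
import docker

client = docker.from_env()

image, build_log = client.images.build(
    path="./myapp",
    tag="registry.example.com/myapp:1.0",
)
print(image.tags)

# The same image can now be pushed to a registry once and run unchanged on any
# host with a container runtime, regardless of the distribution installed there.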

Simplification and Greater Efficiency

Simplified management of individual systems should already be incentive enough for small businesses to take a closer look at container technology. After all, if the IT department is not particularly large, you depend on a high level of efficiency. The less overhead the production systems cause in daily operations, the better from the company's point of view. Containers fulfill this requirement first at the application level: If an application can be rolled out in next to no time without external dependencies, requiring only a configuration file and persistent storage, the overhead is significantly lower than installing and configuring a package.

Moreover, this procedure simplifies updates. Administrators periodically update their container workloads by downloading the new image, stopping the old container, and connecting the existing persistent volume to a new container launched from the fresh image. Most applications that manage data recognize records from a previous version and automatically migrate them, as is the case with MariaDB, for example. Immediately afterward, the new version of the database is up and running. Not only that: If something goes wrong during the update, the old image and a snapshot of the persistent storage make it relatively easy to return to the state before the update.
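
The update flow just described could look roughly like the following sketch, once more with the Python Docker SDK; the container and volume names match the earlier example and are purely illustrative. In practice you would snapshot the volume before removing the old container.

# Rough sketch of a container update: pull the new image, stop and remove the
# old container, then start a replacement attached to the same persistent
# volume. Names and tags are illustrative only.
import docker

client = docker.from_env()

client.images.pull("mariadb", tag="10.11")   # download the new image first

old = client.containers.get("mariadb")       # the currently running container
old.stop()
old.remove()

# Start the replacement from the fresh image with the existing data volume;
# per the article, MariaDB recognizes data from the previous version on
# startup and migrates it.
client.containers.run(
    "mariadb:10.11",
    name="mariadb",
    detach=True,
    environment={"MARIADB_ROOT_PASSWORD": "example"},
    volumes={"mariadb-data": {"bind": "/var/lib/mysql", "mode": "rw"}},
)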

Containers also offer simplification and improved efficiency at the system level. If the company's systems only run containers, many everyday maintenance tasks are eliminated. For example, the host no longer has to be a full CentOS installation; instead, you can use a minimal operating system such as CoreOS. Although such an operating system can't do much apart from running containers, that is exactly what the doctor ordered.
