Is containerisation the right way to go? How to make the right choice
There was a time when it was a simple choice: dedicated or shared. That decision is still there to be made, but for many, there are additional considerations about what’s under the hood. What is containerisation and would it suit your needs? What kind of server hosting is required to run your containers? And what is “serverless”? Fasthosts offers a demystifying overview of the different options available, how they could be used, and who is likely to use them.
Traditionally, you would run an application via a web hosting package or dedicated server with an operating system and a complete software stack. But now, there are other options.
Containerisation, or operating-system-level virtualisation, uses a platform such as Docker to run isolated instances known as containers. A container is a package of software that includes everything needed for a specific application, functioning like a separate server environment. Because they share a single OS kernel, multiple containers can run on one server or virtual machine (VM) without interfering with each other. To the user, a container feels like its own unique environment, irrespective of the host infrastructure.
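To make that a little more concrete, here’s a minimal sketch using the Docker SDK for Python (docker-py). It assumes Docker is installed locally and that the current directory contains a Dockerfile for a hypothetical “my-app” service; the image tag, port numbers and container name are purely illustrative.

```python
# Minimal sketch with the Docker SDK for Python (docker-py).
# Assumes Docker is running locally and the current directory holds
# a Dockerfile for a hypothetical "my-app" service.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Build an image from the Dockerfile: the runtime, libraries and code
# the application needs are all packaged into this one image.
image, _build_logs = client.images.build(path=".", tag="my-app:latest")

# Run the image as an isolated container, mapping container port 5000
# to port 8080 on the host.
container = client.containers.run(
    "my-app:latest",
    detach=True,
    ports={"5000/tcp": 8080},
    name="my-app-instance",
)

print(container.status)  # the same image behaves the same wherever it runs

# Containers are disposable: stop and remove this one when finished.
container.stop()
container.remove()
```

The same image can be shipped to a laptop, a dedicated server or a cloud VM and started in exactly the same way, which is what makes containers so portable.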
Containers can perform tasks that would otherwise require a whole server or VM, while consuming far fewer resources. They’re lightweight and agile, allowing them to be deployed, shut down and restarted at a moment’s notice, and easily transferred across hardware and environments. Because containers are standalone packages, they behave reliably and consistently for everyone, all the time, regardless of the local configuration.
When we talk about container orchestrators, you’ll find that Kubernetes is mentioned most often. There are several out there, but Kubernetes is the leading container orchestration tool, filling a vital role for anyone who needs to run large numbers of containers in a production environment – on one or more dedicated servers, for example. Kubernetes automates the deployment, scheduling and management of containerised applications. It automatically scales containers across multiple nodes (servers or VMs) to meet current demand and performs rollouts seamlessly, while also enabling containerised applications to self-heal: if a node fails, Kubernetes restarts, replaces or reschedules containers as required.
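As an illustration of that declarative approach, here’s a minimal sketch using the official Kubernetes Python client. It assumes a working cluster and kubeconfig; the “my-app” image, labels and namespace are hypothetical placeholders rather than anything specific to Fasthosts.

```python
# Minimal sketch with the official Kubernetes Python client.
# Assumes a reachable cluster and a valid kubeconfig; the image,
# names and namespace below are illustrative only.
from kubernetes import client, config

config.load_kube_config()  # use local kubeconfig credentials
apps_v1 = client.AppsV1Api()

# A Deployment asks Kubernetes to keep three replicas of the container
# running; if a node fails, the missing pods are rescheduled elsewhere.
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="my-app"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "my-app"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "my-app"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="my-app",
                        image="my-app:latest",
                        ports=[client.V1ContainerPort(container_port=5000)],
                    )
                ]
            ),
        ),
    ),
)

apps_v1.create_namespaced_deployment(namespace="default", body=deployment)

# Scaling to meet demand is a single declarative change: patch the
# replica count and Kubernetes moves the cluster towards that state.
apps_v1.patch_namespaced_deployment_scale(
    name="my-app",
    namespace="default",
    body={"spec": {"replicas": 5}},
)
```

Kubernetes continually compares the declared state (three, then five replicas) with what is actually running, and creates or reschedules containers until the two match – which is where the self-healing behaviour comes from.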
As with traditional web hosting solutions, you can choose whether to run your containers in a shared or a dedicated environment. A shared environment will likely give you the best value for money if you have relatively small workloads that won’t fully utilise the resources of a whole cluster of nodes (VMs or servers). But if you have larger workloads or regulatory obligations to meet, a dedicated environment, or even your own cluster, may be required.
In serverless computing, the orchestrator will automatically stop, start and scale the container on the infrastructure best placed to handle the demand at that time. This means that the developer has even less to be concerned about; code runs automatically, with no need to manually configure the infrastructure. Costs are also minimised, with all instances of a container automatically shut down when demand for it disappears.
“Microservices” is another term often used when discussing containers. Simply put, a traditional application is built as one big block, with a single file system, shared databases and a common language across its various functions. A microservices application looks much the same from the outside, but behind the scenes its functions are broken down into individual components: for example, a product service, a payment service and a customer review service. Technologies like Docker and Kubernetes provide the platform and management tools to implement this, enabling microservices to be lightweight and run anywhere. Microservices can technically be built on traditional server hosting, but the practical reality of creating and maintaining a full microservices architecture demands a container platform like Docker and an orchestration tool like Kubernetes.
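To give a flavour of what one such component might look like, here’s a minimal sketch of a hypothetical product service, using Flask purely as an example framework. The routes and data are invented for illustration; a real service would own its own database and run alongside separate payment and review services in their own containers.

```python
# Minimal sketch of a hypothetical "product service" microservice,
# using Flask as an example framework. Data and routes are illustrative.
from flask import Flask, jsonify

app = Flask(__name__)

# Illustrative in-memory data; a real service would own its own data store.
PRODUCTS = {
    1: {"id": 1, "name": "widget", "price": 9.99},
    2: {"id": 2, "name": "gadget", "price": 4.99},
}

@app.route("/products/<int:product_id>")
def get_product(product_id):
    product = PRODUCTS.get(product_id)
    if product is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(product)

if __name__ == "__main__":
    # Listen on all interfaces so the service is reachable inside a container.
    app.run(host="0.0.0.0", port=5000)
```

Packaged into its own container image, a service like this can be versioned, deployed and scaled independently of the rest of the application.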
Fasthosts remains focussed on building on these systems, with container technology firmly placed as the platform of the future.
Product manager Gavin Etheridge is confident about the future of containers: “Our CloudNX Apps & Stacks services are already built on these technologies, and we continue to take what we’ve learned and apply it to all our products. We use these technologies internally – we’ve been the guinea pigs ourselves – and our underlying platforms have become more resilient, with the additional benefits of self-healing. In the years to come, the development and adoption of containers will likely continue to accelerate.”