Why Bikes Are Like (The New) Data Center Containers

Posted by: Dominic Cheah

Containers have been around for years, and there's not a person on earth who hasn't used them to organize, pack, and carry stuff. Containers are versatile and readily available, and it's hard to imagine organizing our lives without them.

Containers are also available in data centers. These containers are ephemeral and allow applications and their dependencies to run in resource-isolated processes: in this case, modular application blocks. In my opinion, container platforms such as Docker, and orchestrators such as Kubernetes, are analogous to bicycles.

A basic bicycle is simple yet effective. Because bicycles can easily navigate around traffic and deliver any package quickly and efficiently, they're the perfect solution for making quick deliveries in cities. Bicycles are also easy to build and mass-produce. So are containers.


Choices, choices…stateless or stateful?

Containers are immutable, meaning that container images are consistent and cannot be modified once built. Containers without persistent storage are stateless and do not keep any information from previous sessions. With stateful containers, application data survives outages and the removal of containers. Application data is durable because it resides on persistent storage, independent of the containers. Because stateful containers leverage persistent storage elements (volumes), those persistent volumes can also be shared with other containers.
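As a minimal sketch of the stateful case, a Docker Compose file can attach a named volume to a container; the service, image, and volume names below are illustrative, not from any particular deployment:

```yaml
# docker-compose.yml - hypothetical stateful service
services:
  db:
    image: postgres:15
    volumes:
      # Named volume: data survives container removal and restarts
      - db-data:/var/lib/postgresql/data

volumes:
  db-data:   # persistent volume, managed independently of the container
```

Removing and recreating the `db` container leaves the contents of `db-data` intact, and the same volume could be mounted into another container.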


Making Dev and Ops teams get along with containers

Containers are popular in continuous integration and continuous deployment (CI/CD). Containers in CI/CD eliminate the finger-pointing and unknowns between Dev and Ops teams because the source and target environments can be consistently built from the same base image. Using containers, software (applications) can be built and deployed on standard images, regardless of what the target environment may look like.
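The "same base image" idea can be sketched with a Dockerfile; the base image tag and application files here are hypothetical examples, but the point is that both the CI build and the production deploy start from an identical, pinned image:

```dockerfile
# Dev, CI, and Ops all build from the same pinned base image
FROM node:18-alpine

WORKDIR /app
COPY package*.json ./
RUN npm ci              # reproducible install from the lockfile
COPY . .
CMD ["node", "server.js"]
```

Because the image is built once and promoted unchanged through environments, "it works on my machine" disputes largely disappear.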

Containers can also be used for microservices, such as a web hosting service that requires resiliency and elasticity for a consistent user experience anytime, anywhere. This is possible because loosely coupled services can be developed and deployed separately, without depending on each other. With containers, hundreds or thousands of microservices can easily be connected to form an application.
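A Compose file can wire independently built services together, with each one scaled or redeployed on its own; this is a sketch, and the image names are hypothetical:

```yaml
# docker-compose.yml - two loosely coupled services (names illustrative)
services:
  web:
    image: nginx:alpine
    ports:
      - "8080:80"
    depends_on:
      - api
  api:
    image: my-api:1.0   # hypothetical application image, built separately
```

The `web` and `api` services are developed and versioned independently; either can be replaced or scaled without rebuilding the other.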

Some would also agree that containers can help modernize large-scale monolithic applications. Managing and maintaining legacy systems can consume a lot of valuable IT resources because most enterprise applications need to be resilient, highly available, and highly scalable. Containers are lightweight and very efficient at packaging and isolating application workloads. Because containers allow services to be built independently of each other, those services can run in hundreds or thousands of containers on the same physical or virtual server without issues. With containers, organizations can easily consolidate data centers and integrate IT efficiently.


Just try it, don’t even buy it

There are many container solutions to choose from. For example, the Docker Enterprise Edition Trial is available for free. I believe the best way to learn a new technology is to deploy it and try it out. Docker containers are supported on most infrastructure and can run on VMs and on bare metal Linux servers.

Deploying Docker is easy. On a Linux system, make sure that the latest stable Docker version is picked up by adding the Docker repository with the following command:

  • sudo yum-config-manager --add-repo <DOCKER-EE-URL>/centos/docker-ee.repo

Next, run yum install to install Docker on the Linux host, as shown in the following command:

  • sudo yum install docker-ee

It is just that simple to deploy a container solution such as Docker on a Linux host.


Infrastructure matters, even with containers

A well-built infrastructure ensures your commute is smooth, safe, and free of issues. It is no different for container technology. Container solutions such as Docker should be deployed on an infrastructure solution that ensures application data is always available, durable, and secure, and that can scale out based on data center requirements. IT administrators should have the choice of deploying Docker in VMs or on bare metal Linux hosts without having to look at multiple vendor solutions.

The approach to data availability and security with containers should not force anyone to make compromises: application data should always be protected in use, in flight, and at rest without sacrificing performance. Open Convergence is built on these tenets. It's a well-built (no-compromise) infrastructure for today's IT organization.
