Opcito’s recent blog, DevOps A Culture More Than A Practice, highlighted how DevOps aims to establish a culture and environment that can transform the IT industry by improving performance, increasing productivity and efficiency, and encouraging innovation. The simplest and most general way to put DevOps is: breaking down the barriers between your development and IT operations teams to strike harmony between processes, leading to faster, coordinated delivery of software. However, to attain this synchronization between your Dev and Ops teams, there should be a medium over which the two can communicate and compute. The majority of businesses talk about making significant progress in their DevOps initiatives, but an actual DevOps pipeline can present plenty of difficulties. A shared environment can be one of them, where your IT processes have to wait in a long queue for their turn to utilise resources. Virtualization, especially of your machines, can solve some of these problems, but it too offers a limited pool of storage and compute, and you have to wait for human approvals to utilise it. Even if the approval process is automated, the limited set of resources still poses a problem that needs to be answered.
This problem can be solved by virtualization at the operating-system level, which allows multiple isolated instances, which we call containers, to run on a single host. Containers make it easier to host multiple applications inside portable environments, which solves many of the problems associated with DevOps.
How do containers solve the problems associated with the DevOps pipeline?
Containers can solve the problems associated with Dev and Ops teams by establishing proper communication and making the overall DevOps team organised, efficient, and secure, thanks to the following attributes:
- Consistent environment for development, testing, and production
- Faster deployment and easy updates installation
- Support for multiple technologies and frameworks
- A rich ecosystem of tools like Docker, Gitflow, Jenkins, Kubernetes, and Ansible
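To make the first attribute, a consistent environment across development, testing, and production, more concrete, here is a minimal, hypothetical Dockerfile sketch (the base image, app name, and port are assumptions, not from the original post) that packages an application together with its runtime:

```dockerfile
# Hypothetical example: package a small Python web app into a portable image.
FROM python:3.11-slim

WORKDIR /app

# Install pinned dependencies so every environment gets identical versions.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and declare how the container starts.
COPY . .
EXPOSE 8000
CMD ["python", "app.py"]
```

Because the image is built once and then promoted unchanged through dev, test, and production, the “it works on my machine” class of problems largely disappears.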
Chris Morgan, Technical Director at Red Hat, OpenShift Partner Ecosystem, stated in his keynote at the 2016 All Day DevOps conference, “DevOps has a 74% adoption rate (at least on a path towards full adoption) at the enterprise level.” The same survey revealed another important figure: container adoption in enterprises stands at just 18%. So, if the container environment provides all the benefits we have already discussed, why is the rate so low? There are a few difficulties in deploying containers on multiple hosts and in packaging, along with concerns like security, integration, and deployment flow. As far as container security is concerned, containers can be secured at the orchestrator level with infrastructure monitoring and security tools like Prometheus and RBAC (provided you are using Kubernetes), and at the container level with static vulnerability analysis using tools like Docker’s scanning services, Twistlock Trust, and Clair, which can be integrated into the CI/CD pipeline. Another possible solution is to use CIS-certified container images (CIS publishes security benchmarks for containers). In addition, you can conduct periodic network security audits. Beyond security, there are a few more things you should consider when going for containers.
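To make the orchestrator-level RBAC point concrete, here is a minimal sketch of Kubernetes RBAC (the namespace, role, and service-account names are assumptions for illustration) that restricts a CI service account to read-only access on pods in a single namespace:

```yaml
# Hypothetical example: a namespaced Role granting read-only access to pods,
# bound to a CI service account. All names here are assumptions.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  namespace: ci
  name: pod-reader
rules:
- apiGroups: [""]
  resources: ["pods"]
  verbs: ["get", "list", "watch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: ci-pod-reader
  namespace: ci
subjects:
- kind: ServiceAccount
  name: ci-bot
  namespace: ci
roleRef:
  kind: Role
  name: pod-reader
  apiGroup: rbac.authorization.k8s.io
```

Scoping automation accounts to the minimum verbs and resources they need is one of the simplest ways to limit the blast radius if a pipeline credential leaks.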
What are the things to consider before you adopt containers for DevOps?
There are a lot of aspects that need to be considered before adopting any software environment. Likewise, these are the aspects you need to consider before you introduce containers in your DevOps:
- Support infrastructure
- Dependency and integration testing: integration with databases, third-party services, and new containers; dependencies on hosts and environments; and the capacity of containers and hosts
- Scalability of your production environment
- Contingency plans in case of container failure
- Container monitoring systems and tools
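As one concrete example of a contingency plan for container failure, an orchestrator like Kubernetes can restart unhealthy containers automatically. A hedged sketch (the image name and health-check path are assumptions, not from the original post):

```yaml
# Hypothetical example: restart the container if its health endpoint stops responding.
apiVersion: v1
kind: Pod
metadata:
  name: web
spec:
  restartPolicy: Always          # restart the container whenever it fails
  containers:
  - name: web
    image: example/web:1.0       # assumed image name
    livenessProbe:
      httpGet:
        path: /healthz           # assumed health endpoint
        port: 8000
      initialDelaySeconds: 10
      periodSeconds: 5
```

The liveness probe turns “what happens when a container dies?” from a manual runbook step into automated self-healing, which also feeds directly into your monitoring story.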
More than just containers
If you are going to leverage containers for DevOps and are thinking about the aspects we discussed in the earlier part of the post, there are some inherent questions that need to be answered, which can set the course of your container initiative and, as a result, your DevOps. Where to host your containers is one of them, and there are two basic answers: on-premise and the cloud, which can also be combined. Each has its own pros and cons; let’s consider them one by one.
DevOps, Containers, and On-premise:
The meaning of on-premise is changing fast, as cloud characteristics like a high degree of virtualization and relative independence from hardware constraints are now easily applicable to on-premise setups. It is much like a private cloud, with the hardware present on your site and maintained by you. Many major players support on-premise deployment of DevOps and containers. Generally, this is advisable for organisations with a limited set of instances and dev-test cycles that can be easily managed within the walls of your on-premise facility. Direct access to hardware, security, and local monitoring are some of the advantages, and you always have the option to migrate to the cloud gradually.
DevOps, Containers, and the Cloud:
This is advisable when you want to save yourself the hassle of maintaining infrastructure, or when you have many monolithic processes running and are not able, or willing, to move to microservices. In addition, you can leverage virtualization, continuous delivery, and similar features to your advantage.
On-premise + Cloud:
This option gives you the advantages of both on-premise and the cloud. You can deploy an application on containers in the public cloud, with some functions running on on-premise containers. This kind of arrangement gives you control over security and access while retaining the flexibility and cost-effectiveness of the cloud, especially the public cloud. Another approach is to develop and test locally and then deploy in the cloud.
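The “develop and test locally, deploy in the cloud” approach can be sketched as a CI pipeline. Here is a hypothetical GitHub Actions workflow (the registry address and image name are assumptions for illustration) that builds and tests the same container image that eventually ships to a cloud registry:

```yaml
# Hypothetical CI workflow: build and test a reproducible image, then push it to the cloud.
name: build-test-deploy
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build image
        run: docker build -t registry.example.com/myapp:${{ github.sha }} .
      - name: Run tests inside the image
        run: docker run --rm registry.example.com/myapp:${{ github.sha }} python -m pytest
      - name: Push to cloud registry
        run: docker push registry.example.com/myapp:${{ github.sha }}
```

Because the tests run inside the very image that gets pushed, the artifact validated locally or in CI is byte-for-byte the one deployed to the cloud.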
Organisations everywhere are adopting the DevOps culture, and containers can help you get more out of your DevOps. We have seen the problems associated with DevOps and how containers can ease your way through them. There are aspects you need to consider while planning your container initiative. Choosing the right option for container hosting and addressing the security and monitoring aspects can be a task, but Opcito’s DevOps and container expertise can solve the problems associated with hosting and implementation. As I said earlier, DevOps is about setting the right coordination between your dev and IT ops teams, and with the help of containers, you can take your DevOps pipeline to the next level. Plus, you have a number of tools like Docker, Mesos, Kubernetes, CoreOS, Gitflow, Jenkins, and Ansible that can reduce your automation, orchestration, and monitoring efforts. If we strike the right chords, we can go from requirements to production code in very little time.