There are many ways to do DevOps, but one thing is certain, as noted in several prior blog posts: DevOps is a culture, a mindset shift, and a prescriptive approach, rather than a process to be implemented in one specific way. Because it is much more than a process, there are multiple paths to get there, and what works best is for each enterprise to determine. Fundamentally, the agonizing gap between Dev and IT Ops has to be eliminated, and an enterprise must do everything in its control to bring that gap to near zero.
In exploring the options for achieving Continuous Delivery, I realized that the biggest divide lies in the vast difference between the development environment and the environments across the rest of the enterprise. This leads to a last-minute scramble by the Dev or IT Ops teams, and typically a Hail Mary effort to get the release completed. This trend must be arrested, and it can be. Applying the practices already discussed in "Are Deployments Standing Between You and the Happy Hour" will certainly help. In addition, taking a container approach can alleviate some of the struggles an enterprise goes through on this journey.
First, let's take a look at what is needed.
When evaluating new IT investments such as DevOps, open source is a great way to figure out the key aspects of DevOps, what you can expect from it, and whether it fits your needs. It also helps you evaluate commercial offerings better, and determine whether they solve a problem that open source technologies do not. But as with most open source, there is a catch: figuring out which tools will give you an end-to-end solution to your problem and, most importantly, how to stitch them all together so they play well with each other. Mind you, there isn't one tool that solves all your woes; it's all about orchestrating the available resources.
DevOps is a chain and a platform that demands every player contribute, and contribute big. Hence the need for multiple tools covering planning, source code control, build and integration, automated testing, configuration management, deployment, and monitoring.
Below are a few tools that could be considered:
- Git or SVN for Source Code Control
- Jenkins for Build and Integration
- Cucumber, Selenium for Testing
- Chef, Puppet or Ansible for Configuration Management
- Nagios, Icinga for Monitoring
The challenge I have seen at most enterprises is that this becomes a toolset they invest in, and they end up hiring a team, "the DevOps team," to manage these tools, at times losing sight of the end goal: bringing everyone together, rather than acquiring a toolset or a team. Let's pause that discussion right there and get back to tools. Setting up these tools can be challenging, given the requirements of each, the integrations between them, and the environments they demand. Without bringing on board a ton of individual SMEs, what is the quickest way to set them all up without having to invest in servers? Enter Docker.
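As an illustrative sketch of how quickly one of these tools can come up (assuming Docker is installed locally; the image name and ports are the public Jenkins defaults, and the volume name is my own choice), a single command stands up a build server without provisioning a dedicated machine:

```shell
# Pull and run Jenkins in a container; no dedicated server required.
# -d runs it detached, -p maps the web UI (8080) and agent (50000)
# ports to the host, and the named volume persists Jenkins state
# across container restarts.
docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  jenkins/jenkins:lts
```

The same pattern applies to the other tools in the list; each runs in its own isolated container on the same host.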
I have followed containerization for a while now, and Docker is the most popular containerization platform. With Docker Compose, it supports multi-container deployments and brings a lot of flexibility to how a containerized application (one stop to address all your integration woes) can function. Essentially, the developer might as well be working on a replica of the production container, and so can everyone else in the value chain: automated testers, release, configuration management, et al. The one argument I hear at this stage is "why not AWS or similar virtual infrastructure?" I'll address containers versus virtualization in a different post, but to summarize: Docker, or any container platform, wins this argument through ease of instantiation, convergence, and integration, resulting in reduced lead times and eliminated wait times.
Consider this: during the planning phase, the Dev, Test, Release, and Ops teams can together define the actual infrastructure, including the base OS, database versions, middleware and services stack, the runtime environments, and the application itself, in the same set of images. Docker Compose then lets you describe your end-to-end DevOps environment in a docker-compose.yml file (with a Dockerfile per custom image), and a single "docker-compose up" brings the whole environment up and running.
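A minimal sketch of such a Compose file, wiring up a few of the tools from the list above (the image tags, the community Nagios image, and the port mappings are illustrative choices, not prescriptive):

```yaml
# docker-compose.yml - an illustrative DevOps toolchain in one file.
version: "3.8"
services:
  jenkins:                  # build and integration
    image: jenkins/jenkins:lts
    ports:
      - "8080:8080"
    volumes:
      - jenkins_home:/var/jenkins_home

  selenium:                 # automated browser testing
    image: selenium/standalone-chrome:latest
    ports:
      - "4444:4444"

  nagios:                   # monitoring (community image; swap for Icinga if preferred)
    image: jasonrivers/nagios:latest
    ports:
      - "8081:80"

volumes:
  jenkins_home:
```

Running "docker-compose up -d" in the directory containing this file starts all three services on one host; the same file, checked into source control, gives every team the identical environment.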
This "converged isolation" eliminates the variations that can occur across environments, whether Dev, integration, or production, giving developers a superpower: developing, building, and testing in a production-like environment from the get-go, and bringing more agility and velocity across the lifecycle.
Once these tools are orchestrated and integrated, you have a true end-to-end DevOps environment ready to use for your purposes, and when you are ready, you can use any container service to deploy that environment for your full-blown implementation.
Note that it is the orchestration of the different tools across the lifecycle, not the tools themselves, that is key to achieving end-to-end DevOps. If you are sure that implementing DevOps is the way forward for your organization to achieve higher quality and continuous delivery, but are not sure where to start, Qentelli provides orchestrated engineering, quality intelligence, and continuous delivery services and products to accelerate your implementation.