DevOps. It’s the buzzword of the moment. According to Gartner, it’s growing so fast in popularity that a quarter of Global 2000 organizations will deploy it by 2016.
So what exactly is it? And why is everyone talking about it?
In simple terms, DevOps brings the development and operations teams within an organization closer together. It might sound straightforward, but it’s a little more complex than that. It’s an approach to software development that strengthens communication, collaboration, integration and automation.
We believe that at the core of DevOps are four crucial elements: speed, quality, control and cost. Speed is fundamental to competitive execution and market positioning. Quality is vital to successful implementation and long-term viability. Control -- command of data use, security, access and process -- is crucial to safe operations. And cost, of course, is a pivotal consideration in nearly all business decisions.
Need for Speed
Speed is essential in the enterprise, and the pressure for faster execution constantly increases. Applications are a top priority for CIOs because the results they deliver are a top focus of CEOs, customers and shareholders.
Every line of business will regularly have new requirements. A new app, or changes that improve existing apps, are vital. So if the essential IT mission is to host applications, everything that surrounds them needs to happen quickly. Speed is critical.
Speed challenges can be divided into two categories -- process and physics.
Consider this process scenario -- before data virtualization -- where a developer needs a copy of a production database for a new project. Data access is requested and a ticket submitted for approval. Next, someone needs to provision compute and then storage for the physical data copy. Typical elapsed time is two weeks to a month. Maybe more. Seldom less.
There’s still the physics problem. Suppose the first physical production data copy is ready. Now additional physical copies are needed for prototyping, QA, user testing and so on. Time is eaten by the need to create each one.
Now, introduce data virtualization to automate workflows, enable on-demand data access and deliver provisioning measured in minutes instead of days. For DevOps, it’s an accelerator that enables rapid development, testing, release and refresh of applications with as much as 90 percent reduction in provisioning times. Data virtualization means application development with an exceptional edge. It means incredible speed.
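To make that concrete, here is a minimal sketch of what self-service provisioning might look like against a hypothetical data virtualization API. The VirtualDataService client and its provision_copy method are illustrative assumptions, not any specific vendor’s product:

```python
from datetime import datetime, timezone

class VirtualDataService:
    """Hypothetical client for a data virtualization API; names are illustrative."""

    def provision_copy(self, source_db: str, requester: str) -> dict:
        # A virtual copy shares unchanged blocks with an existing snapshot,
        # so provisioning is mostly metadata work: minutes of elapsed time
        # instead of a multi-week ticket queue and a full physical restore.
        return {
            "source": source_db,
            "owner": requester,
            "created": datetime.now(timezone.utc).isoformat(),
            "status": "ready",
        }

service = VirtualDataService()
dev_copy = service.provision_copy("prod-orders-db", requester="dev-team-a")
print(dev_copy["status"])  # self-service: no ticket queue, ready in minutes
```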
Improve Quality
The transition of application development to a comprehensive DevOps environment is a natural, if not necessarily smooth, progression, and quality is no less important than speed. The intent is simple: accelerate quality software development. Catch and fix bugs early. Minimize last-minute surprises. Release products on time.
If business differentiation comes in part from the quality and sophistication of applications, DevOps helps. It’s designed to remove the barriers between operations and engineering -- to improve quality. It can smooth and speed communications and help integrate development, operations and Quality Assurance functions. Customer satisfaction improves and profits increase. Done right, DevOps becomes a connected, interactive and collaborative unit that makes businesses more successful.
Application development is a complex and demanding process. With access to multiple near-instant virtual data copies, quality issues and time delays are substantially reduced. And because virtual copies consume minimal storage, teams gain the fundamentals of scalability, consistency, data control, automation and ease of use. All of that helps DevOps improve quality while accelerating development and release cycles.
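Continuing the hypothetical VirtualDataService sketch from the previous section, much of that quality benefit comes from fanning out one lightweight copy per stage of the pipeline, so prototyping, QA and user testing never wait on one another:

```python
# Reuses the illustrative VirtualDataService and service object from the
# earlier sketch; the stage names here are examples only.
stages = ["prototyping", "qa", "user-testing"]

# One near-instant virtual copy per stage, instead of serial physical restores.
copies = {stage: service.provision_copy("prod-orders-db", requester=stage)
          for stage in stages}

for stage, copy in copies.items():
    print(f"{stage}: {copy['status']}")
```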
Control Is Key
In product development, data has to be both accessible and secure. It’s a tricky balancing act, made all the more difficult by excess physical copy growth. Every additional data copy enlarges the "attack surface".
So, the idea is to create fewer physical copies, decrease the number of security targets, mask sensitive data, create an audit trail and reduce overall risk.
Control of sensitive data starts with reducing excess physical copies. What’s essential is that the system incorporates key technical standards and multiple levels of data security, addressing physical, virtual and hybrid environments. It should be fast, and simple to understand and operate. And it should support and reinforce broader enterprise security strategies.
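As a minimal sketch of what masking and auditing might look like (the field names and policy below are hypothetical, and a simple one-way hash stands in for production-grade masking), sensitive columns can be scrubbed before a copy is handed out, with every request logged for audit:

```python
import hashlib
import logging

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("provisioning-audit")

SENSITIVE_FIELDS = {"ssn", "credit_card", "email"}  # hypothetical policy

def mask_record(record: dict) -> dict:
    """Replace sensitive values with irreversible hashes before sharing."""
    return {key: hashlib.sha256(str(value).encode()).hexdigest()[:12]
            if key in SENSITIVE_FIELDS else value
            for key, value in record.items()}

def provision_masked_copy(records: list, requester: str) -> list:
    # Audit trail: who received which data set, and how much of it.
    audit_log.info("masked copy provisioned for %s (%d records)",
                   requester, len(records))
    return [mask_record(r) for r in records]

customers = [{"name": "Pat", "ssn": "123-45-6789", "city": "Boston"}]
print(provision_masked_copy(customers, "qa-team")[0]["ssn"])  # hashed, not raw
```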
Reducing Cost
Cost is always an essential element. The truism of IT budgets always applies: "Do more with less". The budget never grows, and that energizes the search for more efficient ways to operate. That pressure makes CIOs, CTOs and even CEOs open to new thinking -- and to transformational change.
We know that adopting DevOps is a very effective way to improve application quality and time-to-market. Data virtualization adds significantly to those benefits.
In effect, costs are reduced in multiple dimensions -- staff costs, capital costs, quality costs, complexity costs and, most important in the eyes of many business leaders, the cost of time. The right data management process helps businesses operate faster and more efficiently. As new software reaches customers sooner, organizations add value faster. They become more competitive and see direct impacts on revenue and profitability.
DevOps has become a key part of enterprise IT planning; where it hasn’t yet been adopted, it will certainly be considered.
What’s clear is that, combined with data virtualization, DevOps can be an incredibly powerful tool in a changing business world where gaining a competitive advantage is paramount.
Ash Ashutosh is the CEO of Actifio.
Published under license from ITProPortal.com, a Net Communities Ltd Publication. All rights reserved.