As access to virtually unlimited compute resources continues to grow, enterprises and consumers can do things that could not be dreamt of even a few years ago. At the core of this shift are the Social, Mobility and Analytics trends powered by Cloud technologies. As Public Clouds mature and become more secure, the inhibitions to moving to the Cloud have begun to disappear. Enterprises want to cloudify their own Datacenters, learning from Public Cloud providers, to derive the benefits of agility, elasticity and self-service.
The world is going digital at an ever faster pace, putting enormous stress on Datacenters. Weighed down by technical debt, however, Datacenters are unable to change as quickly as the business environment demands.
This paper provides an overview of how CIOs should transform the underlying elements of Datacenters – Infrastructure, Applications, People and Processes – so they can face this change quickly and confidently.
Digital and its Impact on Datacenters
What does ‘Digital’ or ‘Digitalization’ mean? One of the early examples of large-scale digitalization was the conversion of mail to e-mail: communications became digital. Today everything, from commerce to learning, is becoming digital.
The spotlight is now on ‘Digital’ because of the exponential rate at which it is disrupting traditional processes and well-established behaviors. Fuelling this disruption are technologies such as Cloud, Internet of Things (IoT), Mobility, Social platforms and real time Analytics. These technologies have opened fresh opportunities to digitize, triggering a race between enterprises to innovate, increase efficiency and engage with customers.
Consumer behavior is changing, new insurance models are being built, and the number of steps to complete a financial transaction has decreased, with even something as deeply entrenched as physical currency threatening to become digital. Paper files are vanishing. Sensors, actuators, microcontrollers and system-on-chip devices are becoming affordable enough to build cheap robots and drones. Even governments are being overthrown by social media.
The next decade will see more technology change than the last 40 years put together. End users will demand Always-On, high-performance experiences from applications and services; data volumes and velocity will multiply; and perimeters will have to be broken to expand into newer geographies or to enable much tighter B2B integration. There will be more machines talking to each other than humans talking to machines, and machines will continuously learn, identify patterns and predict events in a world where ‘Digital’ is becoming pervasive. Meanwhile, enterprises are forced into this disruption while most of their IT infrastructure is still legacy.
At the core of digital is data. A consequence of the accelerated growth in digitalization is the new set of demands placed on Datacenters. Traditional application architectures are not flexible enough to adapt to the growing data velocity, volume and variety. In addition, the IoT is driving decentralization of Datacenters, and data security in borderless enterprises is taking new forms.
Envisioning the New Datacenter
The new Datacenter eco-system must respond to the emerging business conditions dictated by a digital world. This means bringing renewed focus to speed of deployment, availability, flexibility, scalability, agility, maintainability, security and affordability.
Programmable Infrastructure: Traditionally, when the business needed, say, 10 more servers, the process included procurement, installation, configuration, testing and roll-out, and could take weeks, if not months. Today, using virtualization, 10 virtual servers can be deployed in minutes. As infrastructure abstraction grows more sophisticated, more and more control is moving from hardware to software. It is not just humans, but also software and applications, that will need to provision or deprovision Datacenter components dynamically based on business demand. This becomes possible when the components of the infrastructure - compute, network, storage and associated services such as backup, archival, DNS and application services - are programmable. Every Datacenter component must be exposed through APIs that allow components to talk to each other and allow administrators to orchestrate changes faster. This is a shift from the traditional manual, time-consuming process, and it forms the basis of a Software Defined Datacenter in which configuration and state monitoring are programmable.
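To make this concrete, the following minimal sketch shows what an API-driven provisioning request could look like. The endpoint, token and payload fields are purely illustrative and do not refer to any particular product's API.

```python
# Minimal sketch of API-driven provisioning (hypothetical endpoint and payload).
import time
import requests

API = "https://dc.example.com/api/v1"        # hypothetical SDDC controller
HEADERS = {"Authorization": "Bearer <token>"}

def provision_servers(count, profile="web-small"):
    """Ask the controller for `count` virtual servers and wait until they are ready."""
    resp = requests.post(f"{API}/servers",
                         json={"count": count, "profile": profile},
                         headers=HEADERS, timeout=30)
    resp.raise_for_status()
    job_id = resp.json()["job_id"]

    # Poll the job until the infrastructure reports the servers as running.
    while True:
        status = requests.get(f"{API}/jobs/{job_id}", headers=HEADERS, timeout=30).json()
        if status["state"] in ("completed", "failed"):
            return status
        time.sleep(10)

if __name__ == "__main__":
    print(provision_servers(10))   # the "10 servers in minutes" scenario above
```

The same pattern extends to deprovisioning, DNS records or backup policies; the point is that the request is made by software, so it can be triggered directly by business demand.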
Pervasive Fabric: Future Datacenters will be hybrid and distributed in nature. Such Datacenters necessitate a single pane of glass for provisioning, migrating, managing and monitoring the infrastructure and applications deployed across different Clouds, providing workload portability and making the network and service location immaterial to the end-user.
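As a rough illustration of what such a fabric implies for tooling, the sketch below assumes a hypothetical common interface that each Cloud-specific adapter implements, so that inventory and placement logic can be written once and applied across Clouds.

```python
# Sketch of a uniform abstraction over several clouds (all names are illustrative).
from abc import ABC, abstractmethod

class CloudAdapter(ABC):
    """Each private or public cloud plugs into the fabric through this interface."""
    @abstractmethod
    def list_workloads(self): ...
    @abstractmethod
    def deploy(self, image, size): ...

class OnPremAdapter(CloudAdapter):
    def list_workloads(self):
        return [{"name": "erp-db", "location": "on-prem"}]
    def deploy(self, image, size):
        return {"name": image, "location": "on-prem", "size": size}

class PublicCloudAdapter(CloudAdapter):
    def list_workloads(self):
        return [{"name": "web-frontend", "location": "public"}]
    def deploy(self, image, size):
        return {"name": image, "location": "public", "size": size}

def single_pane_inventory(adapters):
    """One view across all clouds, regardless of where workloads physically run."""
    return [w for a in adapters for w in a.list_workloads()]

print(single_pane_inventory([OnPremAdapter(), PublicCloudAdapter()]))
```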
Web-Scale Agility: A handful of large web-scale companies already make more than 200 changes to their applications every day. All enterprises will want the same web-scale agility within their own Datacenters. A DevOps approach of automated development, testing and release management, along with microservices-friendly infrastructure, is necessary to achieve it.
Illustration: DevOps Approaches
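A minimal sketch of the automation behind such a pipeline is shown below; the test, build and deploy commands are placeholders for an enterprise's own tooling.

```python
# Sketch of an automated build-test-release flow (commands are placeholders).
import subprocess
import sys

def run(cmd):
    print(f"==> {' '.join(cmd)}")
    return subprocess.run(cmd).returncode == 0

def pipeline():
    # Each stage gates the next; a failure stops the release automatically.
    stages = [
        ["pytest", "-q"],                              # automated tests
        ["docker", "build", "-t", "svc:latest", "."],  # package the service
        ["./deploy.sh", "staging"],                    # hypothetical deploy script
    ]
    for stage in stages:
        if not run(stage):
            sys.exit("pipeline failed at: " + " ".join(stage))
    print("released to staging")

if __name__ == "__main__":
    pipeline()
```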
Cognitive: Datacenters will add intelligence to their systems so that they develop the capability to learn and auto-scale to meet changing business demands, predict failures and self-heal. Pattern recognition and machine learning technologies will enable this development, now possible with the availability of event and performance data and the reduction in data storage costs.
Illustration: Cognitive IT
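In its simplest form this is metric-driven automation, as in the toy sketch below: a moving average over performance data triggers scale-out before demand overwhelms the service. A real system would replace the fixed threshold with a learned model; the numbers here are arbitrary examples.

```python
# Toy illustration of metric-driven auto-scaling (thresholds are arbitrary examples).
from collections import deque

WINDOW = 6           # samples in the moving average
SCALE_OUT_AT = 75.0  # % CPU above which capacity is added

recent = deque(maxlen=WINDOW)

def observe(cpu_percent, current_instances):
    """Feed in a utilisation sample; return the suggested instance count."""
    recent.append(cpu_percent)
    avg = sum(recent) / len(recent)
    if len(recent) == WINDOW and avg > SCALE_OUT_AT:
        return current_instances + 1          # self-adjust before users notice
    return current_instances

instances = 3
for sample in [60, 70, 78, 81, 85, 88, 90]:
    instances = observe(sample, instances)
print("suggested instances:", instances)
```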
Security: Security concerns have been the biggest inhibitor to expanding Datacenters. Fears around compromised security have grown further as compute perimeters break down. But once new virtual boundaries are mapped and enterprise defense-in-depth policies are made software defined, security policies can easily be applied to objects irrespective of where those objects physically reside, in public or private clouds.
Illustration: Floating Security Principals and Policy Objects
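The sketch below captures the intent in a few lines, using hypothetical tag and rule names: the policy is attached to the workload object rather than to a physical location, so it resolves identically whether the workload runs on-premises or in a public cloud.

```python
# Sketch of location-independent, software-defined security policy (names illustrative).
POLICIES = {
    # tag on the object           -> rules that travel with it
    "pci-scope":    {"encrypt_at_rest": True,  "allowed_ports": [443]},
    "internal-app": {"encrypt_at_rest": False, "allowed_ports": [80, 443]},
}

def effective_policy(workload):
    """Resolve policy from the workload's tags, wherever it happens to run."""
    rules = {"encrypt_at_rest": False, "allowed_ports": []}
    for tag in workload["tags"]:
        if tag in POLICIES:
            rules["encrypt_at_rest"] |= POLICIES[tag]["encrypt_at_rest"]
            rules["allowed_ports"] += POLICIES[tag]["allowed_ports"]
    return rules

payments = {"name": "payments-api", "tags": ["pci-scope"], "location": "public-cloud"}
print(effective_policy(payments))   # same result if location were "on-prem"
```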
Application Architectures and Development Practices: How long does it take to upgrade a largely monolithic and resource-hungry ERP workload to a newer version? How long does it take to bring up a DR environment in case of a disaster? How long does it take to migrate assets from one Datacenter to another? How much do you worry about redundancy at all levels to keep the infrastructure always on? The new fast-paced world doesn’t have time for all of this. That is why the trend of tightly coupling applications into what is effectively a monolithic black box is being reversed. In its place, microservices are evolving: small, autonomous services with minimal or no coupling. They are stateless, start quickly, are portable and can be scaled out, which makes DevOps integration easier than with traditional application architectures.
Illustration: Monolithic vs Microservices Architecture
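For contrast with the monolith, the sketch below is a minimal stateless service built with only Python's standard library. Because it holds no state of its own, any number of identical copies can be started or stopped to scale out or recover; the endpoint and port are illustrative.

```python
# Minimal stateless microservice sketch (standard library only, port is illustrative).
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class PriceService(BaseHTTPRequestHandler):
    """A tiny, single-purpose service: look up a price. No session, no local state."""
    def do_GET(self):
        body = json.dumps({"item": self.path.strip("/"), "price": 42.0}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), PriceService).serve_forever()
```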
Financial Accountability: Businesses will want to consume Datacenter services on a utility-based OpEx model, with a swipe of a card. This means creating shared infrastructure with the ability to cross-charge business units based on usage. As a consequence, CIOs will become more financially accountable and will need to build business value realization models. New KPIs will emerge, such as ‘Revenue per unit of Compute’ and ‘On-prem vs Off-prem utilization mix’.
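A rough sketch of what usage-based cross-charging and a ‘Revenue per unit of Compute’ KPI could look like, with purely illustrative rates and figures:

```python
# Sketch of usage-based cross-charging and a simple KPI (all figures are illustrative).
RATE_PER_VCPU_HOUR = 0.05   # hypothetical internal rate, in dollars

usage = [   # metered consumption per business unit
    {"unit": "retail-banking", "vcpu_hours": 12000, "revenue": 90000},
    {"unit": "cards",          "vcpu_hours":  4000, "revenue": 52000},
]

for u in usage:
    chargeback = u["vcpu_hours"] * RATE_PER_VCPU_HOUR
    revenue_per_compute = u["revenue"] / u["vcpu_hours"]
    print(f"{u['unit']:>15}: charge ${chargeback:,.2f}, "
          f"revenue per vCPU-hour ${revenue_per_compute:.2f}")
```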
Illustration: Building Blocks of the New Datacenter
Is the IT Pro Dying?
Charles Darwin said, “It is not the strongest of the species that survives, nor the most intelligent, but the one most responsive to change.” After years of installing and configuring systems and servers, or managing and backing up databases, server, system and database administrators have to face a new reality: systems are being automated and their roles are being redefined. The future of the IT pro lies in adding smartness to infrastructure. The new world doesn’t want the “learned”; it needs “learners”. Programming or scripting, which is necessary for adding smartness to infrastructure, is not difficult: it no longer requires complicated logic or syntax. Acquiring scripting skills is a necessity for anyone who wants to continue as an IT pro of the future.
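As an example of how little complicated logic or syntax is involved, the short script below automates a routine administrative task, flagging volumes that are running out of space; the mount points and threshold are illustrative.

```python
# A routine admin task as a few lines of script (paths and threshold are examples).
import shutil

THRESHOLD_PERCENT = 15   # warn when free space drops below this

for mount in ["/", "/var", "/data"]:
    try:
        usage = shutil.disk_usage(mount)
    except FileNotFoundError:
        continue                       # skip mounts that do not exist on this host
    free_pct = usage.free / usage.total * 100
    if free_pct < THRESHOLD_PERCENT:
        print(f"WARNING: {mount} has only {free_pct:.1f}% free")
```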
Transformation Approach
While it is practically impossible to rip and replace decades of technical debt, enterprises can take a phased, two-step approach to transformation.
Illustration: The Transformation Journey
Summary
It is time to rethink Datacenters to meet tomorrow’s needs. Businesses are changing, digital technologies are creating fresh opportunities, the perimeters of the enterprise are breaking down and conventionally developed skills are becoming obsolete. We need to fly, yet remain grounded.
It’s an interesting inflexion point and the good news is that these changes are bringing CIOs and Datacenter architects back to the forefront to lead their organizations into the digital era.
Govindaraj Rangan
Govindaraj Rangan (Govind) has 19 years of industry experience across the breadth of the technology spectrum – Application Development to IT Operations, UX Design to IT Security Controls, Presales to Implementation, Converged Systems to Internet of Things, Strategy to Hands-on. With his zeal to remain state-of-the-art, he is quick to develop deep hands-on expertise in emerging technologies and apply it in real customer scenarios. He is currently working on solutions to Cloudify Enterprise Datacenters, expanding their boundaries into Public Clouds and experimenting with IoT and Robotics to build the Datacenter for the Digital Era. He has an M.B.A. from ICFAI University specializing in Finance, an M.S. in Software Systems from BITS Pilani and a B.E. (EEE) from Madras University. He is MCSE, CISSP, PMP and ITIL Foundation certified.
Email: govindaraj.rangan@wipro.com