Gartner: Future of the Data Center is Software-Defined
September 30, 2015

The software-defined data center (SDDC) is crucial to the long-term evolution of an agile digital business, according to Gartner, Inc. It is not, however, currently the right choice for all IT organizations.

"Infrastructure and operations (I&O) leaders need to understand the business case, best use cases and risks of an SDDC," said Dave Russell, VP and Distinguished Analyst at Gartner. "Due to its current immaturity, the SDDC is most appropriate for visionary organizations with advanced expertise in I&O engineering and architecture."

An SDDC is a data center in which all the infrastructure is virtualized and delivered "as-a-service." This enables increased levels of automation and flexibility that will underpin business agility through the increased adoption of cloud services and enable modern IT approaches such as DevOps. Today, most organizations are not ready to begin adoption and should proceed with caution.

By 2020, however, Gartner predicts the programmatic capabilities of an SDDC will be considered a requirement for 75 percent of Global 2000 enterprises that seek to implement a DevOps approach and a hybrid cloud model.
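To make the "programmatic capabilities" Gartner refers to concrete: in an SDDC, compute, storage, and networking are all requested through APIs rather than manual tickets. The sketch below is a hypothetical, minimal illustration — the `Datacenter` class and its methods are invented for this example and do not correspond to any vendor's actual API.

```python
# Hypothetical sketch: programmatic provisioning in an SDDC.
# All names here are invented for illustration; real SDDC stacks
# expose similar concepts through vendor- or cloud-specific APIs.

class Datacenter:
    """Toy model of an SDDC control plane: everything is an API call."""

    def __init__(self):
        self.resources = []

    def provision(self, kind, **spec):
        # In a real SDDC this would invoke the virtualization layer;
        # here we simply record the declared resource.
        resource = {"kind": kind, **spec}
        self.resources.append(resource)
        return resource


dc = Datacenter()
# Compute, storage, and networking are all requested the same way --
# as code, which is what enables the automation and DevOps workflows
# the article describes.
dc.provision("vm", cpus=4, ram_gb=16)
dc.provision("volume", size_gb=500)
dc.provision("network", cidr="10.0.0.0/24")

print(len(dc.resources))  # 3
```

Because every resource request is code, it can be versioned, reviewed, and automated end to end — the property that makes an SDDC a prerequisite for the DevOps and hybrid cloud approaches Gartner predicts.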

"I&O leaders can't just buy a ready-made SDDC from a vendor," said Russell. "First, they need to understand why they need it for the business. Second, they need to deploy, orchestrate and integrate numerous parts, probably from different vendors." Moreover, beyond the deployment work itself, new skills and a cultural shift in the IT organization are needed to ensure this approach delivers results for the business.

Gartner recommends that I&O leaders take a realistic view of the risks and benefits, and make plans to mitigate the top risks of an SDDC project failure:

Assess skills and culture

Simply swapping a legacy infrastructure for a set of software-defined products is unlikely to yield the desired benefits. Before an activity is automated and self-service is implemented, the process associated with the IT service needs to be completely rethought and optimized. This may require new skills and a different culture from what currently exists within certain IT organizations. "A broken process is still a broken process, no matter how well it is automated," said Mr. Russell. "Build the right skills in your organization by enabling top infrastructure architects to experiment with public cloud infrastructure in small projects, as well as giving them the opportunity to get out and learn what their peers in other organizations and visionaries in this field are doing."

Know when the time is right

The right time to move to an SDDC may be years away for most organizations, but for many it will come sooner than their preparations allow. "The first step is understanding the core concepts of the SDDC," said Mr. Russell. "Then, I&O leaders should examine the available solutions, starting with one component, process or software-defined domain that can benefit. The final stage is to plan a roadmap to full deployment if and when SDDC solutions are appropriate."

Moreover, I&O leaders must realize that the technology is still nascent. Even the more established software-defined areas, such as networking and storage, are still maturing and seeing early-stage adoption. Implementing in phases is recommended, once it has been established that the solutions in the market deliver enough functionality, interoperability and production-proven deployment history to be viable. "Storage can be a compelling starting point, as the capabilities often stack up favorably against traditional solutions," said Mr. Russell.

Beware of vendor lock-in

Open-source standards or a cloud management platform may help IT organizations reduce vendor lock-in, but lock-in cannot be eliminated altogether. There are also no universal standards in place for infrastructure APIs, so adopting and coding to a particular API results in a degree of lock-in. It is vital to understand the trade-offs at work and the costs of migration or exit when choosing vendors and technologies.
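One common way to manage the API lock-in described above is to code against an internal abstraction and confine each vendor's API to an adapter. The sketch below is hypothetical: the provider classes are invented stand-ins for whatever vendor-specific SDKs an organization has adopted, not real products.

```python
# Hypothetical sketch of confining vendor lock-in behind an abstraction.
# Neither provider class corresponds to a real vendor SDK; each stands in
# for a vendor-specific infrastructure API.

from abc import ABC, abstractmethod


class StorageProvider(ABC):
    """Internal interface the rest of the codebase depends on."""

    @abstractmethod
    def create_volume(self, size_gb: int) -> str: ...


class VendorAStorage(StorageProvider):
    # Vendor A's API shape is isolated here; switching vendors means
    # rewriting only this adapter, not every caller.
    def create_volume(self, size_gb: int) -> str:
        return f"vendorA-vol-{size_gb}gb"


class VendorBStorage(StorageProvider):
    def create_volume(self, size_gb: int) -> str:
        return f"vendorB-disk-{size_gb}"


def provision_app_storage(provider: StorageProvider) -> str:
    # Application code sees only the abstraction, never a vendor API.
    return provider.create_volume(100)


print(provision_app_storage(VendorAStorage()))  # vendorA-vol-100gb
print(provision_app_storage(VendorBStorage()))  # vendorB-disk-100
```

The lock-in does not disappear — it concentrates in the adapter layer, where migration or exit costs can at least be estimated. That is exactly the conscious trade-off Russell recommends evaluating.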

"Recognize that adopting an SDDC means trading a hardware lock-in for a software lock-in," Russell concluded. "Choose the most appropriate kind of lock-in consciously and with all the facts at hand."
