Software-defined infrastructure promises flexibility and agility in the data center, but many IT pros still struggle with challenges such as cost concerns and implementation issues.
The software-defined data center (SDDC) aims to decouple hardware from software and automate networking, compute and storage resources through a centralized software platform. IT can implement this type of data center either incrementally, deploying software-defined networking, storage and compute separately, or in one fell swoop. IT pros at Gartner’s data center conference this month in Las Vegas said their organizations are interested in SDDC to address changing storage needs.
Early in SDDC’s foray into the IT landscape, IT pros generally used software-defined infrastructure for a single application or region. But in the past 18 months or so, more organizations have expanded software-defined from one application to general-purpose infrastructure used everywhere, said Daniel Bowers, research director at Gartner.
“That’s a shift,” he said. “That means software-defined is going from a niche technology — great for certain applications — to the mainstream.”
As interest levels increase, adoption in the software-defined data center market is on the rise. By 2023, 85% of large global enterprises will require the programmatic capabilities of an SDDC, as opposed to 25% today, according to Gartner.
Some IT teams are evaluating the software-defined data center market as their higher-ups demand innovation, including one financial services company.
“Our CIO is increasingly demanding to move in a software-defined direction,” said an infrastructure architect at the company, who requested anonymity because he was not authorized to speak with the media.
The company’s IT strategy is to shift away from a traditional, scale-up, monolithic storage model and toward a scale-out storage model, which enables IT to buy more storage in smaller chunks. The company also aims to update its “big, flat network” through software-defined networking’s automation and orchestration capabilities, the infrastructure architect said.
Currently, the company’s IT department struggles to deliver adequate test environments to its developers. It aims to close those gaps by spinning up an entire test environment through APIs. When developers are finished testing, they can spin it down, rinse and repeat.
A software-defined data center is a perfect match for an API-driven infrastructure, the infrastructure architect said. With the click of a few buttons, programmers can provision the temporary development environments they need to build applications.
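In practice, that workflow boils down to calling a provisioning API to create an environment, running tests against it, then releasing it. The sketch below is purely illustrative: `TestEnvApi` and its `spin_up`/`spin_down` methods are hypothetical stand-ins for a provider's real provisioning API, not any actual product.

```python
# Illustrative sketch of API-driven test-environment provisioning.
# TestEnvApi is a hypothetical stand-in, not a real product API.
import uuid

class TestEnvApi:
    """Stand-in for an SDDC provisioning endpoint."""

    def __init__(self):
        self.environments = {}  # env_id -> environment record

    def spin_up(self, name, vcpus=2, storage_gb=50):
        """Provision a temporary environment and return its ID."""
        env_id = str(uuid.uuid4())
        self.environments[env_id] = {
            "name": name,
            "vcpus": vcpus,
            "storage_gb": storage_gb,
            "state": "running",
        }
        return env_id

    def spin_down(self, env_id):
        """Tear the environment down, returning resources to the pool."""
        self.environments.pop(env_id, None)

api = TestEnvApi()
env = api.spin_up("feature-branch-tests", vcpus=4, storage_gb=100)
# ... developers run their tests against the environment ...
api.spin_down(env)  # rinse and repeat for the next test run
```

The point of the pattern is that the environment is disposable: nothing about it outlives the test run, so developers never queue behind a shared, long-lived staging box.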
For others, software-defined infrastructure is a fix for a problem they backed into. Wayne Morse, a network administrator and systems analyst at Jacobs Technology, an IT services company based in Dallas, runs local storage across 24 servers.
“The problem is, we’re running out of disk space on any individual server, and we need to share those resources across multiple servers,” he said.
IT didn’t implement a SAN due to cost issues, Morse said. Now, the company needs distributed storage across the data center to share resources — and software-defined storage (SDS) is a way to achieve that.
But one of the most significant advantages of an SDDC — the ability to implement it gradually — can also be its biggest downfall.
“[Software-defined storage] needs to be a part of a bigger picture,” said Julia Palmer, a research director at Gartner. “It’s very difficult, because all of the components of software-defined are developed separately.”
For Morse, that means a limited network could hinder the capabilities of SDS. He is considering upgrading the company’s network to take full advantage of SDS’ storage-sharing features.
Other organizations see the advantages of software-defined, but costs keep actual adoption just out of reach.
Walt Bainey, the director of infrastructure operations at Kent State University in Kent, Ohio, has looked at the software-defined data center market for years, but only from afar. That’s because his IT team doesn’t roll out a lot of compute or storage, or make constant changes to the network.
“We are more static,” Bainey said. “The costs of implementing and purchasing the products to make [an SDDC] happen are greater than the actual need.”
Still, one ideal use case for SDDC would be in the university’s research computing cluster, which provides the infrastructure that supports the research needs of professors, researchers and students. There, the IT team could license a smaller footprint of hardware, software and networking components to cut costs, Bainey said. Through software and scripts, IT can provide resources such as servers and file shares and automate routine tasks to build out the environment’s compute, storage and networking components.
“We could have our faculty members and professors self-serve and dole out things they want by spinning them up and spinning them down,” Bainey said. “I think there’s a huge advantage in that type of scenario, but we’re not there yet.”
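The self-service scenario Bainey describes amounts to tracking a fixed, licensed footprint and letting researchers claim and release slices of it on their own. A minimal sketch of that bookkeeping, with all names and capacity numbers hypothetical:

```python
# Hypothetical sketch of self-service allocation against a fixed
# licensed footprint; class and user names are illustrative only.
class ResearchCluster:
    def __init__(self, total_vcpus):
        self.total_vcpus = total_vcpus
        self.allocations = {}  # user -> vCPUs currently held

    def spin_up(self, user, vcpus):
        """Grant a slice of the footprint, or refuse if none is left."""
        in_use = sum(self.allocations.values())
        if in_use + vcpus > self.total_vcpus:
            raise RuntimeError("licensed footprint exhausted")
        self.allocations[user] = self.allocations.get(user, 0) + vcpus

    def spin_down(self, user):
        """Release a user's slice so the next researcher can self-serve."""
        self.allocations.pop(user, None)

cluster = ResearchCluster(total_vcpus=64)
cluster.spin_up("prof_a", 32)
cluster.spin_up("student_b", 16)
cluster.spin_down("prof_a")
cluster.spin_up("prof_c", 48)  # fits only because prof_a released capacity
```

Because resources are doled out and reclaimed programmatically, the smaller licensed footprint stays fully utilized rather than sitting idle between projects.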