The future of IT infrastructure

Christian Putz - Pure Storage
It’s clear that software defined is the future of infrastructure components, says Christian Putz.

Guest written by Christian Putz, Director, EEMEA, Pure Storage

Thanks in large part to server virtualization in the data centre, Software Defined Storage (SDS) has gained strong momentum across the IT market over the past few years. In fact, according to a recent Gartner report, by 2019, 50% of existing storage array products will be available as ‘software only’ versions, up from 15% in 2016. Furthermore, approximately 30% of the global storage array capacity installed in enterprise data centres will be deployed with SDS or hyperconverged integrated system architectures (up from less than 5% in 2016).

SDS offers a simpler approach to traditional data storage because the software that controls the storage-related capabilities is separate from the physical storage hardware. This reduced complexity means hardware no longer needs to be custom made. Innovation is not only tied to the manufacturing of hardware components, but also borne out of software development, which is more agile, reduces development cycles and has a quicker time to market.

For business users who rely on IT infrastructure, software-defined storage enables greater responsiveness and agility. Customers want more flexibility from their storage, from the physical footprint to simplified deployment and ongoing management. And removing complexity from the hardware means the software can be simplified too.

A good example is the way data is protected on a standard hard drive. With a single physical disk, a mechanical failure of that disk means your data is lost. A Redundant Array of Independent Disks (RAID) protects data by spreading it across several physical disks along with parity information, which is used to recreate the data in the event of a disk failure. Traditional storage solutions use ‘hot spare’ disks that sit idle waiting for a failure to occur; when a disk fails, its lost data is rebuilt onto the spare using the parity information.
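The parity mechanism described above can be sketched in a few lines of Python. This is a minimal illustration of XOR parity, not a real RAID driver; the block contents and the three-disk layout are hypothetical example data.

```python
from functools import reduce

# Illustrative sketch of RAID-style parity (not a real RAID implementation).
# A stripe holds one data block per disk plus one parity block, where the
# parity block is the XOR of all data blocks. Any single lost block can be
# rebuilt by XOR-ing together all of the surviving blocks.

def xor_blocks(blocks):
    """XOR a list of equal-length byte blocks together, column by column."""
    return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*blocks))

# Three data disks holding one block each (hypothetical example data).
data_blocks = [b"disk0dat", b"disk1dat", b"disk2dat"]
parity = xor_blocks(data_blocks)

# Simulate a mechanical failure of disk 1.
failed = 1
survivors = [blk for i, blk in enumerate(data_blocks) if i != failed]

# XOR the surviving data blocks with the parity block to recover the loss.
rebuilt = xor_blocks(survivors + [parity])
assert rebuilt == data_blocks[failed]
```

Because XOR is its own inverse, the same operation that computes the parity also reconstructs any single missing block.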

Instead of providing availability at the physical disk layer, a more efficient and reliable approach uses hardware such as Solid State Drives (SSDs). Because SSDs are far less prone to mechanical failure, RAID can be abstracted away from the physical disk into segments (which include parity) spread across multiple SSDs within the storage array. The key benefit is that when an SSD fails, only the data actually in use on it is rebuilt from parity, a process that takes minutes, rather than the whole physical disk, which can take days.
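The rebuild-time difference can be made concrete with a back-of-the-envelope comparison. The segment counts, fill level and per-segment rebuild time below are hypothetical assumptions chosen purely to illustrate the scaling, not vendor figures.

```python
# Illustrative sketch of why segment-level RAID rebuilds faster: a
# segment-aware rebuild reconstructs only segments that hold live data,
# while a traditional whole-disk rebuild processes every segment.
# All numbers below are hypothetical assumptions for illustration.

SEGMENTS_PER_DRIVE = 1_000
allocated = set(range(200))          # assume the drive is only 20% full
REBUILD_SECONDS_PER_SEGMENT = 0.5    # assumed cost to rebuild one segment

def rebuild_cost(segments_to_rebuild):
    """Total rebuild time for a given set of segments, in seconds."""
    return len(segments_to_rebuild) * REBUILD_SECONDS_PER_SEGMENT

whole_disk = rebuild_cost(range(SEGMENTS_PER_DRIVE))  # traditional rebuild
in_use_only = rebuild_cost(allocated)                 # segment-aware rebuild

print(f"whole-disk rebuild:  {whole_disk:.0f} s")
print(f"in-use-only rebuild: {in_use_only:.0f} s")
```

On a drive that is mostly empty, the segment-aware rebuild scales with the data in use rather than with the raw capacity, which is the source of the minutes-versus-days gap described above.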

Providing these capabilities in software means vendors can be far more agile in delivering new features to customers. Completing such an upgrade non-disruptively during business hours, and on a repeatable basis, builds customer confidence. The process still takes time and resources, but with a software-defined model, skilled support staff can monitor the change and any back-up or maintenance requirements remotely. This reduces operational costs, as staff no longer need to work evenings or weekends to complete storage maintenance.

Another important benefit of a software-defined approach is that changing the components that make up the storage solution doesn’t impact the system’s availability. You can start your investment small and grow into it over time, without any disruption to the business. This creates a subscription-style model in which customers buy only what they need, as they need it. As requirements and capacity planning forecasts change, you can introduce additional components to scale capacity or performance independently.

Gone are the days of sizing a solution for three to five years and building as much scale into the configuration from the beginning, to get the best price from a vendor. This process has traditionally been fraught with uncertainties—is the solution sized correctly for the performance/capacity I need for the lifecycle of these assets? Did the architects take my unknown changing business requirements into consideration when sizing the solution? Will I be required to pay a huge ongoing maintenance fee to keep the solution supported after the warranty expires? What if I want to push the asset beyond its intended lifecycle? What if that lifecycle could be ten years instead of five? Will I be forced to repurchase the solution again and have to repeat the whole process?

A software-defined approach means every component in the storage solution can be changed non-disruptively, without any impact on the availability or performance of production applications. When new storage technologies are introduced, NVMe for example, they can easily be integrated into the existing solution. You won’t be required to upgrade to storage product 2.0 and repurchase storage capacity you already own, nor to invest the skills, resources and money required to migrate data to a new platform.

SDS not only proves valuable for your IT team by saving time that can be redeployed back into the business; more importantly, it supports the overall growth of your organization. It’s clear that software defined is the future of infrastructure components, and we haven’t even started looking at how this impacts orchestration and automation. That’s the next story!
