The Storage Evolution
The way enterprises approach storage and backup solutions is undergoing a transformation, with unified solutions, deduplication, flash drives and cloud computing set to make a big impact
By Heena Jhingan
If a server breaks down, it means a short period of downtime; but when storage goes down, a CIO dreads the worst: losing the organization's data assets. This is precisely why rapid data proliferation has been giving IT heads everywhere the heebie-jeebies when it comes to storing and managing structured as well as unstructured data.
The massive explosion of unstructured data, coupled with the emergence of open hybrid cloud deployments, is keeping CIOs on their toes. While virtualization is driving down server costs and making businesses more nimble by making it easy to add new applications, it is also, conversely, the reason behind the ensuing data tsunami. The simplicity of adding applications has helped enterprises deploy applications that generate higher volumes of data, which in turn drive up an enterprise's storage costs and the complexity of its infrastructure.
Data is being churned out and consumed by humans, social media and machines at an unprecedented pace. Industry reports expect data to grow at about 30-50% year-on-year. According to a survey conducted by Microsoft across global enterprises, about 62% of the respondents store at least 100TB of data, and nearly a third of the respondents expect the amount of data they store to double in the next two to three years.
IDC forecasts over 102 exabytes (EB) of external and over 36EB of internal storage system capacity will be shipped in 2017 and most of this growth will be driven by APAC and other emerging markets.
Indian CIOs have had their share of storage-related dilemmas, and they have not yet solved the big mystery of what to store and for how long. Enterprise storage has evolved from tape to traditional hard disk drives (HDDs) to more sophisticated forms like solid state drives (SSDs) and flash storage. While both vendors and IT heads seek a workable storage strategy, industry experts suggest a simple, four-fold approach to the storage puzzle: the volume, velocity, and variety of data, and the value CIOs plan to derive from it.
Enterprises will need to focus not only on primary data but also on the secondary data generated through duplication and backups. With this, the total cost of ownership (TCO) for storage will change significantly as operational costs come down and capital costs begin to creep up. As Barun Lala, Director-Storage, HP India, puts it, about 10% of the total storage market is NAS, with the balance split between other storage and backup.
Amit Luthra, National Manager for Storage Solution Marketing, Dell, says that in India, most enterprises are second- or third-time storage buyers. In that sense, they are quite mature and have clarity on their requirements.
He says, “Earlier, backup was meant only for procedural layers and did not cover test and development layers. However, enterprises have realized the importance of backup right from the first level and that calls for storage capex.”
In all, experts believe that some broad trends will rewrite the fate of the enterprise storage market in India. Let us look at them.
Breaking the IT silos
One thing that is not unique to Indian enterprises is the heterogeneous storage environment. Applications run in silos, with ERP, CRM and other applications running on separate storage boxes. While some CIOs argue that these silos are critical for application performance, industry veterans believe such disparate systems instead make the environment more complex.
Vijayant Rai, Director- Sales, India & SAARC, Data Management, CA Technologies agrees. He says it is not just about complexity. “A siloed approach instead comes out to be a more expensive option as this means huge requirement of hardware, which needs capex. The enterprises also end up paying more on maintenance for each of these vendors. It also calls for investments in space and skill to manage these systems.”
A unified approach to procuring, provisioning, and managing enterprise data is thus the logical way forward for enterprises. Breaking the silos of block, file, and data also means a shift in the way NAS (file-level storage connected to a network, providing data access to a heterogeneous group of clients) and SAN (a dedicated network that provides access to consolidated, block-level storage) have traditionally been viewed.
Solutions agnostic to the type of data, such as files, objects, blocks, and semi-structured or unstructured data are increasingly getting picked up, thanks to virtualization and consolidation. Rai believes that, by implementing unified solutions, organizations will start to realize the benefits, both in the form of reduced expenses and increased service levels to their end users.
Talking about running applications on disparate systems, Surajit Sen, Country Manager-Backup Recovery Systems (BRS) division, EMC India, points out that application vendors now all offer their own backup. For example, in a typical setup, an Oracle database runs its own backup; similarly, VMware VDR runs its own backup, and since each does its own backup, disk consumption climbs. "An emerging trend that we see in this situation is these vendors using backup as a service, in-house. So, the solution providers basically need to do the software layer and analytics and plug into the enterprise's storage system, be it at its data center or in a private/public cloud," he says.
“Enterprises who are already stuck with silo storage structures have two choices — they can either virtualize and consolidate on a unified storage platform, by which they can continue to use the capacity from different vendors, but manage it through a single console; or they can use technologies like our Open Systems SnapVault that offer data protection and backup to open and mixed storage platforms,” explains Santhosh D’Souza, Director- Systems Engineer, NetApp.
Dedupe, the smarter way
Over the past few years, enterprises have tried to tame the data deluge with strategies like thin provisioning, tiered approach and deduplication technologies.
Sen of EMC sums up the enterprise storage metamorphosis: traditionally, tapes were used to store data, but they were not a reliable storage medium; disks, when they arrived, were definitely more reliable but also expensive. Deduplication has been around for a while now and is more mature. However, mere deduplication could not solve the bigger concern of shrinking the backup window, which is almost non-existent now. Backup now needs to be done in real time.
Tarun Kaura, Director – Technology Sales – India & SAARC, Symantec feels, data growth has obliterated the backup window. “Thousands of virtual applications create a scheduling nightmare. Enterprises are trying to figure out how to backup and recover millions of files in backup windows that no longer exist,” he says.
According to Srinivas Rao, Director – Pre-Sales and Solutions, India, Hitachi Data Systems, about 70-80% of enterprise data is static in nature, yet it keeps getting backed up in the data center like toxic waste. Organizations direct about 40-60% of their storage spend on backup, protecting data either on-site or off-site. He says, "Today, there are tools available that help differentiate between static and mobile data, and accordingly understand the kind of archival or object storage strategies the enterprises need to develop." He adds that an effective deduplication plan can help enterprises reduce storage costs by 60%.
To help enterprises overcome backup hassles, vendors repackaged deduplication in a smarter avatar. In general, enterprises used to implement deduplication only on secondary storage, preferably stored on disk for a better Recovery Point Objective (RPO, the maximum tolerable period in which data might be lost) and Recovery Time Objective (RTO, the duration and service level within which a business process must be restored after a disaster or disruption), he says.
Enterprises have now started moving to technologies that deduplicate at the source rather than at the target. Further, the target deduplication market is set to be disrupted and replaced by integrated backup appliances that combine source and target deduplication, backup software, replication, snapshots, security, and cloud integration in a single appliance.
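Whether it runs at the source or the target, the core mechanism of deduplication is the same: split the data into blocks, fingerprint each block with a hash, and store only the blocks not seen before. A minimal, hypothetical sketch (fixed-size blocks for simplicity; commercial products typically use variable, content-defined chunking):

```python
import hashlib

def dedupe(data: bytes, block_size: int = 4096):
    """Fixed-size block deduplication: store each unique block once,
    and keep an ordered list of hashes as the 'recipe' to rebuild it."""
    store = {}    # hash -> unique block contents
    recipe = []   # ordered hashes referencing stored blocks
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:      # new block: store it once
            store[digest] = block
        recipe.append(digest)        # duplicate: just reference it
    return store, recipe

def rehydrate(store, recipe):
    """Rebuild the original stream from the recipe."""
    return b"".join(store[d] for d in recipe)

# Backup streams are highly repetitive, so dedupe pays off:
backup = b"A" * 8192 + b"B" * 4096 + b"A" * 4096
store, recipe = dedupe(backup)
assert rehydrate(store, recipe) == backup
# 4 logical blocks in the stream, but only 2 unique blocks stored
```

Source deduplication runs this hashing at the client before data crosses the network; target deduplication runs it at the backup appliance after transfer.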
Make way for flash
Until recently, traditional hard disk drives (HDDs) were the preferred storage media; however, new technologies and applications that demand far more input/output operations per second have paved a natural way for flash storage and solid state drives, which are designed to handle that faster data traffic. IDC reports indicate that global SSD shipments are set to reach nearly 160 million units in 2015, up from 22 million units at present.
Sen says that since flash technology offers greater input/output performance than magnetic media, customers with input/output-intensive applications can simply use flash rather than following a tiered approach, even where the data volume is not huge.
Despite the fact that SSDs offer better performance and consume less power, price is still a deterrent to their growth. According to storagereview.com, the cost per gigabyte of an HDD is only around $0.075 (based on a 4TB model), while for an SSD it is about $1.00/GB (on a 240GB model). Even though SSD prices have been falling, the price-per-gigabyte advantage still works in favor of HDDs.
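Those figures make the trade-off easy to quantify. A back-of-the-envelope calculation, using the per-gigabyte prices quoted above and a hypothetical 10TB capacity tier:

```python
# Per-gigabyte prices quoted by storagereview.com (see above)
hdd_cost_per_gb = 0.075   # $ per GB, based on a 4TB HDD
ssd_cost_per_gb = 1.00    # $ per GB, based on a 240GB SSD

capacity_gb = 10_000      # hypothetical 10TB storage tier

hdd_total = capacity_gb * hdd_cost_per_gb   # $750
ssd_total = capacity_gb * ssd_cost_per_gb   # $10,000
premium = ssd_total / hdd_total             # ~13.3x more expensive
```

At roughly a 13x capacity-cost premium, flash earns its keep only where IOPS, not gigabytes, are the bottleneck, which is why tiering and silo-free flash deployments coexist.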
Rao adds that CIOs isolated applications in silos due to performance scalability required by certain applications, but with flash, they don’t have to do that.
Tape – not dead yet
So, does that mean tape is headed for obsolescence? Not really. Going by the Enterprise Strategy Group report, the capacity of archived electronic information worldwide is expected to grow to 300,000 petabytes by 2015, mostly due to an increase in unstructured archive data. As a result, tape, in addition to external disks, will continue to be a preferred storage medium for archival purposes. At present, tape commands a 38% share of overall digital archive volumes.
For an enterprise, says Rai of CA Technologies, business continuity, application availability and regulatory compliance are compelling factors for investing in data backup and archival solutions, especially in verticals like BFSI and telecom. Most CIOs are therefore now insisting on incorporating archival and data recovery as part of backup delivery and implementation.
Kaura shares a similar opinion that constantly evolving compliance regulations are forcing tighter recovery SLAs and requirements for more comprehensive disaster recovery solutions.
Open and software defined
Unlike in the past, the storage industry is no longer just a hardware-driven market. Software and applications are becoming equally important.
Vendors like Oracle are trying to innovate by engineering applications and hardware together. Amit Malhotra, Senior Director – Storage Sales, Japan & APAC, Oracle, points out, "Horizontal designs of data centers are gradually being phased out. IT managers realize that general-purpose storage cannot deliver high levels of performance for every application, and it is better to have application-engineered hardware to get optimized performance. It will save the cost that enterprises spend on tuning storage to a particular application. Since these solutions are OpenStack-based, there aren't any integration issues."
There is also a greater push towards vendors decoupling their monolithic, proprietary hardware and software layers. Lala of HP stresses that the industry is headed towards rapid commoditization and standardization at the hardware level, combined with increased intelligence at the software layer. This makes all the more sense for enterprises embracing virtualization, where software forms the orchestration layer. "Software stacks help manage heterogeneous environments without compromising on scalability and flexibility," he reasons.
Interestingly, a slow yet steady trend in the making is software being written outside the vendor's walls. Vendors like Riverbed expect enterprises to gravitate toward open source approaches to solving storage challenges.
D'Souza agrees that CIOs are demanding hybrid environments where data moves seamlessly between on-premise infrastructure and the cloud. "In such cases, all the APIs become a part of the setup and CIOs see less lock-in. We too are working with the open standards community to help enterprises manage open hybrid cloud environments."
Storage or cloud administrators
It is now a given that corporate data may reside in service provider data centers rather than on premise. The role of the typical storage administrator is evolving beyond managing just the storage piece of the organization's IT infrastructure. While a private cloud allows administrators some control, the public cloud puts storage outside the enterprise's premises and may offer restricted control or none at all. Experts believe storage administrators will have to be prepared for scenarios where storage is without boundaries.
“Developing that skill set in an organization is a challenge, but the software platforms have made things look a bit simpler,” Lala says, adding that new architectures require the administrators to be more equipped to manage other aspects of the infrastructure, including virtualization, networking, and business continuity planning as well.
With data security a priority for any CIO, encryption of data is another trend gaining acceptance. Vendor-level encryption alone is no longer enough, and IT heads find it better to encrypt data locally prior to transferring it to the cloud. Topping that up with the service provider's encryption acts as a second level of security.
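The local-encryption-first pattern can be sketched in a few lines. This is a hypothetical illustration, not any vendor's implementation: it uses the third-party `cryptography` package's Fernet recipe, and `upload_to_cloud` is a stand-in for a real provider SDK call.

```python
# Hypothetical sketch: encrypt locally before handing data to the cloud,
# so the provider's at-rest encryption becomes a second layer, not the
# only one. Requires the 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

def upload_to_cloud(object_name: str, blob: bytes) -> None:
    # Stand-in for a real provider SDK call (e.g. an object-store PUT).
    pass

key = Fernet.generate_key()   # keep this key on-premise; never upload it
cipher = Fernet(key)

document = b"quarterly financials"
ciphertext = cipher.encrypt(document)    # local, client-side encryption
upload_to_cloud("backups/q3.enc", ciphertext)

# Only a holder of the on-premise key can recover the plaintext,
# regardless of what the provider does with the stored object.
assert cipher.decrypt(ciphertext) == document
```

The design point is key custody: because the key never leaves the enterprise, a breach at the provider exposes only ciphertext, and the provider's own encryption then serves purely as the second layer described above.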
Backup forms a key component of enterprise storage. Luthra of Dell notes that when CIOs think storage, they don't think of backup alone; enterprises look to backup for data protection, business continuity and compliance. Nevertheless, it will always be a tightrope walk for IT managers to keep storage small and low-cost, yet highly available, while data continues to bloat. A formula that works in most cases: have a clear mission while designing the storage plan, blend strategies and, most importantly, assign a rupee value to data.