SEPATON Forecasts Increased Data Center Complexity for 2011

Storage professionals seek to leverage advances in new technologies while simplifying management

For large enterprises, data protection is becoming more complicated every year. Storage administrators are struggling to stay ahead of the explosion in data volume inside and outside the corporate data center while delivering ever-faster backup and recovery times. They are adopting new technologies, such as advanced deduplication, disk-based backup, and 10 Gb Ethernet networking, that promise to address these data protection challenges. As a result, many enterprise data centers have become dramatically more complex in the past few years.

“Many enterprises have multiple backup application environments, dozens of discrete backup targets, several networking protocols, and a variety of disaster protection technologies to manage,” said Dennis Rolland, director of advanced technology, Office of the CTO, SEPATON, Inc. “Without careful planning, the benefits of new technologies can be offset quickly by the cost and risk of dramatically increased complexity.”

According to Rolland, there are several key factors driving the complexity of data protection. These include:

Data Center Sprawl
Many data centers introduced disk-based backup solutions to improve backup and restore times. However, without grid scalability, enterprise-class deduplication, and massive single-system capacity, these systems can quickly proliferate, causing data center “sprawl.” In these environments, data protection is divided across numerous individual systems that must be managed, deduplicated, and tuned separately -- an intrinsically inefficient process. Administrators must add an entire new system every time they need more performance, more capacity, or more deduplication efficiency.
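To make the cost of sprawl concrete, here is a minimal Python sketch (the numbers are illustrative assumptions, not SEPATON data) of how splitting backups across independent deduplication silos erodes the overall deduplication ratio, because each silo's index can only find duplicates within its own data:

```python
# Hypothetical illustration: why independent dedup silos lower overall
# space savings. Assumes duplicate blocks are only detected within a
# single system's own index.

def dedup_ratio(logical_tb: float, physical_tb: float) -> float:
    """Ratio of logical data protected to physical capacity consumed."""
    return logical_tb / physical_tb

# 120 TB of logical backups, of which only 12 TB is globally unique.
logical, globally_unique = 120.0, 12.0

# One grid-scalable system: a single global dedup index sees every block.
print(f"single system: {dedup_ratio(logical, globally_unique):.0f}:1")  # 10:1

# Four independent appliances: each holds 30 TB logical, but blocks common
# across silos (e.g., shared OS images) are stored once *per silo*.
silos = 4
per_silo_physical = globally_unique / silos + 3.0  # +3 TB duplicated per silo
total_physical = silos * per_silo_physical          # 24 TB instead of 12 TB
print(f"{silos} silos: {dedup_ratio(logical, total_physical):.0f}:1")  # 5:1
```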

Multi-protocol Choices
Many enterprise data centers want to use 10 Gb Ethernet to improve performance and take advantage of backup applications such as NetBackup with OST. To avoid the cost and complexity of moving the entire data center to 10 Gb Ethernet at once, or of dividing their backup volume among disparate solutions, data center managers should consider a phased approach that uses storage pooling on a single system. This approach reduces complexity and enables “best of breed” technology adoption.
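A minimal sketch of what such a phased, pooled approach might look like. The pool names, protocols, and capacities below are hypothetical assumptions, not a SEPATON API; the point is that one backup target can present multiple protocol-specific pools, so each client group migrates to 10 Gb Ethernet when its network is ready:

```python
# Hypothetical model of protocol-based storage pools on a single backup
# target, supporting a phased 10 GbE migration. Names/values are made up.

from dataclasses import dataclass

@dataclass
class StoragePool:
    name: str
    protocol: str      # e.g. "VTL/FC", "NFS/1GbE", "OST/10GbE"
    capacity_tb: int

pools = [
    StoragePool("legacy-fc", "VTL/FC",    200),  # phase 0: existing SAN clients
    StoragePool("nfs-1gbe",  "NFS/1GbE",  100),  # phase 1: not yet migrated
    StoragePool("ost-10gbe", "OST/10GbE", 150),  # phase 2: NetBackup OST, 10 GbE
]

def assign_pool(client_protocol: str) -> StoragePool:
    """Route a backup client to the pool matching its current protocol."""
    return next(p for p in pools if p.protocol == client_protocol)

# Clients simply move between pools as their networks are upgraded.
print(assign_pool("OST/10GbE").name)
```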

Cloud Computing and Data Center Consolidation
As large companies increasingly look to private cloud models to consolidate and streamline their data centers, they need a powerful data protection environment that enables them to allocate performance and capacity according to business-unit need rather than physical hardware limitations. They also need to track usage and project capacity requirements at the business-unit level in a secure multi-tenancy model. Companies also need the option to seamlessly create another tier of data protection in a public cloud.
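For instance, tracking usage and projecting capacity per business unit becomes straightforward once tenants are modeled explicitly. A short sketch, with hypothetical tenants and growth rates:

```python
# Hypothetical per-business-unit capacity projection in a multi-tenant
# backup environment. Usage figures and growth rates are made up.

tenants = {
    # business unit: (current usage in TB, annual growth rate)
    "finance":     (40.0, 0.35),
    "engineering": (25.0, 0.50),
    "sales":       (15.0, 0.20),
}

def project(usage_tb: float, growth: float, years: int) -> float:
    """Compound-growth projection of a tenant's backup capacity."""
    return usage_tb * (1 + growth) ** years

for unit, (usage, growth) in tenants.items():
    print(f"{unit:12s} today={usage:5.1f} TB  "
          f"in 2 yrs={project(usage, growth, 2):6.1f} TB")
```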

Rolland concluded, “Increased complexity in data protection is a key trend that will guide purchasing decisions in 2011; it is driving up costs and putting more pressure on enterprises to adopt enterprise-class data protection.”


Source: sepaton.com