Providers face rising storage demand in the cloud data center

Enterprise data capacity needs have expanded considerably as more information is digitized and companies invest in analytics initiatives. That growth has raised the value of robust storage solutions in both on-premises data centers and cloud infrastructure. Data Center Knowledge contributor Bill Kleyman recently explored several of the factors influencing storage in cloud environments.

One of the factors all data center operators must consider is what type of storage devices to use. The industry's push toward high speed and performance may make solid-state drives seem like an obvious choice. However, as Kleyman noted, deploying SSDs for every use case would be unnecessarily costly. Instead, companies should identify where flash technology would produce the most value – in scenarios such as database processing, where a high number of input/output operations per second (IOPS) is required.
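
As a rough illustration of that tiering decision, the short Python sketch below places each workload on flash only when its IOPS requirement clears a cost-justifying threshold; the workload names, IOPS figures and threshold are hypothetical rather than drawn from Kleyman's article.

    # Hypothetical sketch: place workloads on flash only where the IOPS
    # requirement justifies the cost; everything else lands on spinning disk.
    # The threshold and workload figures are illustrative, not benchmarks.
    SSD_IOPS_THRESHOLD = 5000  # assumed cutoff above which flash pays off

    workloads = [
        {"name": "oltp-database", "required_iops": 20000},
        {"name": "email-archive", "required_iops": 150},
        {"name": "web-static-assets", "required_iops": 800},
    ]

    def pick_tier(workload, threshold=SSD_IOPS_THRESHOLD):
        """Return 'ssd' for IOPS-heavy workloads, 'hdd' otherwise."""
        return "ssd" if workload["required_iops"] >= threshold else "hdd"

    for w in workloads:
        print(f"{w['name']}: tier={pick_tier(w)}")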

Prioritizing storage environments by performance need is even more important for cloud providers, which must meet the needs of a wide range of businesses, not all of which need, or have the resources for, a system built entirely on SSDs.

Cloud storage companies are also looking to the data itself to drive efficiency further through strategies such as deduplication. Kleyman used the example of a storage array that would normally store 100 copies of a 20 MB attachment. With deduplication, the array stores a single copy and creates pointers to that one unique instance of the data. This enables organizations to better manage data sprawl, but it requires a solution intelligent enough to adjust the pointers so that changes to the stored file are reflected.
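
A minimal sketch of how such pointer-based deduplication can work, assuming content hashing as the way to detect identical data; the file names, sizes and data structures below are illustrative and not any specific vendor's implementation.

    import hashlib

    # Sketch of deduplication: identical payloads are stored once, keyed by
    # their content hash, and each logical file keeps only a pointer (the
    # hash) to that single stored instance.
    store = {}     # content hash -> the one unique copy of the bytes
    pointers = {}  # file name -> content hash (lightweight reference)

    def save(name, payload):
        digest = hashlib.sha256(payload).hexdigest()
        if digest not in store:      # first time this content is seen
            store[digest] = payload  # keep the single unique instance
        pointers[name] = digest      # each duplicate only costs a pointer
        # Note: if the stored payload later changes, the pointers must be
        # updated as well - the "intelligent solution" caveat from the article.

    def load(name):
        return store[pointers[name]]

    attachment = b"x" * (20 * 1024 * 1024)  # stand-in for a 20 MB attachment
    for i in range(100):
        save(f"mail_{i}.bin", attachment)   # 100 logical copies

    assert load("mail_0.bin") == attachment
    print(len(store), "unique object stored for", len(pointers), "files")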

Although effectively combining these strategies with robust hardware can help cloud providers manage their growing data footprint, rising capacity and performance demands are likely to erode much of the efficiency gained. This places a premium on pushing innovation further by pairing robust hardware with best-of-breed, software-level solutions that can manage data effectively as it enters the cloud data center.

"Because cloud computing will only continue to advance, there will be new demands placed around storage," Kleyman wrote. "Even now, conversations around big data and storage are already heating up. Taking the conversation even further, new types of big data file systems make the big data management process even easier. Working with the same open-source product, the Hadoop Distributed File System (HDFS) has taken distributed big data management to a whole new level."

Preparing for the future
One of the reasons many customers turn to the cloud is that it enables them to better handle unexpected change. That scalability often prevents the performance problems and outages that would otherwise emerge from rapid business growth or sudden usage spikes. However, cloud providers must weigh the same considerations as they build their own infrastructure.

Many providers have turned to open source as a way to achieve higher performance without making their offerings more expensive. NxtGen Datacenter and Cloud Services recently showcased the potential savings when it deployed new services powered by a virtualization platform built on OpenStack. CRN reported that the company saved approximately $4,000 per appliance as a direct result of leveraging open technology.

"OpenStack consists of a series of inter-related projects that control large pools of processing, storage and networking resources throughout a data center, all managed through a dashboard which gives administrators control while allowing users to self provision resources through a web interface," the article stated.

NxtGen's configuration includes 160 physical processor cores and 4 TB of RAM spread across 25 servers. The infrastructure is used for streaming video and video gaming, which CRN noted consumes 850 MB of bandwidth during the busiest periods.
