Why do we have so much data—and what are we going to do with it?
Maybe I’m revealing my age, but I remember very well the days when storage space was at a premium and companies had not quite learned how to manage growing assets long-term. Keeping data was expensive. The IT team played gatekeeper, frequently ordering the deletion of months-old business files and emails to make room for more current content. I used to keep a floppy disk to store the important content I really needed, and relegated everything else to the data graveyard by pressing the delete key repeatedly and watching it disappear. Forever.
That was storage during the first generation of IT. In those days, we couldn’t have imagined that 14PB of data even existed, much less that it could be stored in a single data center rack.
The data we generated in those days was generally structured business data—documents, emails, and spreadsheets—and we spoke in gigabytes. Unstructured data like graphics and photos was stored on a separate server that could manage its size (large for the time). But these dedicated servers had to be wiped frequently, and data was exiled to tapes that piled up in storage closets.
Times have certainly changed and data just keeps increasing as technology advances. This makes storing data complex because the target is constantly moving.
Why? Because most of us didn’t see this coming. Email. Handheld computers masquerading as phones. Higher and higher resolution cameras taking billions of pictures. Security systems. Complex numerical data from remote sensing devices. And IoT-connected refrigerators and cars. The list goes on.
Enterprise storage infrastructure was not built for today’s data capacity
During the second and third generations of IT, when data continued to pile up beyond what simple storage solutions could handle, large businesses—and eventually even small ones—began storing more data. They built their own data centers or began moving to the cloud.
Why? They realized data had business value, and that data actually drove much of their business, whether from e-commerce or software applications. It was no longer acceptable to throw data away, so they invested in expensive capital equipment to keep it and serve it to employees and customers.
But most on-premises data centers had built-in limits imposed by the very design of their clunky, rigid hardware, which made scaling up or integrating technical upgrades difficult. In the minds of data center designers, data was still finite—a means to an end. When the influx of data became more than most companies could manage, they sought the shelter of public clouds, which are really just enormous data centers where someone else takes on the burden of securing the building space, frequent hardware upgrades, heating and cooling costs, and security concerns.
It was a match made in heaven. Enterprises in industries with stringent rules and regulations had to be more concerned with security, so they built their own private clouds instead. But the hardware was bulky, and it was neither flexible nor scalable.
This is not your grandfather’s data storage
The ways data is created and used will continue to shift, and today’s IoT-connected devices produce a lot of data. Enterprises hopping on the IoT train need a storage solution that can flex with those changes.
The most forward-thinking businesses have realized there is no such thing as a one-size-fits-all solution. But there is a one-size-fits-now-and-later solution.
Companies are using robots to secure and surveil buildings, to drive cars, and to work on manufacturing floors. They’re deploying drones to mitigate crop loss, to identify anomalies in a power grid (like downed power lines), and to deliver packages. This is IT 4.0.
The media and entertainment industry has been blazing the trail to IT 4.0 for years and, consequently, has worked out the kinks for other industries. M&E was one of the first power users of huge amounts of data, with high-resolution cameras collecting hours and hours of footage that had to be accessible to many users simultaneously. These companies couldn’t store their data in the cloud—not only was that financially restrictive, it was far too slow. The cloud is physically too far away from real-time editing workstations, and the latency of data transfer from storage to desktop would make pre- and post-production work virtually impossible.
What we learned from our customers in the media business—and others who have been harnessing data’s deluge—is that their needs for data management infrastructure boil down to three fundamental requirements:
- All-in-one value
- Flexible, scalable capacity
- The ability to leverage data—whether for business insights or derived content
In response, Seagate recently announced its biggest-ever modular storage device. The Seagate Exos E 4U106 holds a whopping near-1.5 PB of data in a single box. Stack ten of those in a 42U data center rack and the potential is nearly 15 PB of data storage. Built to adapt to both micro modular edge data centers and traditional cloud environments, this stackable unit meets capacity needs in any data management use case.
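For readers who want to sanity-check those rack numbers, here is a back-of-the-envelope sketch. The 106-drive count comes from the product name; the 14 TB per-drive capacity is an assumption chosen to match the "nearly 1.5 PB" figure, not a published spec:

```python
# Back-of-the-envelope capacity math for the figures above.
# Assumption: 14 TB per drive is a hypothetical capacity chosen to
# line up with the "nearly 1.5 PB per enclosure" claim.

DRIVES_PER_ENCLOSURE = 106   # the "106" in Exos E 4U106
TB_PER_DRIVE = 14            # assumed drive capacity in terabytes
RACK_UNITS = 42              # standard data center rack height
ENCLOSURE_HEIGHT_U = 4       # the "4U" in the product name

pb_per_enclosure = DRIVES_PER_ENCLOSURE * TB_PER_DRIVE / 1000  # TB -> PB
enclosures_per_rack = RACK_UNITS // ENCLOSURE_HEIGHT_U         # whole units that fit
pb_per_rack = pb_per_enclosure * enclosures_per_rack

print(f"{pb_per_enclosure:.2f} PB per enclosure")   # ~1.48 PB
print(f"{enclosures_per_rack} enclosures per rack") # 10
print(f"{pb_per_rack:.1f} PB per rack")             # ~14.8 PB
```

Ten 4U enclosures occupy 40 of a rack’s 42 units, which is why the total lands at "nearly 15 PB" rather than a round number.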
It’s all about your business, not about us
How did we do it? Getting a keen understanding how data is used was and is the name of the game. To deliver solutions that enterprises need in a business environment that is more competitive than ever, Seagate didn’t focus on just one aspect of data flow technology. We spent years researching and understanding enterprise data workflows and applying that knowledge to engineering and testing the individual, integral parts of a comprehensive data movement system. We broke down the wall between consumer expectations and enterprise expectations by listening to customers who told us they wanted whole solutions that solved problems, not pieces and parts they had to put together themselves.
We heard that they wanted the same efficiencies they enjoy in their personal lives to apply to their business operations. We evolved our component manufacturing processes to leverage the most modern tools available, including machine learning and AI, to ensure top-tier quality at every step. And we tested in real-world scenarios so we could be sure everything would work flawlessly together.
And finally, we considered the future of enterprises and how they would need to grow: We built in modularity that enables easy add-on capacity and the introduction of new technology features so we could future-proof businesses and make heavy-lift infrastructure upgrades a thing of the past. The result: all-in-one systems that no longer require time-consuming IT intervention and constant tweaking to make them play nicely together.
How can enterprises prepare for tomorrow, today?
The trick is to learn from our past oversights. We can’t predict the amount of data we’ll have to store, but it’s bound to be even bigger than we think—I’ll put my bet down right now that it certainly won’t be smaller. According to IDC’s Data Age 2025 study, the world’s data will grow to more than 163 zettabytes (that’s 163 trillion gigabytes) in just seven short years.
It is not realistic to think that companies can rip and replace their infrastructure every few years to keep up with technological advances.
That’s why Seagate delivers out-of-the-box, affordable, stackable, easily upgraded storage solutions. We don’t want you locked into one capacity number. We regularly release higher- and higher-capacity drives that power these building blocks, making your on-prem data center a thing of the future.
Check out a 3D video of the all-in-one Exos E 4U106, plus our other high-capacity modular storage solutions.