Big data storage needs big storage smarts


By GovTechReview Staff
Wednesday, 27 March, 2013


Organisations must run the numbers carefully to ensure they’re getting the most bang for their data-storage buck when adopting storage-intensive big-data strategies, a senior economist with storage giant Hitachi Data Systems (HDS) has warned.

Noting that organisations must consider different storage architectures depending on their data requirements, HDS chief economist David Merrill has told GTR that a quick study in the dynamics of ‘storage economics’ – HDS, for one, has identified 34 different kinds of enterprise-storage costs – can be invaluable for organisations that haven’t revisited their storage costs in a while.

“For the past few years, storage has been highly deterministic,” Merrill explains. “We can talk about the total cost, and with a mapping system can map [requirements] to solutions that are proven to reduce costs. This leads to very easy discussions with senior managers, storage directors and CIOs who know how the economics of storage behave.”

Organisations trying to shovel massive volumes of big data into a data warehouse are likely to find their costs become unsustainable if they insist on using only expensive, high-performance disks.

The economic impact of such strategies quickly becomes obvious once they’re subjected to careful storage-economics analysis – and the results can be an eye-opener. “The economic model exposes architectural mistakes that were made in building up to big data,” says Merrill, who blogs regularly on the challenges and opportunities of costing storage strategies.

“When they first do big data, organisations build proofs of concept, then keep replicating that initial design. This is fine for a test-bed, but the economics change as you build out to several hundred or thousand nodes, and several hundred terabytes of data. Storage economics helps us know when an architecture is economically unsustainable.”

Merrill also advised tempering enthusiasm over the low-cost cloud-storage options now appearing in the market, such as Amazon Web Services’ Glacier archival service.

Cloud storage has emerged as a favourite of many government organisations, particularly as open-data initiatives demand the dumping of large data sets into publicly accessible forums. Yet while Glacier costs only around $0.01 per GB per month, its recovery times are measured in hours; Merrill warns that organisations must weigh the potential cost of those delays in terms of the business interruption they impose.

“As cloud providers they’re coming up with very seductive pricing points,” says Merrill, who has conducted extensive and detailed analyses into those economic models.

Those delays can make such cloud-storage environments unsuited to many enterprise backup and other data-intensive storage tasks, where quick recovery is paramount. “If the cost of performance, recovery time, and recovery points cannot be sustained within the cloud, and you know the cost per hour of waiting during an outage, you can do the maths very quickly,” he explains.
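
To make that calculation concrete, the short Python sketch below works through the trade-off using entirely hypothetical figures (only Glacier’s roughly $0.01 per GB per month price comes from the article): it compares the monthly saving from moving data to an archival cloud tier against the expected monthly cost of waiting on slower recoveries during outages.

```python
# Back-of-the-envelope comparison of archival cloud-storage savings versus the
# expected cost of slower recovery. All figures are illustrative assumptions,
# except Glacier's ~$0.01/GB/month price cited in the article.

GLACIER_COST_PER_GB_MONTH = 0.01   # archival cloud tier (cited in article)
ON_PREM_COST_PER_GB_MONTH = 0.10   # assumed fully loaded on-premise cost
DATA_SET_GB = 100_000              # assumed 100 TB of archived data

RECOVERY_DELAY_HOURS = 5           # assumed archival retrieval time per restore
OUTAGE_COST_PER_HOUR = 20_000      # assumed cost per hour of business interruption
RESTORES_PER_MONTH = 0.1           # assumed ~one urgent restore every ten months

# Monthly saving from using the cheaper archival tier instead of on-premise disk
monthly_saving = (ON_PREM_COST_PER_GB_MONTH - GLACIER_COST_PER_GB_MONTH) * DATA_SET_GB

# Expected monthly cost of waiting for archival recoveries during outages
expected_delay_cost = RECOVERY_DELAY_HOURS * OUTAGE_COST_PER_HOUR * RESTORES_PER_MONTH

print(f"Monthly storage saving:      ${monthly_saving:,.0f}")
print(f"Expected monthly delay cost: ${expected_delay_cost:,.0f}")
print("Archival tier pays off" if monthly_saving > expected_delay_cost
      else "Recovery delays outweigh the storage saving")
```

With these assumed numbers the expected delay cost outweighs the storage saving, which is precisely the kind of result Merrill argues organisations should surface before committing to a low-cost tier.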

“Take what the cloud vendors are saying, and apply that back into your business to figure out if those solutions are valuable. It’s about getting beyond the initial messaging of the low-price cloud offering.” – David Braue
