Why data centres are hot (in a good way)


By GovTechReview Staff
Tuesday, 21 December, 2010



Data centres are sexy again.

Once an afterthought in a broom cupboard at the end of the corridor or, worse still, a collection of servers under people’s desks, data centres have come a long way.

Today, many government agencies own, share or rent purpose-built operations that promise security, integrity and very, very high availability. Their spending on those facilities is so significant that the Federal Government hired Sir Peter Gershon to, in part, consider best practice in their operation.


More recently, the sector has enjoyed renewed interest, due in part to a growing shortage of data centres in Australia and to climate change awareness. Agencies are now under pressure to deliver facilities that produce lower carbon emissions, use greener power sources and are, in general, more efficient all round.

But it is the shift from location-based centres to network-based centres that is capturing the imagination of the industry at the moment.

Analysts predict carriers and Internet service providers (ISPs) have the greatest potential to attract data centre customers, because these service providers are already part of the equation – through the provision of communication links – and have made substantial capital investments in IT infrastructure and security, which they can now share. They will do this either by housing clients’ data centres or simply their data, through cloud services.

Research from analyst firm Ovum predicts that as cloud computing gains momentum, government buyers will insist on keeping data onshore, within their legal and regulatory jurisdictions.

Its analysts say domestic carriers will therefore position themselves as the dominant providers of enterprise-grade services, differentiating themselves on the basis of keeping data within country, their operational scale and their ability to manage end-to-end service.

To this end, Telstra recently partnered with Accenture to offer a new cloud-based model called Network Computing Services, which it will offer to large enterprises and government clients. Both Visy Industries and Komatsu Australia have moved their SAP systems to the new service.

Ovum Research Director Kevin Noonan says it raises the question of whether large data centres are still needed.

“Can the cloud itself provide solutions?” he asks. “This is where the telcos can play a big part, because they have availability at close to 100 per cent. Cloud providers are saying the data centre is the network. You can have it located anywhere, but it’s the network that matters.”

iiNet’s General Manager Business Division, Steve Harley, agrees.

The company has up to 50 racks in Perth and 100 in Melbourne available to allocate to clients. It already serves Victorian agencies including public schools and Victoria Police, as well as Melbourne Health.

“The trend is already happening. Telcos are different to data centre providers, because they can offer an end-to-end solution. Data centre providers find it extremely hard to diagnose faults. They have to deal with others to determine the problem. The (availability) challenges are no different to us, but it’s end to end. You only deal with one supplier,” says Harley.

Optus has played in the data centre sector for a long time through subsidiary Alphawest, but now also offers co-location and cloud hosting facilities.

Andrew Vranjes, cloud and data centre specialist at Optus Business, says cloud is emerging as the solution for clients that want to reduce their data centre footprint.

“We have the physical data centre assets, we have the management capabilities, the network services, the billing systems, the SLAs. We have all the building blocks,” says Vranjes, who adds certifications to the mix.

“The international standard ISO 27001 assures government departments they are dealing with a very mature organisation. That bar of certification is quite difficult for government departments or other organisations to achieve.”

Location, location, location

While the cloud is promoted as a potent idea, some agencies need to be a hop, a skip and a jump from their IT operations.

Proof of the need for physical proximity can be found in the fact that the Federal and NSW State Governments are both currently tendering for large data centres to cater for their needs in the next 10 to 15 years, says Noonan.

“All of a sudden, they’ve gone down the path of needing bigger data centres and that’s just for the current and projected growth. It’s the old paradigm. The centres are location-based and power hungry,” he says.

So, could governments bypass these giant power eaters and go directly to the cloud?

“The whole story is not a black-and-white decision. It’s a very big issue,” he says, adding that the urgent need to accommodate current data growth comes first.

This has worked well for one data centre operator, who identified the nation’s capital as a hot prospect for a high-security shared facility.

Canberra Data Centres (CDC) is an Australian-owned hosting company started in 2007 to serve power-hungry government data needs.

It counts among its clients the Australian Intelligence Community – the group consisting of the Office of National Assessments, the Australian Security Intelligence Organisation (ASIO), the Australian Secret Intelligence Service, the Defence Signals Directorate, the Defence Intelligence Organisation and the Defence Imagery and Geospatial Organisation.

“There has been a massive explosion in ICT computing requirement in the Federal Government in the last five years,” said Greg Boorer, CDC Managing Director.

“Government departments are deploying blade servers and high-density equipment in their data centres, which produce more heat and require more power. They require 10-30 kW per rack and their existing data centre infrastructure can’t cope with current loads, let alone future IT loads.”

Power alternatives

The company that built the CDC infrastructure, APC, says governments are leading the charge when it comes to energy efficiency and energy responsibility.

APC Vice-President for Asia Pacific, Gordon Makryllos, says the CDC was built to deliver “real business return in terms of green credentials.”

“Traditional data centres have a raised floor with cooling units on the side to pump cold air through the floor vents. But now they can’t push enough air to cool the racks so they cool the whole room into an icebox. We say that’s inefficient. We say bring (cooling) closer to the racks,” says Makryllos, using a champagne bucket as an analogy.

“It’s precision cooling. By cooling the racks, we capture the hot air from the back of the rack and also contain the heat in a small area.”

APC claims this method can cut energy costs by 50 per cent, with further savings in real estate from bringing the racks closer together.

“If you are using half of the energy in running your IT load, you’re saving double the energy at the generation point,” Makryllos says.
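Makryllos’s point is essentially arithmetic: every kilowatt avoided inside the data hall avoids still more fuel upstream, because generation and transmission are themselves lossy. The sketch below is a minimal back-of-envelope illustration; the PUE and grid-efficiency figures in it are assumptions for the example, not APC’s numbers.

```python
# Back-of-envelope illustration (assumed figures, not APC's):
# how a cooling-efficiency gain at the rack compounds once
# generation and transmission losses are included.

IT_LOAD_KW = 100        # assumed IT load of a small data hall
PUE_TRADITIONAL = 2.0   # assumed PUE for room-level "icebox" cooling
PUE_PRECISION = 1.5     # assumed PUE for rack-level precision cooling
GRID_EFFICIENCY = 0.40  # assumed fraction of fuel energy delivered to the site


def facility_kw(it_load_kw: float, pue: float) -> float:
    """Total power drawn at the facility for a given IT load and PUE."""
    return it_load_kw * pue


def generation_kw(site_kw: float, grid_efficiency: float) -> float:
    """Approximate primary power burned at the generation point."""
    return site_kw / grid_efficiency


for label, pue in [("traditional", PUE_TRADITIONAL), ("precision", PUE_PRECISION)]:
    site = facility_kw(IT_LOAD_KW, pue)
    source = generation_kw(site, GRID_EFFICIENCY)
    print(f"{label:12s}: {site:6.1f} kW at the site, ~{source:6.1f} kW at generation")
```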

But another cooling alternative is also gaining favour: free cooling through outside air.

The term ‘free’ is misleading, because of the investment in fans and fail-over infrastructure needed for when the outside air is not at an optimal temperature or humidity level. Still, the idea that a data centre can be cooled by natural outside air drawn through the facility, cooling the equipment and carrying the hot air back outside, is exciting.

Texas-based Ty Schmitt, Principal Thermal/Mechanical Architect for Dell Data Centre Solutions, says most of the hardware used today has its own built-in fans to cool the system down.

“The faster they spin and the more equipment you have, the cooler it needs to be. Then the faster the fans, the more power it consumes. In some cases, they are consuming more per rack, because the increase of consumption is exponential.”
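The steepness Schmitt describes follows from the standard fan affinity laws, under which fan power rises roughly with the cube of fan speed. The sketch below illustrates that relationship with assumed baseline figures, not Dell measurements.

```python
# Illustration of the fan affinity cube law (assumed baseline figures,
# not Dell's): fan power rises roughly with the cube of fan speed, so
# modest increases in required airflow drive disproportionate power draw.

BASE_SPEED_RPM = 5000   # assumed baseline server fan speed
BASE_POWER_W = 10.0     # assumed power draw of one fan at that speed


def fan_power_w(speed_rpm: float) -> float:
    """Approximate fan power using the cube-law affinity relationship."""
    return BASE_POWER_W * (speed_rpm / BASE_SPEED_RPM) ** 3


for speed in (5000, 6000, 7500, 10000):
    print(f"{speed:>6} rpm -> ~{fan_power_w(speed):5.1f} W per fan")
# Doubling fan speed costs roughly eight times the power, which is why
# hotter inlet air that forces fans to spin faster shows up so sharply per rack.
```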

The alternative, Schmitt suggests, is containment, where possible coupled with outside air, which Dell has just launched in Australia through the new Tier5 modular data centre housed in a disused Mitsubishi Motors factory.

“We put racks into containers or modules so I contain the exhaust air in the hot aisle and that air can’t go back around the front to the servers. It’s exhausted out.”

Modularity

“There’s opportunity for data centres to break the mould,” says Marty Gauvin, Founder of Tier5. “It’s flexible and I don’t have to build it all at once. It suits different needs, uses less power and has more redundancy.”

Sounds too good to be true?

Time will tell. The new Tier5 facility is a first for the newly-formed company by former Hostworks executives, and for Dell, which refined its container offering to free the modules from their end walls.

The modules are the width and height of traditional shipping containers (making them easy to manage), but open-ended in length so they can be easily manoeuvred and extended. Modules can be delivered to clients’ premises for testing, before being shipped to the facility.

“The traditional data centre is a room that works the same way, whether it’s in Sydney or the Sahara. Here we can use the outside air for about 200 days of the year,” says Gauvin.

Air is drawn and filtered from the outside through vents on the module’s end wall by large fans, while hot air from the racks is pumped out. Each module has its own power, IT (1,000 to 1,500 servers) and cooling units. Cooling is done by water chillers on hotter days.

He says by cooling the module and not the room, energy savings of 50 per cent can be achieved.

“We then further halve the carbon emission by using natural gas for power generation.”
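Gauvin’s figures can be sanity-checked with simple arithmetic: if outside air carries the cooling load for roughly 200 days a year and chillers carry it for the rest, annual cooling energy falls well below that of a facility running chillers year-round. The sketch below is an illustration only; the load and overhead figures in it are assumptions, not Tier5 measurements.

```python
# Rough annual cooling-energy comparison (assumed figures, not Tier5's):
# fan-only outside-air days versus chiller-assisted days.

FREE_COOLING_DAYS = 200   # from the article: outside air usable ~200 days a year
CHILLER_DAYS = 365 - FREE_COOLING_DAYS
IT_LOAD_KW = 500          # assumed IT load of one module
FAN_OVERHEAD = 0.10       # assumed: fans and filters draw ~10% of the IT load
CHILLER_OVERHEAD = 0.45   # assumed: chillers plus fans draw ~45% of the IT load

fan_only_kwh = FREE_COOLING_DAYS * 24 * IT_LOAD_KW * FAN_OVERHEAD
chiller_kwh = CHILLER_DAYS * 24 * IT_LOAD_KW * CHILLER_OVERHEAD
mixed_kwh = fan_only_kwh + chiller_kwh
always_chilled_kwh = 365 * 24 * IT_LOAD_KW * CHILLER_OVERHEAD

print(f"Mixed free/chiller cooling: {mixed_kwh:,.0f} kWh a year")
print(f"Chillers year-round:        {always_chilled_kwh:,.0f} kWh a year")
print(f"Saving:                     {1 - mixed_kwh / always_chilled_kwh:.0%}")
```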

Security

Nick Race of Arbor Networks, which provides security against Distributed Denial of Service (DDoS) attacks for major ISPs, says security is the main concern for clients considering co-location and cloud services.

“This idea of cross-attack within the data centre itself is a very real threat,” Race says.

“Co-location providers almost have a duty of care to ensure there is no cross-contamination through service level agreements.”

He advises customers to ask the tough security questions of their potential providers.

“Having firewalls is not enough and they may be really good at physical security, but when it comes to electronic security, they need to prove it.”

Tim Smith, Director, Bridge Point, a specialist in information security and compliance, says the question to ask is: ‘how are virtual machines secured against each other?’.

“It’s an architecture discussion. What mechanisms do they have to prevent a compromised virtual machine having a domino effect?”

Providers that follow The Open Group Architecture Framework (TOGAF) show the most maturity in terms of security, Smith says.

“If you get a ‘yes’, the next question is SABSA (Sherwood Applied Business Security Architecture). That’s the nirvana.”

At the very least, customers should engage an auditor to determine if a provider can truly deliver mission-critical security.

“More mature providers will comply with the ISO 27001 standard and with the PCI DSS (credit card payment standard) and Defence Signals Directorate standards for physical security. They are excellent starting points,” Smith adds.

Downtime

In an ideal world, data centres are available 24/7 and never fail. In the real world, the picture is different.

With the adoption of virtualisation and cloud technologies, previously good practice is slipping. Backup and recovery provisions are lacking, according to the latest annual Symantec Disaster Recovery Study.

The study found disparate physical and cloud resources are adding complexity to the protection of mission-critical data, with managers treating virtual, cloud and physical data environments differently.

Ninety-three per cent of 150 respondents in Australia and New Zealand said backup occurred only weekly or less frequently, despite organisations experiencing an average of six downtime events a year. They pointed to people, budget and space constraints as the top challenges in backing up virtual machines.

The survey also found managers are experiencing outages of six hours on average, three times the duration they say they expect.

“We are noticing an increase in the adoption of new technologies such as virtualisation and cloud computing, with the aim of realising cost savings and enhancing disaster recovery efforts. However, businesses have not yet mastered the art of managing data across these environments, leaving mission critical applications and data unprotected,” says Paul Lancaster, Director of Systems Engineering, Pacific, Symantec.

“We recommend that organisations adopt tools that provide a comprehensive solution with a consistent set of policies across all environments. Data centre managers should simplify and standardise so they can focus on best practices that help reduce downtime.”
