Are you up to Gershon's $1b data centre challenge?
The Gershon Report targeted the cost of operating data centres. David Braue explains who will need to change their practices, and how some agencies are already moving towards more efficient operations.
Lesley Milburn, City of Boroondara
Lesley Milburn wasn’t entirely sure her strategy was going to work, but she had no reason to doubt it either. Now, around 18 months later, and despite initial scepticism from some of her peers, Milburn – Information Technology Manager with the City of Boroondara in suburban Melbourne – has been shifting servers as part of a radical new data centre strategy that will halve the council’s data centre costs and dramatically simplify its management burden.
That strategy is built around a broad facilities-sharing agreement with the nearby City of Melbourne, in which the two councils will co-locate disaster recovery (DR) application servers in each other’s data centres. Melbourne has already set up a large number of servers that now occupy around half of Boroondara’s Camberwell data centre. By 2011, Boroondara will have relocated the systems from its existing disaster recovery facility in Hawthorn to Melbourne’s own facility. Day-to-day server management is being kept local, minimising administrative spend while maximising use of space – and the efficiencies provided by Boroondara’s newly refurbished data centre.
In an environment where the two-pronged challenge of risk management and political leanings can complicate even the smallest decisions, adopting such a dramatically different data centre policy was no small undertaking. But once Milburn found a kindred spirit in her City of Melbourne counterpart, CIO Geoff Brown, political concerns took a back seat to productive technological planning as the two spearheaded their data centre reinvention.
“When councils get together, they often agree there are synergies in their requirements, but there’s a bit of a mindset that they’re different enough that they couldn’t possibly [share infrastructure],” Milburn explains. “Critics said we would never be able to agree – but as determined as they were that it wasn’t going to work, we were equally determined that it would. And, in the end, it wasn’t a big deal. There were a number of things that had to be put in place, but it was more about getting the words in the agreement right. The actual transition has been quite smooth.”
Both councils expect big savings: by the time its secondary facilities are moved to Melbourne’s data centre next year, Boroondara will be able to completely decommission its Hawthorn facility – eliminating the power, cooling, and real estate investments that have for years equalled those required by its primary facility. This will immediately cut the council’s data centre expenses in half without compromising business continuity.
The new data centre mandate
The successful realisation of Boroondara and Melbourne’s shared vision reflects a new era of data centre reform. Organisations at every level of government are revisiting their often-groaning data centres with an eye to improving environmental efficiency, slashing running costs, simplifying their application infrastructures, and optimising day-to-day operations.
For years, such efforts have focused on pushing virtualisation technology into the core of the data centre: with early server virtualisation projects consolidating server functions in often-astounding numbers, virtualisation has been an easy way to pluck the low-hanging fruit from the bloated data centre tree.
In the long term, however, it may be Sir Peter Gershon that government CIOs have to thank for facilitating the rationalisation of their infrastructures. It was Gershon’s landmark 2008 review of government spending, after all, that fuelled the Department of Finance and Deregulation’s decision to pen the far-reaching Australian Government Data Centre Strategy 2010-2025.
For the government to do anything with a 15-year outlook reflects clear recognition that data centre reform is essential to containing costs, and the data centre strategy (AGDC) is no exception. As outlined by Finance and Deregulation Minister Lindsay Tanner on March 22 2010, the strategy is intended to address what has been a largely haphazard, piecemeal approach to commissioning data centres that has left the Commonwealth Government managing 30,000 m2 of data centre real estate at a cost of $850 million annually.
Gershon’s review identified data centre running costs as a key target for potential savings, recommending that the government could avoid $1 billion in data centre expenses by developing a co-ordinated data centre strategy for the next 15 years. That strategy will force departments to use a government-managed panel of approved data centre suppliers, impose a minimum data centre floor space commitment of 500 m2 to minimise unnecessary facilities duplication, and specify a minimum lease period of 10 years in a bid to foster longer-term data centre planning.
A catalyst for change
In some ways, the AGDC was particularly well-timed. A survey of the 103 agencies governed by the Financial Management and Accountability (FMA) Act – all of which will be subject to the new guidelines – found that 14 agencies, which account for 50 percent of all government ICT operating expenses, were considering a data centre move within two years. A further ten agencies, accounting for an additional 10 percent of expenditure, were planning a move within five years.
Coinciding with the Gershon report, the imminence of such major moves has made the AGDC a key plank of government ICT policy. It’s a safe bet that IT executives in those 103 agencies – and in the 91 guided by the Commonwealth Authorities and Companies Act, for which the guidelines are optional – will at some point find themselves dealing with John Sheridan, Division Manager of the Agency Services Division within the Australian Government Information Management Office (AGIMO).
“We’ve been doing co-ordinated procurement work since March 2008, but the ICT reform program has given a new mandate for AGIMO to act as a catalyst for change,” explains Sheridan, onto whose shoulders the successful execution of the AGDC has fallen. “Where we might previously have pursued general co-ordination or tried to encourage people to do things, now we are armed with a cabinet mandate. We’re consolidating demand and ensuring all agencies follow the best-practice arrangements of the better-performing agencies.”
With the weight of ministerial mandate behind him, Sheridan says the AGDC has fundamentally shifted the power balance when it comes to getting departments to play long: “The pre-Gershon arrangements could be categorised as opt-in,” he explains, “while the post-Gershon arrangements are now opt-out.” In other words, departments are expected to fall in line – with exemptions both rare and subject to rigorous evaluation that will make them very hard to come by.
AGIMO will administer the AGDC using regular benchmarking surveys, issued each July to track every department’s progress against a number of key metrics. This year is the third time these metrics – including adoption of virtualisation, usage of storage, and other measures – will have been collected; by the time the results are in come October, they may begin to show the effects as agencies follow Tanner’s mandate to pursue more aggressive data centre consolidation strategies.
Data centre best practice
Just what those strategies entail will only become clear over time. For large government agencies with many data centres, it may make sense to consolidate the equipment across those data centres into a smaller number of facilities. This approach simplifies management, but it also increases risk, changes information and physical security profiles, and heightens reliance on inter-site telecommunications – all issues that play heavily in the plans of any government CIO.
Smaller agencies face the biggest changes: with numerous small data centres typically installed around the country, it’s entirely possible that an average small department may need to radically centralise its IT infrastructure through heavy use of server virtualisation and a shift towards thinner environments at branch offices.
This was the approach taken by the Australian Health Practitioner Regulation Agency (AHPRA), a new statutory body set up to administer the national registration scheme for medical practitioners. As a green-field deployment, AHPRA was able to make some careful choices about its data centre – which is being hosted internally – and has relied heavily on VMware’s virtualisation platform to consolidate applications from eight previously separate state registrars.
“As a best-practice matter, it would be remiss of any organisation to not be looking at virtualisation as a valid solution for the data centre,” says Michael Hoffman, national ICT Program and Security Manager with AHPRA. “We’ll have one data centre for all the new agencies, and we are rolling a minimum of IT into the individual offices – one or two blade servers in a very small footprint requiring very few environmental controls and minimal or no IT staff.”
More ominous are the potential changes imposed by the 500 m2 minimum set in the AGDC: agencies requiring less than that space – and there will be many – will need to consider shifting to third-party hosting environments, where they will find themselves cohabiting with other departments or, potentially, private enterprises. Given the tight security and governance arrangements required in such situations, government CIOs will need to work carefully to ensure data centre rationalisation doesn’t compromise any other goals.
The prescriptive real-estate requirements set in the AGDC may seem rather arbitrary to some, but they’re in line with industry best practice. “A piecemeal approach is a very expensive and uneconomical way of going about meeting demand,” says Stephen Ellis, Director of data centre operator Technical Real Estate (TRE). “You only really get economies of scale in data centres if you build data centres of a certain size: facilities that can offer at least 2000 m2 of raised floor, as a starting point, are the way to go.”
That benchmark alone will create an incentive for change at many departments, which is where the government’s panel comes in: it will point departments towards data centre operators that have been vetted as suitable partners for their consolidation efforts.
Yet data centre operators anticipating a mad rush may find things a bit muted. “Speculative development of any nature has fallen by the wayside since the global financial crisis,” Ellis says. “There’s a lot of talk going around the market, but the activity coming out of it is very slow.”
Environmental nous
Expect activity to pick up over time, however, as the government’s continuing clampdown on data centre spending pushes departments to act more strategically. And, in line with growing environmental concerns in recent years, much of that action will revolve around the business case that goes hand in hand with better data centre environmental credentials.
Once a feel-good afterthought, environmental performance has gained currency in the post-Gershon era as agencies count the cost – not only in abstract terms like carbon emissions, but in cold hard dollars related to spiralling cooling and power bills. The AGDC report found that the government spends $170 million on data centre electricity alone every year; of that, $100 million goes to environmental controls such as cooling, and just $70 million to actually powering the equipment.
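Those spending figures offer a rough sanity check on efficiency. A back-of-envelope sketch, assuming electricity spend is roughly proportional to power drawn, shows what they imply in terms of the industry’s Power Usage Effectiveness (PUE) metric:

```python
# Back-of-envelope reading of the AGDC electricity figures. Using spend
# as a proxy for power drawn, total spend / IT-equipment spend roughly
# approximates PUE (Power Usage Effectiveness = total facility power
# divided by IT equipment power).
total_spend_m = 170   # $M/year, all government data centre electricity
it_spend_m = 70       # $M/year, power actually running the equipment
overhead_m = total_spend_m - it_spend_m  # $100M/year: cooling and controls

implied_pue = total_spend_m / it_spend_m
print(f"Implied PUE: {implied_pue:.2f}")  # roughly 2.4
```

An efficient purpose-built facility typically targets a PUE well under 2.0, which suggests the scale of the headroom the strategy is chasing.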
“When you think that a data centre uses ten times as much power as a standard office building per square metre, you really have to look at data centres when you talk about reducing the power footprint,” says TRE’s Ellis.
The benefits that reduction provides will bring environmental objectives front and centre as departments work to push towards or beyond Gershon’s $1 billion savings target. As a side benefit, the AGDC report noted claims that modern data centre technology can cut the government’s data-centre carbon footprint from 300,000 tonnes to around 260,000 tonnes per year.
Hoffman has seen this effect first-hand: simply consolidating the servers of the separate agencies has delivered a 20 to 25 percent reduction in power usage. “That was directly measurable in kVA usage at our UPS,” he explains. “We were able to run three times as many virtual machines [VMs] on each server as we could before; just that consolidation meant a huge difference in terms of power usage.”
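That kind of UPS reading translates directly into dollars. A minimal sketch, with invented load and tariff figures (not AHPRA’s actual numbers), of converting a measured drop in UPS load into an annual electricity saving:

```python
# Hedged sketch: turning a UPS load reduction (measured in kVA, i.e.
# apparent power) into an annual dollar saving. All figures below are
# illustrative assumptions, not AHPRA's measured values.
kva_before, kva_after = 40.0, 31.0   # assumed UPS apparent power load
power_factor = 0.9                   # typical for modern server PSUs
tariff = 0.20                        # assumed electricity price, $/kWh
hours_per_year = 24 * 365

kw_saved = (kva_before - kva_after) * power_factor  # real power saved
annual_saving = kw_saved * tariff * hours_per_year
print(f"Annual saving: ~${annual_saving:,.0f}")
```

Because servers draw power around the clock, even a modest kilowatt reduction compounds into a meaningful line item over a year.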
Virtualisation
Tarquin Bellinger, Product Manager, Global Switch
Virtualisation does present challenges, however. AHPRA quickly found the number of VMs skyrocketed past 50 as staff realised just how easily they could create new servers. The organisation’s eight physical servers had no trouble keeping up with so many VMs, but those systems do impose a burden in terms of everyday management.
This is another reason agencies may do particularly well tapping into the economies of scale of larger data centres: their operators have the skills and the commercial imperative to keep power and cooling costs down in ways that departments often struggle to match. They can also justify, and amortise, the cost of high-end security and access control systems that would overwhelm the business case for smaller government departments.
Tarquin Bellinger, Product Manager with Global Switch-allied hosting provider Harbour MSP, says the company’s highly-optimised facilities are 15 to 20 percent more power-efficient than a typical data centre running out of a departmental office. Combined with their real estate efficiencies, Bellinger argues that hosting providers are already set up to support the AGDC’s goals – and foster facilities collaboration between departments working to get in line.
“There are significant improvements with purpose-built facilities,” he explains. “It’s no different than with large enterprises: many have one division doing one thing and another division doing another; often, new procurement strategies bring them together.”
The big picture
The race, then, is on: AGDC will guide the 24 agencies planning to move data centres within the next five years, potentially trimming considerable fat from the lion’s share of government data centre expenditure – and reining in steadily expanding real estate and resource requirements in the process.
While the policy doesn’t directly affect state and local-government councils, IT strategists in those organisations should also be following it closely: its principles, and the change it will engender, go straight to the heart of the change imperatives that face any organisation with a data centre. And if the Commonwealth Government can successfully slim down its data centres, any government organisation can do the same.
In the long term, Sheridan believes the widespread application of the new policy could eventually trim the government’s data centre footprint by half or more – slashing it to a “best possible outcome” of around 10,000 to 15,000 square metres in total. Of course, whether these savings come from natural consolidation of business systems, or from more dramatic efforts along the lines of the Boroondara-Melbourne facilities-sharing arrangement, will depend on the departments and imperatives in question.
Department IT managers will also have to deal with another point made in the AGDC report – that data centres are cheaper to cool in climes where the air temperature is below 16°C. That could see pressure to relocate rationalised data centres to Victoria, Tasmania, or rural areas where temperatures are often a few degrees cooler. Such moves might save on cooling costs, but they would introduce other issues for planners to weigh.
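The economics behind that 16°C figure come down to “free cooling”: during each hour the outside air is cold enough, chillers can be switched off in favour of outside-air economisers. A toy model, with invented figures for chiller draw, tariff, and climate (not climatological data for any real site):

```python
# Toy free-cooling model. Every hour the outside air is below ~16 deg C,
# an air-side economiser can stand in for the chillers, so cooler sites
# bank more chiller-free hours. All numbers are invented for illustration.
chiller_kw = 120       # assumed chiller electrical draw when running
tariff = 0.20          # assumed electricity price, $/kWh
hours_per_year = 24 * 365

for site, frac_cool_hours in [("cooler site", 0.75), ("warmer site", 0.35)]:
    free_hours = frac_cool_hours * hours_per_year  # hours below threshold
    saving = free_hours * chiller_kw * tariff
    print(f"{site}: ~${saving:,.0f}/year in avoided chiller power")
```

The spread between the two sites is the trade-off the report alludes to: cheaper cooling in cooler regions, weighed against the connectivity and staffing implications of relocating.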
No matter what shape it takes, the journey will likely be one of support rather than unilateral pressure: every department wants to operate more efficiently, so in many cases the AGDC will provide long-desired impetus for expenditure of time, money and effort that had previously been put on the back burner thanks to the uncertainty and resource constraints of the GFC.
Philip Sergeant, Vice President of Research for data centre systems with Gartner, agrees: “Some of the smaller agencies that don’t have the colossal infrastructure, software, hardware, skills or people will probably be attracted to it,” he says.
“If there’s success shown in the short term, they may want to accelerate it. Success breeds success.”