It's time to re-evaluate public cloud migration


By Jason Van der Schyff, COO, SoftIron
Wednesday, 22 September, 2021


Does the new approach to building national critical infrastructure go far enough? The DTA’s Hosting Certification Framework is a step in the right direction, but true data sovereignty must begin at the hardware level.

The Hosting Certification Framework introduced by the Digital Transformation Agency (DTA) earlier this year calls for a new approach to building critical national infrastructure.

The framework stipulates that all relevant government data must be stored only in ‘certified assured’ or ‘certified strategic’ data centres. This new approach follows concerns about the acute data challenges confronting the Australian public sector and aims to mitigate data centre ownership risks, including data sovereignty, supply-chain vulnerabilities and cybersecurity threats.

It is a timely addition to Australia’s regulatory framework, arriving as nations around the world respond to the challenge of building sovereign resilience. At the hosting level, at least, it provides a framework for better auditing and assurance of those facilities and their operation.

However, there’s an issue. Are we shutting the cabinet door after the data has already bolted? What about all the data and services that have already left the infrastructure and facilities run by our public servants and are now run in the public cloud?

The scramble to the cloud in the pandemic

Migration to the public cloud in both the public and private sectors has been underway for a decade or more, but it accelerated sharply in the last 18 months as organisations rushed to the cloud to maintain operations during the pandemic.

However, organisations are now left figuring out how to adequately protect the data and services that are hosted in the cloud. Cloud services are essentially ‘someone else’s computer’, with much of that computer hidden behind a wall: organisations using cloud services have little to no real visibility into how the infrastructure is actually built and operated, and must instead rely on trust.

Time to take a breath and review where we are. The way in which organisations, particularly the Australian public sector, store and manage data now has the opportunity to change for the better. It’s important for government agencies to understand the four key areas of concern for cloud services and how to overcome these challenges as they look to benefit from the new hosting framework.

And to identify the best pathways towards sovereign resilience, we first need to understand the recent drivers of the mass move to the public cloud, and the common underlying issues in the infrastructure used to provide public cloud services.

So many eggs, so few baskets

The benefits of the public cloud are clear: reduced management and maintenance overhead, pay-as-you-go provisioning of popular software tools and platforms, and access to the resilience and availability a large-scale provider can offer, to name a few. But those benefits don’t come without their fair share of risks: by outsourcing such a large volume of data and services to such a small number of providers, the nation puts its financial stability at risk.

Accelerated public cloud adoption — the response to a world in lockdown

The COVID-19 pandemic has understandably been a massive driver of public cloud adoption in recent years, bringing about a dramatic shift in how and where data is created, processed and stored. This presented ICT teams in both the public and private sectors with considerable challenges, as entire workforces switched to remote work almost overnight.

For those organisations previously based entirely around on-premises operational models, this was a major shift, and few had the onsite infrastructure or support network to handle the sudden demand for cloud services.

The public cloud is at its best and most useful in such scenarios — providing a scalable and elastic buffer of computing power, delivering services on-demand, where and when required, while handling peaks and troughs with ease.

It’s no surprise, then, that across many sectors, particularly government, the transition to the cloud has been swift. There can be no doubt that access to public cloud services is now a critical tool in every industry.

‘Just someone else’s computer’

As organisations adjust to the new way of working, the complexity of the public cloud and the risks it poses are becoming more apparent. Cloud services are sometimes referred to as ‘just someone else’s computer’, and that phrase captures the core risk of the public cloud entirely.

Public cloud vendors will assert that by not disclosing information they are, in fact, doing you a service, because vectors of attack are obscured. This provides little assurance for regulators and others with a vested interest in securing business and economic continuity should downtime occur or, worse, should data or services be lost entirely in the event of an attack.

Compromised systems cannot be saved by security controls

The industry today takes a largely ‘information-centric’ approach to security; that is, it assumes that every system is compromised. The trouble is that security controls can generally only be applied to hardware systems that have already booted up and loaded their operating systems.

This leaves a window of opportunity for bad actors to infiltrate hardware and execute attacks that alter normal operations during the bootup process before the usual controls can be applied. This can lead to disrupted operations, monitoring of sensitive information, stolen or corrupted data, or even instances where complete control over a system can be taken.
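To make that window concrete, here is a minimal sketch in Python of the measured-boot idea that addresses it (the stage contents are hypothetical, and this is an illustration rather than any vendor’s implementation): each boot stage is hashed into a running measurement before it executes, so tampering anywhere before the operating system loads changes the final digest, even though no OS-level control has run yet.

```python
import hashlib

def extend(register: bytes, stage_image: bytes) -> bytes:
    # Fold the hash of one boot stage into the running measurement,
    # in the style of a TPM PCR extend: new = SHA-256(old || SHA-256(stage)).
    return hashlib.sha256(register + hashlib.sha256(stage_image).digest()).digest()

def measure_boot(stages):
    # Measure each stage *before* it runs, starting from a zeroed register.
    register = bytes(32)
    for stage in stages:
        register = extend(register, stage)
    return register

# Hypothetical boot chain: firmware -> bootloader -> kernel.
clean = [b"firmware v1.0", b"bootloader v2.3", b"kernel 5.15"]
tampered = [b"firmware v1.0 + implant", b"bootloader v2.3", b"kernel 5.15"]

print(measure_boot(clean).hex())     # reference measurement
print(measure_boot(tampered).hex())  # differs: the pre-boot implant is visible
```

The two digests differ, which is the point: a modification made before the operating system loads is detectable only if something measures the boot chain as it happens, not after.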

To overcome these challenges, organisations need to achieve true data sovereignty, and that needs to begin at the hardware level. The only way to ensure that each and every component on the circuit board is doing its assigned task is hardware that is transparent and auditable through Secure Provenance.

Secure Provenance ensures that the appliance is true, that it is precisely as designed and specified, with no hidden code or additional components. Achieving this means organisations can have a 360° transparent view into the entire design, supply chain, manufacture and delivery path of data centre appliances.
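As a rough illustration of what ‘auditable’ could mean in practice, here is a sketch in Python of checking delivered artefacts against a manufacturer-published hash manifest. The manifest format, file names and function names are hypothetical, introduced for illustration; they do not describe SoftIron’s actual process.

```python
import hashlib
import json
from pathlib import Path

def sha256_file(path: Path) -> str:
    # Hash a delivered artefact (firmware image, bitstream, package, etc.).
    return hashlib.sha256(path.read_bytes()).hexdigest()

def audit_against_manifest(manifest_path: Path, artefact_dir: Path) -> bool:
    # The manifest is a hypothetical JSON map of {filename: expected_sha256}
    # published by the manufacturer alongside the delivered hardware.
    manifest = json.loads(manifest_path.read_text())
    all_ok = True
    for name, expected in manifest.items():
        actual = sha256_file(artefact_dir / name)
        if actual != expected:
            print(f"MISMATCH {name}: expected {expected}, got {actual}")
            all_ok = False
    return all_ok

# Usage sketch:
# audit_against_manifest(Path("manifest.json"), Path("delivered/"))
```

In a real deployment the manifest itself would need to be signed and the chain of custody documented; the point is simply that provenance becomes something an operator can check, rather than take on trust.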

While cloud services offered organisations a scalable solution to maintain business continuity at the beginning of the pandemic, the lack of visibility and transparency means Australian data is at risk. Instead of hosting data on someone else’s computer, organisations need to strive for Secure Provenance.

The Hosting Certification Framework proposed by the DTA is a step in the right direction towards mitigating data centre ownership risks. However, true data sovereignty needs to begin with the hardware hosted within that framework, to ensure that it is true, transparent and completely auditable.

