The evolution of government analytics
The ready availability of both data and analytics is key to the digital transformation culture that is sweeping the Australian public service.
If ever there was a sign that New South Wales is taking IT transformation particularly seriously, it came with the August announcement that the state would establish a whole-of-government Data Analytics Centre (DAC). Unlike earlier reform efforts such as the customer service-focused Service NSW, which works to correct past shortcomings in government service delivery, the DAC will see the state proactively embracing new technologies to improve its future.
It’s a significant difference in philosophy, one that reflects both the rapid mainstreaming of big data analytics capabilities and the maturing of analytics within government, where individual departments have previously implemented it with varying degrees of success depending on their resources and capabilities.
“Data is one of the greatest assets held by government, but when it’s buried away in bureaucracy it is of little value,” NSW Minister for Innovation and Better Regulation Victor Dominello said in announcing the strategy, which has entrusted the shaping of the DAC to an expert steering committee and recently saw the appointment of Dr Ian Oppermann, former director of the CSIRO Digital Productivity Division, as acting NSW chief data scientist.
Significantly, the NSW Government is backing the DAC with legislation that will prevent it becoming a toothless tiger: the Data Sharing (Government Sector) Bill 2015, passed by Parliament in November, not only gives Oppermann the power to demand datasets from NSW agencies but sets the ball rolling for a sea change in government, one predicated on the open availability of government datasets such as the mass of data from NSW’s Opal card system due to be published this year through the state’s Open Data Hub.
If the availability of this and myriad other datasets reflects the government’s commitment to open data — outlined in the state’s 2013 NSW Government Open Data Policy and similar guides such as Victoria’s DataVic and Western Australia’s open data site — then the formation of the DAC represents the fruition of that vision, providing the supporting analytics capabilities to help agencies make the most of it. This focus on execution puts it ahead of other efforts such as the Whole of Government Data Analytics Centre of Excellence, which has the ATO as lead agency but is focused more on advice than on execution.
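For agencies and developers wanting to build on that commitment, getting at published datasets is largely a matter of calling a catalogue API. The sketch below assumes the portal exposes a CKAN-style API, as many government open data hubs do; the base URL and search term are illustrative assumptions rather than confirmed endpoints for the NSW Open Data Hub.

```python
# Minimal sketch: discover CSV resources on a CKAN-style open data portal.
# The base URL and search term are illustrative assumptions only.
import requests

BASE_URL = "https://data.nsw.gov.au"   # assumed CKAN-backed portal
SEARCH_TERM = "opal"                    # e.g. Opal card patronage data

def find_csv_resources(base_url: str, query: str, limit: int = 5):
    """Search the portal's catalogue and return CSV resource URLs."""
    resp = requests.get(
        f"{base_url}/api/3/action/package_search",
        params={"q": query, "rows": limit},
        timeout=30,
    )
    resp.raise_for_status()
    packages = resp.json()["result"]["results"]
    urls = []
    for pkg in packages:
        for res in pkg.get("resources", []):
            if res.get("format", "").upper() == "CSV":
                urls.append(res["url"])
    return urls

if __name__ == "__main__":
    for url in find_csv_resources(BASE_URL, SEARCH_TERM):
        print(url)
```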
The open exchange of a broad range of datasets “is really what should be defining big data”, said Theo Gazos, director of analytics specialist firm Predictive Analytics Group, who argues that big data adopters “should, before they run out to spend millions on technology designed to capture all of this information, think about analysing the information that they currently have and collect”.
“It’s all about applying the right analytics,” he said, “to extract the insights that are hidden in the data — and being able to generate new forecasts as projections change.”
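Gazos’s point about regenerating forecasts as new figures arrive can be illustrated with a very small model. The sketch below uses simple exponential smoothing, one of many possible techniques rather than anything Gazos or Predictive Analytics Group specifically endorses, and the weekly transaction counts are invented for illustration.

```python
# Minimal sketch: refresh a one-step-ahead forecast each time a new
# observation lands, using simple exponential smoothing.

class ExponentialSmoother:
    """One-step-ahead forecaster using simple exponential smoothing."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha      # weight given to the newest observation
        self.level = None       # current smoothed level (the forecast)

    def update(self, observation: float) -> float:
        """Fold in a new observation and return the updated forecast."""
        if self.level is None:
            self.level = observation
        else:
            self.level = self.alpha * observation + (1 - self.alpha) * self.level
        return self.level

# Hypothetical weekly transaction counts at a government service centre.
weekly_transactions = [1040, 980, 1110, 1205, 1180, 1320]

model = ExponentialSmoother(alpha=0.4)
for week, count in enumerate(weekly_transactions, start=1):
    forecast = model.update(count)
    print(f"week {week}: observed {count}, next-week forecast {forecast:.0f}")
```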
In algorithms we trust
Centralising the data analytics function — much as has been done at the state level with shared services or data centre facilities in the past — has emerged as a global trend as analytics is pushed to every corner of government. US states such as North Carolina and Indiana have already used the approach to simplify access to big data capabilities that had previously been hamstrung by competition between departments for limited funding.
Such competition tends to favour larger departments with better delineated business cases, leaving smaller departments and peripheral agencies to go it alone or forego analytics entirely. Previous efforts have tried to address this issue at various levels, with guidance such as the Victorian Government Reporting and Analytics Framework and the federal government’s Australian Public Service Better Practice Guide for Big Data, issued in 2015.
Yet the ready availability of both data and analytics is key to the digital transformation culture that is sweeping the Australian public service, particularly on the back of the Digital Transformation Office’s (DTO’s) commitment to integrating analytics and the technology’s role within the DTO’s Digital Service Standard.
This transformation also reflects the transition to what Gartner has labelled “the algorithm economy”, a maturation of analytics capabilities that recognises the growing importance of analytics as a strategic tool for both private and public sector organisations.
“The algorithm trumps the data that it accesses,” Gartner’s manifesto explains. “The next digital gold rush will be focused on how you do something with data, not just what you do with it... Organisations will be valued based not just on their big data, but the algorithms that turn that data into actions and ultimately customer impact. For CEOs it’s a call to focus now on their proprietary algorithms, not just their big data.”
Algorithms will come into their own not only as tools for generating new revenues — relevant in the private sector context — but as conductors for machine-to-machine (M2M) interaction that will complement public sector agendas around the construction of smart cities, already highlighted through studies in Queensland cities such as Townsville and the Gold Coast.
In these contexts, operational efficiencies will come not only from the analysis of historical data, but from the rapid application of algorithms to ongoing feeds — from sensors, capital equipment and even buildings — to optimise their operation in real time. This change will not only require CIOs to weave these real-time capabilities into their analytics planning, but will demand better relationships across departments and between agencies to identify functional commonalities and exploit mutually beneficial operational datasets.
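As a rough illustration of the shift from batch analysis to acting on a live feed, the sketch below applies a rolling-window check to a simulated stream of building power readings and flags values that drift well outside the recent norm; the readings, window size and threshold are all hypothetical.

```python
# Minimal sketch: apply an algorithm to an ongoing feed rather than a
# historical batch, flagging readings far above the recent average so
# equipment could be adjusted in near real time.
from collections import deque
from statistics import mean, stdev

def monitor(readings, window: int = 12, z_threshold: float = 3.0):
    """Yield (reading, is_anomaly) pairs as each new reading arrives."""
    recent = deque(maxlen=window)
    for value in readings:
        if len(recent) >= window:
            mu, sigma = mean(recent), stdev(recent)
            is_anomaly = sigma > 0 and abs(value - mu) > z_threshold * sigma
        else:
            is_anomaly = False   # not enough history yet to judge
        yield value, is_anomaly
        recent.append(value)

# Simulated feed of power readings in kW, with one spike a control
# system might act on.
feed = [41.2, 40.8, 42.1, 41.5, 40.9, 41.8, 42.0, 41.1, 40.7, 41.6,
        41.3, 41.9, 58.4, 41.4]

for value, flagged in monitor(feed):
    if flagged:
        print(f"reading {value} kW is anomalous; trigger an operational response")
```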
Centralising the analytics function
The specifics of the DAC’s architecture are still being nailed down, but given trends in analytics delivery it’s likely there will be a heavy cloud component. Broader use of cloud-based analytics will provide impetus for this transformation — as will technology providers such as SAP, which has joined its competitors in pushing its analytics capabilities hard into the cloud.
“This is a beautiful situation for our public sector customers,” said Clemens Praendl, global head of analytics with SAP. Fragmented analytics systems, he said, mean that “organisations are running a more intransparent model than a transparent model, with all of these islands in organisations where decision-makers don’t have the overall view to make appropriate decisions.
“If we could bring that data together,” he added, citing SAP’s recent work in helping the US state of Indiana home in on prenatal care deficiencies, “government organisations could use their data and their labour much more efficiently. They were previously trying to make good decisions based on the information they had — but they didn’t have all the information because they hadn’t brought all the data together.”
Recognising its strong presence within government, SAP recently launched a Canberra-based Institute for Digital Government, which represents a $150m commitment from the company and will have analytics as a core part of its work.
Governments “typically don’t even know what their services cost because there is no measurement of this”, Isabelle Groegor-Cechowicz, SAP global general manager for public services, recently told GTR. “There is no analytical side to it. Which means we are at a very interesting point to be able to provide the technology that will be the platform.”
Despite growing government enthusiasm for all things cloud, many agencies will need to remember the sensitivities around their data before rushing wholeheartedly into publishing and analysing it. This may drive many agencies to build hybrid models that combine cloud-based analytics with on-premises systems, providing both a smoother migration path and a fallback approach that respects individual governance controls.
“The need for a hybrid set-up is becoming even more important because there are certain data that are sensitive,” said Dan Miller, APAC director of cloud with analytics giant Splunk, which has been working to bolster analytics techniques with the addition of machine-learning capabilities to smooth analysis of large datasets around security and related areas.
Australian users “are probably more ahead than the US in terms of people’s willingness and readiness to go to the cloud”, Miller added. “The model is very much aligned with investment growth.
“You can have the end-to-end visibility for your security posture, SLA management or anything else you might want,” he continued. “The key for government bodies will be aligning take-up with customers’ strategies around what they are doing, and their data architectures. Customers understand that there no longer needs to be a trade-off between managing workloads on premises and losing visibility.”
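In practice, the hybrid posture Miller describes often comes down to a routing policy: data classified as sensitive stays on-premises while the rest can be analysed in the cloud. The sketch below illustrates the idea in general terms only; the classification labels and endpoint addresses are hypothetical and do not reflect any particular vendor’s product or API.

```python
# Minimal sketch of a hybrid governance pattern: keep datasets tagged as
# sensitive on on-premises infrastructure, send the rest to a cloud
# analytics service. All names and endpoints are hypothetical.
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    classification: str   # e.g. "public", "internal", "sensitive"

ON_PREM_ENDPOINT = "https://analytics.internal.agency.gov.au"   # assumed
CLOUD_ENDPOINT = "https://analytics.example-cloud.com"          # assumed
SENSITIVE_LABELS = {"sensitive", "protected", "health"}

def route(dataset: Dataset) -> str:
    """Return the endpoint this dataset is allowed to be analysed on."""
    if dataset.classification.lower() in SENSITIVE_LABELS:
        return ON_PREM_ENDPOINT
    return CLOUD_ENDPOINT

datasets = [
    Dataset("public-transport-patronage", "public"),
    Dataset("child-protection-case-notes", "sensitive"),
]
for ds in datasets:
    print(f"{ds.name} -> {route(ds)}")
```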
The analytics transformation
In government as elsewhere, the spread of analytics has long passed the tipping point; as the technology evolves from novelty into policy enabler, agencies will increasingly need to build analytics capabilities into all manner of IT-led business projects.
This change has been highlighted by Gartner as one of 10 key strategic technologies for government going into 2016 and beyond, with worldwide government spending on ICT products and services expected to grow from US$431 billion in 2015 to US$475.5 billion by 2019.
“These strategic technology trends have substantial disruptive potential that is just beginning to materialise and will reach an inflection point within the next three to five years,” research director Rick Howard said.
“Public sector CIOs can capitalise on the value of these trends by first determining how they will impact government program operations or service delivery models, and then by building the organisational capabilities and capacity needed to support them.”
In many cases, this will involve the tailoring of analytics capabilities for mobile-wielding users who will increasingly turn to real-time analytics services as a driver for decision-making on the fly.
This expansion of analytics towards the edge of the organisation — powered by centralised analytics of the type the NSW DAC will deliver — will, Gartner has predicted, involve the application of algorithms and cognitive computing to “make real-time assessments about what will happen or what should happen”.
They will also be pervasive — embedded into business processes and applications — and invisible, operating continuously in the background to track user activity, process sensor and environmental data, and dynamically adjust workflows as events unfold.
Yet even as the governmental analytics capability continues to mature, some analytics experts warn that many of the old rules apply. Executive ownership, in particular, is critical: “You’ve got to have somebody that owns this, owns its budgeting and direction, and owns the making sure that there is consistency across it,” Bill Franks, Teradata chief technology officer, explained.
That analytics owner will handle both the internal analytics agenda and the government organisation’s relationship with centralised service providers, Franks said, noting that in many companies, executives have no idea who they would contact with an analytics-related enquiry.
“When something becomes important to an organisation at a strategic level, there is an executive owner of that,” he explained. “Without somebody owning it, you can’t possibly really make it an embedded part of what the organisation does every day; you can’t possibly succeed in having massive-scale operationalised analytical processing. The CFO and everyone will know that person’s name.”