Proactive risk advice — leading the charge
IT audit needs to proactively assess emerging risks if it is to remain relevant, provide value to government agencies and protect against the most significant risks public sector organisations face.
In the early days of my career, I was given the opportunity to lead a conference to kick off an audit. It was on that day that I met my first Air Force general. After I had enthusiastically gone through my slides, the general said to me, “Do you know who you auditors are? You’re the ones who come in after the battle to bayonet the wounded.” As a young IT auditor, I felt crushed. I did not see IT audit or myself in that way. I was truly there to help improve things. Now, after having been an internal auditor for more than 23 years, I look back and think that the general may have been, partially, right.
Traditional IT audits tend to be retrospective. Internal auditors come in six months or a year after a project (battle) has ended — after the tough decisions have been made and the hard work completed — and second-guess (bayonet) management (the walking wounded), all with the benefit of 20/20 hindsight. Aside from the resentment, poor morale and distrust this breeds, we need to ask whether retrospective auditing really improves our organisations.
IT audit needs to shift from a retrospective audit/compliance focus to proactively assessing emerging risks if it is to remain relevant and provide value to our government agencies. Although retrospective auditing has an important role in helping ensure that controls are working, some of the biggest threats to our agencies are those we have not seen before or are very complicated and push us out of our comfort zones. Not only are we underestimating the value that we could be providing when we limit ourselves to retrospective, compliance-based audits, but increasingly, with risks associated with large public-facing system implementations, complex regulatory environments and cybersecurity, we are ignoring the most significant risks our organisations face.
Don’t wait until it’s too late
It is in our nature as internal IT auditors to want to ensure that the things we audit are in compliance with applicable rules and regulations. However, we need to avoid the trap of blindly enforcing flawed rules. We need to be asking, “Does this rule make sense?” The 2008 US mortgage crisis serves as a compelling example of what I call ‘compliance myopia’. Using a compliance-based checklist, even the most byzantine of mortgage products that were available in 2008 would likely have passed an audit or regulatory review. The form was correctly filled out for the sub-prime loan — check! However, the checklist did not have a box that asked whether this was a seriously flawed loan product that would ultimately pose an existential threat to mortgage companies and banks offering it.
This is not to say that auditors should unilaterally stop enforcing regulatory requirements or other rules. However, we should use our role as a bully pulpit to get tragically flawed rules corrected and not wait until our organisation — or global economy — is brought to the brink of disaster. As a profession, we need to move from a pure compliance focus to a strategic, risk-based focus.
In providing strategic, risk-based focus, there are several areas in which we can be proactive, such as cybersecurity, systems development, fraud detection, emerging technology, sourcing and procurement, performance evaluation, cost benefit analysis and the budgeting process.
No government agency has ever ceased to exist because it failed a timecard audit — but what about reputational damage; or loss of funding due to a major hack and loss of intellectual property; or a database breach that compromises citizens’ personally identifiable information (PII); or a multimillion-dollar system implementation failure?
Take, for example, several recent high-profile US government implementations, such as the system to sign up for government health care and the system to apply for federal government positions, which cost millions in taxpayer dollars yet were regarded as failures. What risks and reputational damage could have been avoided if we, as IT auditors, had been more proactive and our warnings about potential dangers had been heeded?
Of course, Australia’s government is not immune either. In a recent data breach, Islamic State publicly released the PII of Australian Defence Force employees and their relatives, and urged home-grown terrorists to attack them. In fact, a global cybersecurity survey of ISACA (Information Systems Audit and Control Association) members earlier this year found that 61% of Australia/New Zealand businesses and government bodies expect a cyber attack in 2015, yet fewer than half of ANZ IT professionals (43%) say they are prepared.
If we wait until six months or a year after strategic risks have occurred or data has been compromised, it may be too late. We need to get ahead of these risks, identify the vulnerabilities and make recommendations to fix them before they are exploited.
So what is stopping us? We are. Internal IT auditors fail to deliver timely, proactive, risk-centric, service-orientated audits for two main reasons: a misinterpretation of independence, and a lack of strong relationships with management and the audit committee.
Maintaining independence
Maintaining our independence is crucial if we are to provide unbiased recommendations. Although we should never make management decisions, this does not prevent us from providing proactive, risk-based recommendations. Consider most major system implementations: they can be very costly (eg, system integrators, software and hardware), are customer-facing, pose security risks to the organisation and its users if not correctly configured, and can damage the agency’s reputation and credibility if not correctly deployed. In fact, the number one reason system implementations go over budget is changes to the scope of work.
We don’t have to wait until after the system has been deployed to assess whether:
- the project team has mapped the system design to regulatory and functional requirements;
- basic project management practices are in place and include provisions for robust testing;
- contract terms are being met;
- internal controls have been considered; and
- people who will handle PII have undergone background checks.
In my department, we are involved in testing a new system implementation before it goes live. We would rather find configuration problems or security issues prior to the system being fully implemented. This helps to avoid costly redesigns and potential security breaches. We also conduct ‘white hat hacking’ on our network. If we can find ways in, you can bet the hackers can, too.
Early involvement and prioritisation
Even if we can all agree that proactive, risk-based auditing does not affect our independence, we may not have the kind of relationship with management and with our audit committee such that they would welcome our involvement. Building the right relationships requires consistent and high-quality products; candid, professional and frequent meetings; and highly trained and diversely skilled staff.
The objective is for management to see the auditor as a proactive risk adviser who provides added assurance that a wider variety of risks has been considered than if management had gone it alone.
When we start adding the largest threats to our audit plan, it can feel a bit overwhelming. The trick is prioritisation. Auditors should talk with the heads of the department and the audit committee and develop a collective understanding of the risks the organisation faces. This will provide a basis to prioritise resources and audit those things that present the highest level of risk.
If that leads to an area not addressed before, such as cybersecurity, the auditor will have to make a ‘build vs buy’ decision. Does the chief audit executive (CAE) have the requisite skills on staff that, with some training, will enable the team to use available industry best practices to assess cyber vulnerabilities? If the answer is ‘no’, the CAE will have to buy those skills by hiring outside resources. Although contracted resources can initially be expensive, ignoring existential risks such as cybersecurity is not an option; these risks will not go away on their own. For starters, the CAE should ensure he or she builds into contracts the requirement that the outside experts train the audit staff, so that there is a sustainable model for addressing these risks in the future.
Internal auditors are perfectly positioned to see across an organisation and to understand overarching risks. By being proactive and looking at issues of strategic importance, auditors can strengthen their organisations and help them successfully navigate the risks of an increasingly complex and dangerous world.