Robodebt and digital government: failure demands a broader redesign

Digital government in Australia needs a redesign of the big systems exposed as flawed by the Royal Commission.

Written by

Lesley Seebeck

In the preface to the Report of the Royal Commission into the Robodebt Scheme, Commissioner Holmes reflects on how she was startled by the myriad ways the scheme failed the public interest. A fundamental failure was institutional—the Australian Public Service (APS) did not live up to its formal responsibilities or accountabilities, the trust placed in it by the Australian public, or the expectation of it as an institution underpinning democratic governance.

However, to quote from the Royal Commission itself, ‘[t]hose hoping for recommendations for wholesale reform of the APS may be disappointed.’ (p637) Indeed.

Solutions are possible: creating a model for digital government, giving citizens rights to their data, providing the APS with a ‘sandbox’ to model, test and harden policy ideas as well as tech solutions, and strengthening the Commission’s recommendations aimed at greater transparency and accountability, especially when it comes to system design.

To be fair, the task of wholesale reform of the APS is beyond the scope of this Royal Commission. The Commission report makes it evident that there is much that the APS should have been doing, by its own legislation, rules and practices, that it wasn’t. There’s also much that’s been broken over time—for example, the disestablishment of the Administrative Review Council, and failures in the new policy proposal (NPP) and budget processes.

The task of fixing the APS is immense—it is beyond self-repair. It is also beyond the 2019 Thodey Review, on which the Commission places considerable weight. The Thodey Review addressed an APS of the mid to late 2010s; matters have slipped further since. And neither it nor the Commission really tackles the issue of technology.

All policy delivery, whether the payment or clawing back of welfare, is underpinned by technology systems. Most personal data is split across two agencies, Services Australia and the ATO, each with massive technology shops. Many decisions are facilitated by data matching between the two agencies, each with its own legislative requirements and standards.

The Commission considered automation of decision-making in Robodebt ‘a massive systemic failure’ (p484). The management of data-matching—from the matching itself through to the data’s use and disposal—was far from rigorous, and was shown to be in breach of legal and regulatory requirements and voluntary codes.

While the Commission focused on the legislative landscape, it is impossible to separate the policy from its realisation in the technological substrate. Moreover, the technology substrate has a long tail—it, and how it treats data and, by extension, people, will likely continue well after the purpose of the original policy that brought it into being has been forgotten. While the Robodebt PAYG data-matching itself may have ceased, data exchange programs (p460) and, more than likely, applications and systems have been retained.

Technological systems do not lend themselves to the grace and forgiveness needed when dealing with the natural wickedness and changeability of people’s lives. As is evident in the Report, they tend to be rigid; they reduce people’s circumstances to notation for the purposes of processing, and in doing so they lose context. In short, even if the legislative aspects are sorted out—unlikely—the very nature of reducing aspects of people’s lives to be, in James C Scott’s terms, ‘made legible’ by such technological systems is likely to continue to cause harm.

Moreover, even were such systems designed with a user-first or user-centred perspective (they weren’t), or for that matter, using security- or privacy-by-design principles (also not evident), they would need to be operated, updated and maintained with those perspectives foremost in mind. In contrast, the Commission Report presents a barrage of legislative and regulatory requirements ignored, and lack of governance, documentation and accountability in the design, operation and management of the data and IT systems.

The Commission suggests that the prospect for increased use of automation and AI is ‘not all doom and gloom’, with the possibility of considerable upside ‘when done well’. But—and it is a big but—the ‘concept of “when done well” is what government must grapple with as increasingly powerful technology becomes more ubiquitous.’ (p488)

It’s hard to consider that a reasonable prospect. There is little reason to think that the agencies concerned—Services Australia, the ATO—will have improved their culture, behaviours, processes, governance, and expertise sufficiently to generate adequate confidence in their handling, use and retention of citizen data. Regulatory authorities, such as the OAIC and Ombudsman, have proven timid. Recommended changes may help, but there will be questions about their technical expertise and ability to enact change given the inertia of the agencies’ tech stacks.

The push for data-matching, automation and AI continues apace within government without engagement, accountability, or transparency. The drivers are multiple: vendor pitches, ministerial frustration, efficiency drives (especially when attached to new policy proposals), and perhaps most corrosively, administrative convenience, as in Robodebt. The budget process encourages fragmented and opportunistic proposals that tend to overpromise both in terms of outcomes and timeframes.

There is nothing in the current reform program that addresses the factors shaping the current technology environment, from either the demand or the supply side. At a minimum, without a coherent, conceptual model of digital government—appropriate to a digital democracy—and a means of forcing adherence to that scaffolding, the Australian government will continue to be plagued by piecemeal, jerry-rigged, insecure, vulnerable, ill-adaptive and expensive technology that in turn will contaminate policy delivery and impair government.

Finally, data-matching. Data-matching was introduced by the Hawke government in 1991. The mindset that has guided its implementation since then—reinforced by massive data collection and growing technological capacity—is one of enforcement, backed by punitive measures: Morrison’s ‘welfare cop’ language is not atypical (p28).

Data-matching clearly gives government considerable information, surveillance capacity and potential coercive power over individuals: there is a massive power imbalance. Critically for a democracy, there is no transparency, reporting or accountability to the public, including on people’s own personal data and its use, nor any substantive means of recourse when, as in the case of Robodebt, their data has been misused. Instead, it is seen as a resource, a ‘gold mine’ (p457), for the convenience of government officials. That breaks trust with the public and undermines democracy—not to mention representing a massive honeypot from a cybersecurity perspective.


There are no easy, quick fixes: the corrosion is deep. However, we need to start.

First, generate a coherent, conceptual model of digital government, and use that to scaffold future technological choices.  Create an agency—not the DTA—with the remit and funding to enforce both the conceptual model and government choices.

Second, provide individuals with the inalienable right to their personal data and control over its use, with sanctions for its illegal access and use. This requires a fundamental shift in thinking about individual rights, privacy and security; it will also require a re-engineering of how agencies deal with personal data. Those agencies must be held to account in how they manage, secure and use people’s data.

To those I would add, from James C Scott, a third on the design of new policy and delivery mechanisms: take small steps, assume surprises and ensure reversibility. Setting up an area inside the APS with a sandbox to model, test and harden policy ideas as well as tech solutions will help. It may also help break down the too-big-to-fail monoliths that have been created through policy fiddling and tech accretion over the decades in both the ATO and DHS, and help introduce new approaches to policy development and technology design.

Fourth, broaden and strengthen the Commission’s recommendations aimed at greater transparency and accountability, especially when it comes to system design, procurement and modernisation. Too much is hidden under the veil of secrecy provisions when in reality it is more for convenience and to avoid scrutiny. To fully embrace concepts such as security-by-design, algorithmic fairness and privacy-by-design, technological systems and algorithms need to be exposed to trusted independent externals for testing.

Last, people. It is not simply technical skills—which the APS desperately needs—that matter. A modern, forward-looking, dynamic institution should favour a set of character traits, such as non-conformity, curiosity, independent-mindedness, and forthrightness, that were rarely apparent in the Robodebt saga. Significant experience outside both Canberra and bureaucratic institutions, including the military, should be encouraged.

Many of these changes point to broader structural reform. Because a fundamental institution of democracy has been allowed to wither, we now find ourselves fixing the plane while in flight.  But we have little choice.  Let’s not find ourselves in the same situation again.