Binary Balance

Life on the digital knife edge

DTA assessment and what it means to the Australian Government

If you’ve spent any time working in IT within or around the Australian Government, you’ve probably heard of the Digital Service Standard, administered by the Digital Transformation Agency (DTA). But what does the Digital Service Standard really mean for those who work in or around government today?

A staff member of the DTA once told me that, to them, the Digital Service Standard represents permission for government to work in a new and different way (well, new and different to government, perhaps). In large part, it is permission for government to take inspiration from agile delivery principles and start-up culture in aid of providing better services to citizens. In the Australian Public Service, however, this is easier said than done: much of agile delivery and start-up culture is predicated on the assumption that the groups and organisations involved will be small and nimble, two adjectives that typically do not apply to government agencies. This was but one topic I discussed with DTA representatives on a field trip earlier this year to the DTA’s Canberra office, where we observed a Digital Service Standard assessment of a government service called BloodNet, managed by the National Blood Authority.

Information on how government services are assessed against the Digital Service Standard is available on the DTA website, but seeing an assessment in the flesh, so to speak, provided useful clarity about the practicalities of the process. The assessment ran for approximately four hours, which initially sounded to me like an excessive amount of time. As the DTA representative facilitating our observation explained, though, an assessment can cover all 13 criteria of the standard (depending on whether it is an Alpha or Beta assessment), plus an initial introduction to the service being assessed, so four hours fill up fairly easily; that was certainly the case on the day. The results of the assessment we observed can now be found on the DTA website, by the way.

My main takeaways from our observation were:

  • There is a reason the DTA puts “Understand user needs” as the very first criterion of the Digital Service Standard. I see it as the central, underpinning requirement: if your delivery team is not in some way actively engaging with real, live end users of your service, then you are not meeting the Digital Service Standard. Why does the service exist? What user need gave rise to its creation? Can the delivery team provide an “elevator pitch” that accurately and succinctly answers these questions? What has changed the most due to user feedback (e.g. initial assumptions and design)?
  • Show, don’t just tell – showing a prototype of the service is well regarded, and evidence of the work done to meet the Digital Service Standard (user stories and/or user personae based on consultation with real-world users, user pathways through the service under development, and so on) will go a long way towards satisfying its requirements.
  • As far as the DTA is concerned, the Digital Service Standard applies to the public-facing portions of government services; the DTA is not interested in the inward-facing portions of government IT solutions. However, there is nothing stopping individual government agencies from applying similar principles to inward-facing components.
  • A “mock” assessment session may be useful to a delivery team before the real assessment. It can help the team better understand how an assessment is conducted, what will be covered and what they need to do for the process to add value to their project.

Agile in name only?

But what is government really trying to do here? Is it paying lip service to an imposed standard, or is it trying to meet the spirit of that standard? I would hope the latter, and so I am particularly interested in how government can be truly agile. As mentioned, with the introduction of the Digital Service Standard, government now has permission to be so. But permission alone does not an agile environment make; culture change must follow. The Digital Service Standard alludes to agile concepts across a number of its criteria, with Criterion 3 being the most explicit. I see this as the single most challenging aspect of the Digital Service Standard for government to adopt, and simultaneously the most valuable if that adoption can be achieved.

I recently attended a talk by Pat Reed, agile coach extraordinaire, in which she discussed a number of points relevant to the adoption of agile processes and user-centric design (in both large and small organisations):

  • Feedback loops that actually result in change help maintain a nimble and productive environment.
  • Apply “first principle” thinking, i.e. do your own thinking and focus on the essence of a problem.
  • Are we delighting our users? Granted, “delighting” may be a strong word to apply in some government scenarios where users may have no choice but to interact with a given government service, but the sentiment is no less relevant. To delight users, we must know who they are. It’s also important to note that the person or persons funding a project are stakeholders, but typically not service users.
  • Understand what success looks like to the users and why it looks how it does. How does the user define value?
  • How are we doing against our commitments to our users? Are we creating value for our users, or just waste? Waste here means everything a project generates that does not directly lead to delivering value into the hands of a user; a general rule of thumb is that waste is anything that does not end up in production: reports, measurement of variance against planned forecasts, velocity figures and so on. Some amount of these outputs may be necessary – particularly in the public service – but they should be minimised.
  • Someone should take responsibility for impediments and resolve them; at the very least, impediments need to be clearly highlighted.
  • Failures should be recognised quickly and de-funded.
  • Re-frame challenges as opportunities – constraints can be a good thing, as the proudly bootstrapped (and well known within web development circles) American company Basecamp (previously 37signals) has opined for many years.
  • It is a skill to be comfortable on the edge of chaos.
  • Measurable outcomes are your friend but ensure you are measuring the right things, e.g. value delivered over time, quality and technical debt.
  • Uncertainty around effort estimates is greatest at the start of a project, and people are inherently bad at effort estimation. Making assumptions about effort estimates and then treating them as facts enables the delusion that accurate estimates are possible early in a project’s life cycle. One form of the previously mentioned waste comes from trying to establish accurate upfront effort estimates; all this truly accomplishes is imposing pointless stress on the staff involved. Change the question and ask instead: how much is the project owner prepared to invest? Then re-estimate as the cone of uncertainty narrows (a rough sketch of this idea follows this list). Read Software Estimation: Demystifying the Black Art by Steve McConnell, published by Microsoft Press (no, seriously: if you are interested in practical, applicable software estimation techniques that actually make sense, read this book – an excerpt is available online).
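
To make that estimation point concrete, here is a minimal Python sketch of re-estimating as the cone of uncertainty narrows. The phase names and range multipliers are the figures commonly cited from McConnell’s book, used here purely as illustrative assumptions rather than a prescription; substitute your own project’s phases and history.

```python
# A minimal sketch of re-estimating as the cone of uncertainty narrows.
# The multipliers are the illustrative figures commonly cited from
# McConnell's "Software Estimation" (an assumption here, not a prescription).

CONE_OF_UNCERTAINTY = [
    # (life-cycle phase, low multiplier, high multiplier)
    ("Initial concept",             0.25, 4.00),
    ("Approved product definition", 0.50, 2.00),
    ("Requirements complete",       0.67, 1.50),
    ("UI design complete",          0.80, 1.25),
    ("Detailed design complete",    0.90, 1.15),
]

def estimate_range(nominal: float, phase: str) -> tuple[float, float]:
    """Turn a single-point estimate into a (low, high) range for a phase."""
    for name, low, high in CONE_OF_UNCERTAINTY:
        if name == phase:
            return nominal * low, nominal * high
    raise ValueError(f"Unknown phase: {phase!r}")

if __name__ == "__main__":
    nominal_weeks = 40.0  # a hypothetical single-point estimate
    for phase, _, _ in CONE_OF_UNCERTAINTY:
        low, high = estimate_range(nominal_weeks, phase)
        print(f"{phase:28s} {low:5.1f} to {high:6.1f} weeks")
```

For a hypothetical 40-week single-point estimate, the plausible range tightens from roughly 10–160 weeks at initial concept to 36–46 weeks once detailed design is complete. That is the cone doing its work; demanding a single “accurate” number any earlier is theatre.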

Big agile?

Organisation size also inevitably comes into play when I consider agile delivery. I’ve worked in everything from small and medium-sized businesses up to the largest Australian Government departments, and it is self-evident to me that the larger an organisation grows, the less agile it becomes by default.

It is interesting, then, to theorise about how a large organisation like a government agency could best support agility and innovation.

WL Gore & Associates, Inc. is a US$3.1 billion American multinational with a fairly unusual culture, founded in 1958 by Wilbert Lee Gore and his wife Genevieve Walton Gore. The company specialises in products derived from fluoropolymers and is best known as the developer of waterproof, breathable Gore-Tex fabrics. Why is WL Gore & Associates relevant to agile, user-centric delivery and thus the Digital Service Standard? It is a large, successful organisation that has found a way to stay agile and continue to foster innovation despite its size. WL Gore & Associates has many intriguing practices, but the one I would like to highlight here is Wilbert Gore’s belief in the need to “divide so that you can multiply”. If I may quote from the previously linked article:

When units within WL Gore grow to around 200 people, they are usually split up. These small groups are organised in clusters or campuses ideally with a dozen or so sites in close enough proximity to permit knowledge synergies, but still intimate and separate enough to encourage ownership and identity. An accountant might complain that this creates duplication of costs; Gore believes those are more than offset by the benefits smallness brings.

Why does Gore believe this? I suggest it is because Gore has realised that beyond a certain organisational size, agility and innovation are inevitably subsumed and hamstrung by red tape, by onerous governance procedures, by illogical levels of risk aversion. If Australian Government agencies are ever to truly embody some of the most important principles of the Digital Service Standard, then they too must explore how they can find their own way towards “the benefits smallness brings”.

Get involved

The Digital Service Standard represents the first step in a long journey for the Australian Public Service. The opportunities afforded by this journey are many, and as IT professionals working with(in) the public service – delivering projects assessed against the Digital Service Standard and/or assessing services against the standard ourselves – we are the guides at the forefront of this journey.

The only thing necessary for the triumph of organisational inertia is that innovative people do nothing.

Got a question or comment? Hit me up on Twitter