September 18, 2024

Brands, media owners and loyalty operators face difficult, fast decisions on all automated decision-making as Privacy Act forces policy – maybe business model – rewrite

There’s much more work to be done than simply writing a new privacy policy as new laws go before Parliament.

“Every company is a software company,” as Watts S. Humphrey once famously observed, but this is probably not what he had in mind. As brands bid to wring every ounce of profit from every dollar earned through the efficiency of automated decision-making that understands customers better than they understand themselves, they now face a new and potentially costly risk – penalties in the millions of dollars, and a regulatory regime that makes it easier for the market policemen to write you a ticket. What’s more, it’s not fully clear how broad the definition of what constitutes automated decision-making will be – which puts almost everyone on notice. Privacy specialists Mi3 spoke with stressed there’s much more work to be done than simply writing a new privacy policy. For starters, you have to adhere to it. That might mean costly and complicated business and system process changes that move at the speed of compliance – but in a market powered by the accelerant of generative AI.

What you need to know:

  • It’s time to take your privacy policy seriously. If you are using customer data to automate decisions, your customers need to be able to understand that from your privacy policy. If it’s not simply disclosed and clearly articulated, expect fines.
  • Those fines will come thicker and faster thanks to two new penalty tiers including “administrative” breaches with fines up to $330,000 that the regulators can levy without going to court.
  • There’s also a mid-tier level of fines that could cost you over $3m for breaches that don’t rise to the worst-case outcomes.
  • Then there’s the small matter of how customers react to all the new transparency, especially as they have new opportunities to pursue companies through a statutory tort, although there is no direct right to action – something privacy experts say previous governments have been loath to introduce.
  • Privacy consultants say brands they work with are already in breach of the old rules, let alone the new ones.
  • And don’t fall into the trap of believing it’s simply a case of updating the privacy policy; you also need to make sure that business processes reflect the policy, and that in turn could lead to system changes and additional costs.

Here’s what’s changing: If your software uses personal information to make significant decisions without human intervention about customers or prospects or consumers, you need to disclose that in your privacy policy. That’s the easy bit.

What’s more complicated and fraught is that you also need to ensure that your business processes reflect what you say in your policy, which may require your organisation to change the way its systems work, and how you collaborate with partners whose own lawyers probably want a quick word, or maybe a very long one with lots of warranties and sub-clauses.

Pull on the thread and the whole damn rug starts to unravel.

The privacy law grants your customer a right to action – basically, a valid reason to pursue legal proceedings based on a specific set of facts or circumstances that may have caused them harm or injury – something governments have been loath to agree to in the past, say privacy experts.

Automated decision-making (ADM) was one of 30-plus areas of privacy covered in the long-running review the government initiated before the legislative update, and one where it said its mind was made up and no further discussion was required. Many of the other items on the list have been pushed back to next year, but ADM made it through, which speaks to how it is viewed as a priority.

While definitions of the “significant harm” that a decision might visit upon a customer can already be discerned from Australian case law, for now there is no detailed definition of an automated decision in the legislation itself.

On the current reading of the legislation introduced to Parliament last week, it could be everything from those huge multimillion-dollar real-time decisioning ecosystems that CommBank, NAB and ANZ are building on the back of Pega’s software, through to something as simple as a JavaScript tag on a web page that triggers a decision.
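To make the breadth of that reading concrete, here is a minimal sketch of the simple end of the spectrum – a hypothetical on-page tag that sends behavioural signals to a scoring service and gates an offer with no human in the loop. The endpoint, fields and threshold are invented for illustration, not drawn from any vendor’s product.

```typescript
// Hypothetical sketch only: a lightweight on-page tag that makes an automated
// decision (show or withhold an offer) from behavioural signals.
// The endpoint, payload shape and threshold are illustrative assumptions.

interface VisitorSignal {
  visitorId: string;
  pagesViewed: number;
  batteryLevel: number | null; // e.g. read from the Battery Status API, where available
}

async function decideOffer(signal: VisitorSignal): Promise<"show_offer" | "suppress_offer"> {
  // Personal information leaves the page and a decision comes back with no
  // human intervention - arguably automated decision-making on a broad reading.
  const res = await fetch("https://example.com/score-visitor", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(signal),
  });
  const { score } = (await res.json()) as { score: number };
  return score > 0.7 ? "show_offer" : "suppress_offer";
}
```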

Industry leaders Mi3 spoke with offered a wide range of views on the practices – already common in business applications today – that could qualify as potentially causing significant harm: accepting or denying credit card applications, using loyalty programs to offer differential experiences (“Welcome to the President’s lounge, Madam”), accepting or rejecting a job application based on AI analysis of a CV, or surge pricing based on the user’s behavioural profile as defined by the data you hold, including how panicky they get when the battery on their phone turns from reassuring green to OMG-red.


Applicability

Richard Taylor, managing director of Digital Balance, a Melbourne-based DX agency, sees wide applicability across sectors given how prevalent automated decisioning already is. “Health, finance, insurance, even retail and ecommerce. That includes things like loyalty rewards or access to exclusive offers.”

The business process implications of the Act are wide-reaching. “Can you explain to your customers how the decision was made and what data was used to make that decision? For example, they could specifically ask: what was it about my loan history that led you to that decision?”
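One way to picture what answering that question takes in practice – a sketch under assumptions rather than anything prescribed by the Act – is to record the inputs and the rule or model version behind each automated decision, so a specific customer’s “why” can be reconstructed later. The field names and version label below are hypothetical.

```typescript
// Hypothetical sketch: capture enough context with each automated decision to
// answer a customer's "what led you to that decision?" question later.
// All field names and the model version string are illustrative assumptions.

interface DecisionRecord {
  decisionId: string;
  customerId: string;
  outcome: "approved" | "declined";
  inputsUsed: Record<string, unknown>; // the specific data points the decision relied on
  modelVersion: string;                // or rule-set version, so the logic can be replayed
  decidedAt: string;                   // ISO timestamp
}

function recordDecision(
  customerId: string,
  outcome: "approved" | "declined",
  inputsUsed: Record<string, unknown>,
): DecisionRecord {
  const record: DecisionRecord = {
    decisionId: crypto.randomUUID(),
    customerId,
    outcome,
    inputsUsed,
    modelVersion: "loan-rules-v3", // illustrative version label
    decidedAt: new Date().toISOString(),
  };
  // In practice this would be written to an audit store; here it is simply returned.
  return record;
}
```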

Taylor also sees a resource crunch, not only for brands but for regulators. “It’s going to be a lot easier to make a complaint, but at the same time the Privacy Commissioner has said their budgets have been reduced and their staffing has been reduced.”

For now, Taylor is cautious about the kind of work Digital Balance does around automated decisioning, preferring to focus on practical implementation matters for clients rather than matters of policy. “We want to stay away from those systems. We could say, yes you can use this engine, like Bupa was using Pega and Tealium combined to make the [decisioning] engines. We help them implement those systems, but we don’t actually help make the rules for those systems.”


Renewed scrutiny

According to Clayton Utz partner Dean Gerakiteys, “By implementing legislation like this – whether it’s the tort, or other aspects of the reforms – and having Australian end-users be very conscious now of their individual rights, it’s educating them to start asking questions they might not have been asking before.”

He said that where the law firm’s clients are active in customer-facing industries, “their teams are going to be concerned about what their customers are concerned about.”

“The more their customers know they’re getting these individual rights, or are getting things close to what they’re seeing overseas, and the more they think there are significant reforms, that changes that discourse.”

That will force businesses to start thinking about the implications ahead of time. “Not just because they’re going to be law, but because they’re good business,” he said. “That always seems to drive behaviours, even before legislative change.” 
