Aftermath of the Optus Breach | information age


In the aftermath of the Optus hack, and given that Optus didn't (and still doesn't) do a great job of handling the incident's public relations, I want to reflect on the breach: list what we know, what we don't know, and what I have learned so far, and offer some recommendations.

First, my all-time favorite quote from Air Crash Investigation: “Incidents and breaches don’t happen by chance, they’re a chain of failing controls.”

Break any link in that chain and this data breach could have been avoided.

So what do we know about the Optus breach so far?

  • From the little information we have, it was not a sophisticated attack but a simple, almost direct one. Anyone with basic API knowledge could execute such an attack quite easily
  • It looks like the API had no rate limiting. I don't know what the API was intended for in the first place, but the lack of rate limiting certainly allowed the attacker to exfiltrate a large amount of data before being detected
  • It seems that the API, before being published on the Internet, did not undergo any penetration testing that could have easily detected the problem.
  • An API without authentication, in 2022, really?
  • It appears that Optus has little or no effective data classification procedures
  • From the published data, it also appears that Optus has a weak or ineffective data retention policy. There were too many leaked sensitive records that should have long since disappeared
  • It appears that Optus has little to no effective monitoring of their web infrastructure
  • It appears that Optus has not established proper data masking for sensitive data. I would bet this database is easily accessible within the Optus network
  • The suggested penalty under Australian privacy law may be less than what Optus currently spends on stationery
  • The Essential Eight would not have prevented this breach from occurring. Australian companies should treat the Essential Eight as the bare minimum, i.e. the floor rather than the ceiling of their security program
  • Regardless of how I feel about Optus, I wish this were a unique case, but it's not. From my experience and exposure, I know for a fact that many other private and public entities (local, state and federal government) have terrible security controls and data management, much worse than what Optus currently has
  • The regulatory requirements for data retention are convoluted, confusing and complex. Simply visit Public Record Office Victoria (PROV) and search for 'data processing' – you will find many retention requirements that contradict one another and confuse anyone who ventures there.
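We don't know how the exposed API was actually built, but the cheapest of the missing controls listed above, rate limiting, is straightforward to sketch. Below is a minimal, hypothetical token-bucket limiter in Python (the names, rates and capacities are my own illustrative assumptions, not Optus code); a check like this in front of an endpoint would have throttled bulk exfiltration long before millions of records left the network:

```python
import time

class TokenBucket:
    """Per-client token bucket: allows roughly `rate` requests per
    second, with bursts up to `capacity` (hypothetical parameters)."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # caller should reject the request (HTTP 429)

# One bucket per client identifier (e.g. API key or source IP).
buckets: dict[str, TokenBucket] = {}

def check_rate_limit(client_id: str) -> bool:
    bucket = buckets.setdefault(client_id, TokenBucket(rate=5, capacity=10))
    return bucket.allow()
```

A scraper hammering the endpoint in a tight loop gets its first burst through and is then rejected, turning a bulk download into a slow trickle that monitoring has a chance to notice.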

What we don’t know:

  • How long this portal has been available to the public. It could have been sitting there for years
  • If someone else detected the same portal and extracted this data without triggering an alert
  • Who within Optus and its large partner web ecosystem has had access to this data
  • If anyone at Optus has made an extract of this data into Excel for legitimate business purposes
  • How many replicas of the same sensitive customer data or the same portal lurk around Optus systems and networks
  • If there are other publicly accessible portals that could expose similar sensitive data or are connected to the same backend database
  • I'm fairly sure, though I can't be 100% certain, that someone at Optus detected and attempted to report this issue, only to be bogged down in organizational bureaucracy.

What I think the lessons learned from this are:

  • Payment gateways for processing sensitive credit card data have been around for years now. I don't understand why equivalent gateways for personal information aren't a thing
  • There should be a unified cybersecurity standard or best practice that Australian businesses must adhere to. It would apply to companies above a certain size or turnover, or to any company that processes sensitive data, regardless of size or turnover
  • The penalties imposed on large companies like Optus should be proportional to the number of records exposed in the breach. For example, IBM's Data Breach Calculator estimates that, on average, a breach costs organizations $140 per record. In my opinion, companies should be subject to a penalty of an equivalent amount per leaked record
  • Data processing and classification for any entity, large or small, that stores or processes sensitive data should be mandatory
  • Data retention policies should be simple, clear and mandatory. Companies cannot and should not store sensitive data indefinitely "for historical or marketing purposes". Sensitive records are like cardboard boxes in your garage: once they have served their purpose they are a fire hazard and should be disposed of.
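To make the last two recommendations concrete, here is a hypothetical Python sketch of data masking and a retention check (the field names and the one-year window are illustrative assumptions on my part, not a real legal retention schedule):

```python
from datetime import date, timedelta

# Assumed retention window for identity documents once verified;
# the real figure depends on the applicable regulation.
RETENTION = timedelta(days=365)

def mask_licence(number: str) -> str:
    """Mask all but the last three characters of a licence number."""
    return "*" * (len(number) - 3) + number[-3:]

def apply_retention(records: list[dict], today: date) -> list[dict]:
    """Drop records whose verification date is past the retention window."""
    return [r for r in records if today - r["verified_on"] <= RETENTION]

records = [
    {"licence": "123456789", "verified_on": date(2020, 1, 15)},  # stale
    {"licence": "987654321", "verified_on": date(2022, 6, 1)},   # current
]

kept = apply_retention(records, today=date(2022, 9, 30))
masked = [mask_licence(r["licence"]) for r in kept]
```

The point is not the twenty lines of code: it is that masking and retention are cheap, mechanical controls, so there is no excuse for full licence numbers from years-old verifications sitting in a production database.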

Final Thoughts

While this breach is the sum of multiple failing controls and human error, it unfortunately won't be the last. I strongly believe that organizations, large or small, need to adopt a best practice that gives them layered controls to detect, protect against and monitor events and incidents, and so prevent attacks like this from occurring.

Louay Ghashash is Chair of the ACS Cybersecurity Committee. He has over 22 years of information security experience across multiple industries and has served as Chief Information Security Officer (CISO) in a number of non-profit, retail and ISP organizations. Louay has a strong track record of providing security advice to senior executives and boards.