The recent “hacking” incident involving Sri Lanka’s Ministry of Finance and the Treasury cannot be treated as a narrow technical glitch. It raises deeper questions about how public money is managed, who is accountable, and whether systems are designed to prevent—or enable—failure. When such an event occurs at the core of public finance, it does not remain an isolated IT issue. It becomes a test of institutional credibility. At stake is not only money, but trust—the invisible asset on which an economy rests.

Public communication around the incident has not helped. Instead of reducing doubt, it has widened uncertainty. When explanations are partial, delayed, or inconsistent, they create space for speculation. Markets dislike ambiguity. So do citizens. In the absence of clear facts, narratives compete, confidence weakens, and the perceived risk of the system rises. In this sense, poor communication can amplify the damage far beyond the original event.

This article therefore looks beyond the label of a “cyberattack.” It treats the incident as a system-level failure that sits at the intersection of technology, governance, and accountability. The goal is to identify what likely went wrong, what global experience already tells us, and what policy actions are necessary—not only to find the truth, but to restore confidence and prevent recurrence.

What is a “hacking” incident? A simple view

The term “hacker” often suggests a highly skilled outsider breaking into a system. In practice, most breaches are less dramatic and more mundane. They exploit weaknesses that already exist: unpatched software, weak passwords, poor access controls, or careless user behaviour such as phishing. These are not rare events. They are predictable outcomes of weak system hygiene.

Equally important is the role of internal access. Many serious incidents involve “insider access”: legitimate credentials used improperly, or privileges that are too broad and poorly monitored. Such access is harder to detect because it appears normal. It often bypasses external defences entirely.

For this reason, the key question is not simply “Who entered the system?” but “How was entry allowed?” That question shifts attention from the attacker to the system. It forces us to examine design, controls, and oversight. In other words, it moves the discussion from a technical story to a governance story.

Deeper questions raised by this incident

When a transaction of USD 2.5 million is involved, the issue cannot be reduced to a single breach. Financial systems—especially those handling public funds—are built with layers of control: approvals, audit trails, and separation of duties. These controls are meant to prevent exactly this kind of outcome. If a large transfer can occur despite them, then the controls either failed, were bypassed, or were never properly enforced.

This leads to a more important question: How was such an event permitted within the system? Was it a one-off technical error? A pattern of weak controls? Or a breakdown in oversight? Each possibility points to a different kind of failure, but all point to the same conclusion—this is not a simple incident.

Trust is the operating system of any economy. Once trust is weakened, the effects spread quickly. Citizens begin to question institutions. Investors reassess risk. Lenders demand higher returns. What starts as a technical incident can evolve into a credibility problem. And credibility, once lost, is difficult and costly to rebuild.

Concerns are compounded when responses are delayed or incomplete. If critical system access was known but not acted upon, or if disclosure to responsible authorities was postponed, the issue becomes one of governance. Timely reporting is not a formality; it is a control mechanism. When it fails, the system loses its ability to correct itself.

Key Arguments

1. Erosion of Institutional Trust

Trust in public financial institutions underpins economic stability. When information is unclear or inconsistent, confidence declines. This affects expectations, investment decisions, and the willingness to engage with the system. Over time, weak trust translates into weaker economic performance.

2. Information Asymmetry and Narrative Control

When full information is not shared, a gap emerges between what authorities know and what the public understands. This asymmetry allows simplified labels—such as “hacker”—to dominate the narrative. Complex issues become reduced to convenient explanations. The cost is delayed truth and prolonged uncertainty.

3. System Reality

Large-value transactions typically require multiple approvals, verifications, and recorded trails. If such a system allows a questionable transfer, it signals a deeper problem. Either controls are ineffective, monitoring is inadequate, or responsibilities are not clearly enforced. In any case, it points to a system weakness, not an isolated glitch.

4. Governance Over Technology

Most major cyber incidents succeed not because technology is absent, but because governance is weak. Accountability is unclear. Oversight is fragmented. Operational discipline is inconsistent. Without these, even advanced systems fail. The central lesson is simple: technology cannot compensate for poor governance.

International lessons

Global experience reinforces these points. Repeated incidents across different countries show a consistent pattern: the root cause is rarely technology alone.

The Bangladesh Bank heist demonstrated how weak internal controls can enable large unauthorised transfers through international payment systems. Monitoring and verification failures were as important as any technical breach.

The Banco de Chile incident highlighted the importance of real-time monitoring and rapid response. Delayed detection allowed attackers to move funds before controls could react.

The Pemex ransomware attack showed that preparedness matters as much as prevention. Without clear response plans and leadership accountability, organisations struggle to contain damage once an incident occurs.

These cases are not isolated. They are lessons. They show that effective protection requires a combination of sound technology and strong governance. The critical question, therefore, is not whether such incidents happen elsewhere—they do—but whether those lessons have been learned and applied.

Real consequences

The visible loss in a case like this is financial. The real cost is broader.

First, public trust declines. When institutions appear uncertain or opaque, confidence erodes. This weakens the effectiveness of policy and administration.

Second, foreign investment becomes more cautious. Investors prioritise stability and transparency. Perceived risk rises when systems appear unreliable.

Third, borrowing costs increase. International markets price risk. Lower credibility leads to higher premiums, making financing more expensive.

Fourth, financial stability can be affected. Doubts about institutions can influence liquidity, flows, and overall system confidence.

Over time, these effects accumulate. Growth slows. Development is constrained. The long-term cost exceeds the immediate loss.

Policy Response

A narrow technical fix will not suffice. The response must be comprehensive.

An independent investigation is essential. It must be credible, free from interference, and supported by both local and international expertise. The objective is to establish facts, not narratives.

A full forensic analysis is required. System logs, access records, and transaction trails must be examined in detail. The aim is to understand both the breach and the conditions that enabled it.

Transparent communication is critical. Regular updates and a final public report help rebuild trust. Silence or delay does the opposite.

Accountability must be clear. Where negligence, misconduct, or failure is identified, appropriate legal action must follow. Responsibility should not be diffused.

System reforms are necessary. Stronger controls—such as dual authorisation, multi-factor authentication, and real-time monitoring—should be standard, not optional.
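The logic of dual authorisation and separation of duties can be made concrete with a short sketch. This is purely illustrative: the names (`Transfer`, `APPROVAL_THRESHOLD`, the officer identifiers) are hypothetical and do not describe any actual Treasury system; the point is only that such rules are simple to enforce in software when governance demands them.

```python
# Illustrative sketch, not a real Treasury system: a minimal
# dual-authorisation rule for large transfers, with separation of
# duties (the initiator may never approve their own transfer).
from dataclasses import dataclass, field

APPROVAL_THRESHOLD = 1_000_000  # hypothetical: above this, two approvers needed


@dataclass
class Transfer:
    amount: float
    initiator: str
    approvers: set = field(default_factory=set)

    def approve(self, officer: str) -> None:
        # Separation of duties: the person who created the transfer
        # cannot also sign off on it.
        if officer == self.initiator:
            raise PermissionError("initiator cannot approve own transfer")
        self.approvers.add(officer)

    def can_execute(self) -> bool:
        # Dual authorisation: large transfers need two distinct approvers.
        required = 2 if self.amount > APPROVAL_THRESHOLD else 1
        return len(self.approvers) >= required


t = Transfer(amount=2_500_000, initiator="officer_a")
t.approve("officer_b")
print(t.can_execute())  # one approval is not enough above the threshold: False
t.approve("officer_c")
print(t.can_execute())  # two independent approvals: True
```

A control like this fails not because it is hard to build, but because it is disabled, bypassed, or left unmonitored—which is exactly why the reforms above are a governance question, not a technology question.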

Cyber security capability must be strengthened. Continuous monitoring, training, and regular risk assessments are essential.

Finally, legal and institutional frameworks need reinforcement. Transparency laws, digital governance standards, and protection for whistleblowers can improve long-term resilience.

Can government remain silent?

Silence is not neutral. It increases uncertainty.

When information is withheld or delayed, speculation fills the gap. Markets react. Confidence weakens. Trust erodes. In public finance, this is costly.

The response must be timely and clear. Facts should be disclosed. Responsibility should be assigned. Weaknesses should be corrected. The process must be seen as fair and independent.

If these steps are not taken, the issue will not remain contained. What appears to be a USD 2.5 million problem can evolve into a wider crisis of confidence. And once confidence is damaged, the cost of repair is far greater than the cost of prevention.

Strong systems depend on capable leadership and sound institutions. Positions of responsibility must be matched by competence and experience. Where gaps exist, they must be addressed.

In the end, the question is simple: will this incident be treated as a minor event to be managed, or as a warning to be acted upon? The answer will determine not only accountability for the past, but the credibility of the system going forward.

By Prof. Ranjith Bandara