Cyber risk psychology
October 2018 | PROFESSIONAL INSIGHT | RISK MANAGEMENT
Financier Worldwide Magazine
October 2018 Issue
Psychological phenomena often give cyber predators an important edge over their prey. Some of the most important phenomena are aspiration-based risk taking, excessive optimism, overconfidence, confirmation bias and availability bias. These phenomena give rise to pitfalls that infect organisational processes.
Availability bias is especially pernicious. It involves overreliance on information that is easily recalled and, with respect to cyber risk, generates a ‘see no evil’ state of mind. A good way to combat availability bias is to use case studies to ‘see evil’ that impacted others, with the evil comprising both the basic threat and the underlying psychological vulnerabilities. The point of the case study approach is to analyse how the types of risk described in the cases ‘could happen here’, within the organisation’s own processes, and, if such risks are identified, how to respond effectively.
One way of seeing the power of the ‘process-pitfall’ approach is to trace the history of firms. If we examine the major oil spill in BP’s Alaska pipeline in 2006, which resulted from the firm’s underinvestment in maintenance, we can see that BP’s culture was problematic. Four years later, in 2010, the BP Deepwater Horizon explosion made front page headlines around the world, and a US presidential commission identified BP’s poor culture as a root cause. In 2008, the Ford Motor Company was trying to restore its profitability by improving its information sharing culture, especially around recognising and dealing with problems. By 2012, Ford had not only become profitable again, but was ranked number 14 on Fortune’s Global 500 ranking of the world’s most profitable companies.
Nortel Networks was a firm whose processes were infected by psychological pitfalls. The firm entered bankruptcy in 2009, having been the subject of a major cyber attack. During the ‘dotcom’ bubble, Nortel was a successful multinational telecommunications equipment manufacturer. In 2000, Nortel was Canada’s most valuable firm.
Aspiration-based risk taking is a psychological phenomenon in which an organisation’s risk appetite depends on how it views its current situation relative to the factors capturing the attention of its managers, especially its goals. The evidence suggests that risk appetite is high for firms whose executives set ambitious aspirations that have yet to be met. By contrast, risk appetite is much lower for firms whose executives judge either that they have already met their goals or that their firms might not survive.
With aspiration-based risk taking in mind, consider the following: in August 2000, Nortel’s market capitalisation peaked at $242.8bn. However, by September 2002, after the bursting of the dotcom bubble, the company’s market capitalisation had plummeted to $2.1bn. In the wake of the decline, the firm made two-thirds of its workforce redundant and slashed executive bonuses. Aspiration-based risk taking predicts that in such a situation, executives are prone to respond to declines by taking greater, and possibly imprudent, risks.
Whatever executives did, in the first quarter of 2003 Nortel achieved an unexpected return to profitability. The firm reported positive earnings, an achievement that resulted in $70m of bonuses for the firm’s top 43 managers. Unfortunately, those bonuses were the product of executives having manipulated earnings. The manipulation led to charges of accounting fraud, court proceedings against the firm’s CEO and the subsequent restatement of the firm’s financial statements. Most accounting scandals are directed from the top, and in Nortel’s case the top executives were the CEO Frank Dunn, the chief financial officer Douglas Beatty and the controller Michael Gollogly.
In April 2004, Nortel’s board fired Mr Dunn, Mr Beatty and Mr Gollogly, although a court later cleared Mr Dunn of the charges against him. Over the next 18 months, the firm hired and fired several CEOs before choosing Michael Zafirovski as a permanent appointment. In 2007, the financial press used the phrase ‘forceful optimism’ to describe Mr Zafirovski, a characterisation which suggests that the firm’s risk appetite continued to be driven by a focus on high aspirations.
As if the events of 2004 had not produced enough excitement, a Nortel employee uncovered a cyber breach at the firm. The discovery was serendipitous. A UK employee noticed that work documents irrelevant to the area of responsibility of a senior executive in Canada, Brian McFadden, had been downloaded from the company server under his name. An email exchange between the two revealed that Mr McFadden had not requested the download, suggesting a cyber breach. An internal investigation followed and revealed that hackers had obtained the passwords of seven top executives, including the former CEO, Mr Dunn. The infiltration had begun in 2000 and involved China-based internet addresses.
Just as BP had underinvested in the pipeline maintenance that would have guarded against an oil spill, key Nortel managers reacted to the cyber attack in a manner that could be described as tepid. They instructed that passwords be reset, performed minor monitoring and then judged that the hacking problem had been solved. As it happens, that judgment was in error: Nortel was in the midst of being the victim of a major cyber theft, one which conceivably led to the demise of the firm. To its credit, Nortel did alert the national police force, the Royal Canadian Mounted Police (RCMP), but the RCMP declined to launch a major investigation.
One of the main objectives of this article is to ask what psychological phenomena were at work, and what others might learn about cyber security from Nortel’s missteps.
Excessive optimism involves attaching too high a probability to a favourable outcome, relative to an unfavourable one. Certainly, the key decision makers at Nortel underestimated the probability of the unfavourable consequences from the cyber attack. Overconfidence about ability involves overestimating one’s own ability. Certainly, the key decision makers at Nortel overestimated their ability to manage the attack. Confirmation bias and motivated reasoning are psychological phenomena that involve people ignoring information that runs contrary to their views. As it happens, these two phenomena played crucial roles in how, after 2004, the cyber threat unfolded at the firm.
One person at Nortel, Brian Shields, was the main driver of the firm’s investigation into the cyber attack. In 2007, Mr Shields discussed his concerns with the firm’s vice-president of corporate security and its director of corporate and systems security. The two executives responded by suggesting that Mr Shields prepare an audit report, which he did. However, when he submitted the report in 2008, the two executives chose to sit on it rather than send it to senior executives and the board. This was BP-like behaviour, not Ford-like behaviour, and reflective of confirmation bias.
Undeterred, Mr Shields engaged a cyber expert to help him identify what Nortel’s own anti-malware specialist had not. In particular, they were able to identify locations in China from which the attack emanated. The two also concluded that the attack was the work of professionals who had utilised ‘rootkits’, malicious pieces of software designed to render the processes running on a device nearly invisible.
Mr Shields speculated that the purpose of the cyber attack was industrial espionage, designed to steal Nortel’s proprietary intellectual property. He further suspected that the attack was orchestrated by the Chinese firm Huawei Technologies Co. Ltd, which had, quite suddenly, emerged as a fierce competitor.
In 2008, Mr Shields found evidence that Mr Zafirovski’s computer activity was being monitored by the cyber thieves. However, his attempts to communicate his concerns to Mr Zafirovski were blocked by Nortel’s IT security manager, who argued that the CEO was preoccupied with other important issues.
Those other issues might well have been dealing with the effects of price competition from Huawei and other Chinese firms, as well as softening demand during the lead up to the global financial crisis. In early 2009, Nortel declared bankruptcy, and eventually sold its assets to Avaya Inc., Ciena Corp., Telefon AB L.M. Ericsson and Genband. Notably, the firm’s executives did not disclose to its buyers that some of those assets might have been compromised by hackers. Perhaps the firm’s executives were unaware, or perhaps they preferred not to know. Regardless, the sale of those assets made possible the continuation of a hack that began in 2000.
The point of case studies is to fight availability bias by making important information about cyber breaches available. It is even better for executives to engage in well-defined programmes in which they ask themselves whether ‘there, but for the grace of God, go I’ applies to them. Doing so means facing up to how their firms are affected by excessive optimism, overconfidence, confirmation bias and aspiration-based risk taking. Better yet is bringing in outside experts to provide independent voices that shift attention from high-aspiration goals to downside protection and security.
Hersh Shefrin is the Mario L. Belotti Professor of Finance at the Leavey School of Business, Santa Clara University. He can be contacted on +1 (408) 554 6893 or by email: email@example.com.
© Financier Worldwide