top of page
The  iluli by Mike Lamb logo. Click to return to the homepage
The iluli by Mike Lamb logo. Click to return to the homepage

Think Like the Enemy: Red Teaming for Strategic Advantage

If you’re hoping for a play-by-play of Sir Alex Ferguson’s tenure at Manchester United, you’ve come to the wrong place. The term “red teaming” has nothing to do with football. 

 

Instead, I’m talking about Bryce Hoffman’s 2017 book: Red Teaming: Transform Your Business by Thinking Like the Enemy, described as: "A revolutionary new way to make critical and contrarian thinking part of the planning process of any organisation, allowing companies to stress-test their strategies, flush out hidden threats and missed opportunities and avoid being sandbagged by competitors."

 

In other words: adopting an adversarial approach to rigorously challenge plans, policies and assumptions. 


A cartoon depiction of Thanos' gauntlet alongside the words "Red Teaming"

Sound familiar? Red teaming’s origins lie in the Roman Catholic Church, where "devil's advocates" were tasked with arguing against candidates for sainthood, regardless of their personal opinions. And before you even think it, no – we’re still not talking about the Red Devils. Or that Al Pacino / Keanu Reeves vehicle from the ‘90s, although it’s well worth a watch!

 

(Fun fact: In 2003, when writer Christopher Hitchens was asked to play the devil's advocate during Mother Teresa’s canonisation process, he jokingly remarked about the lack of payment, saying, "I can thus claim to be the only living person to have represented the Devil pro bono.")

 

Described as both a set of analytical tools and a mindset, red teaming is designed to help readers overcome the cognitive biases and mental blind spots that we all fall prey to when tackling complex problems; particularly as we keep one eye on the horizon, anticipating the next disruptor that could upend our industry…

 

Thinking like the enemy

 

When reflecting on past political, economic, or military failures, it doesn't take long to recognise familiar culprits: groupthink and complacency. Poor decisions often go unchallenged due to fear of authority, reluctance to deviate from peers, or simply because following orders and maintaining the status quo is the path of least resistance. These perennial human weaknesses are the focus of the book, which offers strategies to guard against them.

 

Hoffman, a strategic advisor and management consultant, explains: 

 

Thinking like the enemy is exactly what companies need to dispel groupthink and complacency and cope with a rapidly changing – and increasingly uncertain – world. 

Rather than taking the easy route or making decisions simply to fit in, red teaming helps us evaluate whether we are making the best choices in any given situation.


To acknowledge the need for red teaming is to acknowledge the fallibility of human decision-making:


Each of us, no matter how smart, or well-educated, or well-intentioned we may be, is unduly influenced by a dizzying array of cognitive biases and logistical fallacies that skew our decision-making and lead us in unintended directions.

The modern-day concept of red teaming – intensifying scrutiny on critical decisions – originated with the U.S. Army, as a response to, as Hoffman puts it, “how wrong they had been” in the bloody and catastrophic post-invasion experiences of Iraq and Afghanistan.


The military recognised that red teams must examine problems from multiple perspectives, and for this to work effectively, every red team member’s voice must be heard in a meaningful way. This can be challenging, especially within a hierarchical structure like the military.


An illustrated outline of military personnel advancing on a cityscape in the distance.

So, the red team’s job is to work together to come up with a better plan, right? Well, interestingly… no. A red team’s role is to make the existing plan better. Red teams should be “critical and contrarian,” but not necessarily right. The red team’s work isn’t designed to supplant the work of a company’s regular planning staff, but to be weighted alongside it.

 

The phrase "speaking truth to power" has become a cliché, but red teaming truly embodies this concept. It serves as a means to deliver the most honest assessments directly to those in positions of greatest authority, ensuring that crucial messages reach the top without being diluted or overlooked by middle leadership.

 

That’s all well and good for the army. But what about business?

 

Missed communication

 

In a June 2024 article for Forbes magazine, Hoffman explores a range of high-profile businesses who failed, speculating as to what went wrong and what an astute red teamer might have noticed. As you might expect, the behaviour of leadership was a good place to start… 

 

A workplace's culture is fundamental, and neglecting it can result in a toxic environment, high employee turnover, and diminished productivity.

 

Hoffman cites the cultural issues at Uber under former CEO Travis Kalanick as a case in point: 

 

Reports of harassment, discrimination, and unethical practices created a hostile work environment. These cultural problems not only damaged Uber’s reputation but also distracted the company from its core business operations, leading to leadership changes and significant internal restructuring.

 

Such troubling – and costly – issues could have been mitigated with insights from a red team.

 

Next up: communication, or lack thereof... Hoffman argues that the decline of Blackberry “can be partly attributed to poor internal communication and a lack of clear strategic direction.” When Apple and Samsung introduced groundbreaking new devices, “Blackberry struggled to communicate a compelling vision and adapt to the changing market dynamics. This miscommunication and strategic disarray led to the company's diminished market presence.” 


A red team could have alerted Blackberry to the threat posed by the sleek, intuitive iPhone – assuming, of course, their fingers weren’t too sore from typing on those tiny Blackberry keys!

 

In addition to miscommunication, Blackberry’s fatal flaw was their inability to innovate at a pivotal moment. But, as Hoffman notes, they weren’t alone: 

 

Companies that fail to innovate often find themselves outpaced by more agile competitors. Blockbuster’s demise illustrates this point. Despite having the opportunity to buy Netflix for $50 million in 2000, Blockbuster failed to recognize the potential of the online streaming model.

A cartoon image of a Blockbuster video store, with a Netflix aquisition contract alongside.

 

Sins of commission and omission


In their Harvard Business Review article, leadership consultants Jack Zenger and Joseph Folkman suggest several ways leaders can learn to identify their weaknesses. They propose that top-level "failings" fall into two broad categories. The first, "sins of commission," involve direct mistakes such as making poor decisions, completing ineffective projects, or following misguided examples.

 

However, Zenger and Folkman argue that "sins of omission" can be even more damaging. These occur due to inaction, where leaders fail to take necessary steps. Common issues include a lack of strategic thinking, failure to take responsibility for outcomes, and insufficient effort in building strong relationships:

 

Because most fatal flaws are sins of omission, they are harder for us to see in ourselves. The result, after all, is not visible. It’s a deal that never happens, or a project that doesn’t exist. These leaders are simply not making things happen.

 

Red teaming COVID and AI

 

The need for decisive action was never more critical for political leaders than during the COVID-19 pandemic. As reported by The Guardian, Lady Hallett, who chaired the UK’s recent COVID inquiry, noted that the advice the UK Government received was too focused on biomedical science, at the expense of foreseeing social and economic consequences. To address this, she recommended the use of red teams “partly staffed with non-experts skilled in critical thinking and incisive challenge.” She also felt that being more alert to groupthink to “guard against the risks of conventional wisdom becoming embedded in the institutions responsible for emergency preparedness and resilience” would have led to better outcomes in the UK. Hallett concluded: 


Red teams should be used far more regularly and systematically across government advisory and decision-making structures relating to emergency preparedness and their views conveyed to ministers. In this way, ministers, rather than an internal consensus, will determine emergency preparedness, resilience and response policy. Governments and their institutions should be open to potentially unconventional thinking.

So, what should be the priorities for red teaming in the immediate future? One area that requires scrutiny is the growing influence of AI in our daily lives. Red teaming efforts have already uncovered numerous weaknesses, biases, and even potentially life-threatening concerns.

 

A chilling example of AI’s potential risks is highlighted in a piece by Madhumita Murgia in the Financial Times. A chemical engineering expert, part of a 50-member red team, used ChatGPT to propose “an entirely new nerve agent”. When another red team member explored how the technology could be employed in a cyber-attack on military systems, she was taken aback by the level of detail it provided, saying, “I wasn’t expecting it to be quite such a detailed how-to that I could fine-tune.”


Another red team member discovered a concerning bias in the AI's performance: the quality of information generated varied significantly depending on the input language. The so-called "hallucinations" – instances where the chatbot produces fabricated information – were notably worse when the model was tested in Farsi. This resulted in a higher proportion of "made-up names, numbers, and events" compared to when the model was used with English.


A cartoon of a computer (representing AI), reading books and digesting the information.

Given such examples, it’s not surprising that the Biden administration issued an executive order to establish safeguards for AI. As Andrew Burt highlights in the Harvard Business Review, one of the key requirements of the order is that certain high-risk generative AI models undergo red teaming: “a structured testing effort to find flaws and vulnerabilities in an AI system.”


Yet, red teaming a complex system like AI presents its own set of challenges. For instance, testing for bias often requires provoking the system with leading questions or problematic assumptions. This raises several questions: Who determines which questions are asked? Who asks the questions? And what constitutes an appropriate or proportional level of provocation in such an exercise? 

 

Conclusion


Given the very real mistakes, blind spots, and dangers we face, it’s no wonder that the term "red teaming" has transcended abstract business jargon and is now actively applied in military, healthcare, and tech sectors. Now, more than ever, it's crucial to empower these professional, sceptical voices and stress-test for complacency and groupthink.


And the next time the contrarian in your life frustrates you with their doubts and anxieties (“have you considered supporting a different football team?”), remind yourself how fortunate you are to have their bespoke red teaming service. In this digital age of finely tuned algorithms and biases that reinforce the familiar, these fresh human insights may soon prove even more precious.


Comments


bottom of page