By Sean Hojnacki
Despite our hard-won evolutionary gains, humans struggle to assess risk accurately. That stubborn truth stood out to me in two tragic tales spaced decades apart: the fatal Lake Tahoe avalanche in February 2026 and the 1999 plane crash that claimed John F. Kennedy Jr., his wife Carolyn, and her sister Lauren.
Whether you watched Ryan Murphy’s American Love Story, which stretched nine vibey episodes out of JFK Jr. and Carolyn’s star-crossed relationship, or you remember the headlines, or you learned of the story more recently, there are lessons we can all take from their ending.
All too often, our brains betray us; we struggle to admit it and adjust. That dynamic also seems to have contributed to the deadliest avalanche in modern California history, which killed nine experienced backcountry skiers.
By acknowledging the ways our cognitive biases cloud our judgment, especially during a crisis, we can take steps to counter that with actionable information and avoid unnecessary consequences.
Get-there-itis and the Graveyard Spiral
Also known as plan continuation bias, “get-there-itis” describes our tendency to ignore risks and persist with a flawed plan. It’s a cousin to the sunk cost fallacy, and it can lead to dire consequences.
JFK Jr. didn’t crash because of a mechanical failure. He succumbed to what NTSB investigators cited as “spatial disorientation.” On a foggy night over water with no visible horizon, flying a plane he had little experience in and lacking an instrument rating, JFK Jr. was likely misled by his own inner ear. The phenomenon is known as “the leans”: pilots falsely sense that they’re banking, “correct” out of level flight into an actual banked turn, then tighten that turn as they try to recover, gaining airspeed and descending faster in a graveyard spiral.
They’d also departed later than planned and flown into challenging conditions, but he’d piloted that route before and felt confident he could do it again. Those blind spots, perceptual and cognitive alike, resulted in tragedy.
FACETS of Flawed Decision-Making
The New York Times highlighted a similar phenomenon in the tragedy at Perry’s Peak, which occurred on a day when the Sierra Avalanche Center advised that “the chance of human-caused avalanches had risen from likely to very likely.”
Unfortunately, safety experts have found that “accidents have as much to do with failures in human decisions as they do with failures in snow layers.” Despite decades of improved forecasting and more sophisticated safety equipment, avalanche fatalities have not decreased. That’s why education efforts have shifted focus to the human factors that lead people to take unnecessary, unintended risks.
We tend to normalize risk-taking, especially when we haven’t yet experienced negative consequences, when we’re familiar with our surroundings, or when we’re following an “expert” leader. Group dynamics also play a significant role: avalanche research shows that risk grows substantially in groups of six to ten, owing to an illusion of safety in numbers and a tacit competitiveness that quells dissent and pushes the boundaries of acceptable risk.
Avalanche experts have summarized these common heuristic traps with the shorthand FACETS: Familiarity, seeking Acceptance, Consistency with a stated goal, the Expert halo, first Tracks (scarcity), and Social facilitation.
These dynamics can be present in business and communications settings as well. Consider the pressures of social facilitation and acceptance-seeking when dealing with a board or executive committee, the scarcity pressure of a time-sensitive opportunity, or the expert halo around a company founder or seasoned litigator.
Stress, Bias, and Unwarranted Risks
The NTSB report also cited the Aeronautical Information Manual, which notes that “stress from everyday living can impair performance, often in very subtle ways,” which can lead to taking “unwarranted risks.” Alas, we all experience those stresses, which makes us all susceptible to such risks.
As psychologist and backcountry skier Sara Boilen framed the tendency to normalize risk-taking: “It’s very hard to avoid. … You can creep past a red line you would never intentionally step across.”
Even in everyday circumstances, we often make consequential decisions through a fog of cognitive bias. In an age of accelerating AI adoption, consider automation bias, which acts as a digital “expert halo”: we ascribe too much authority to automated systems, trusting algorithmic or LLM-generated output over our own judgment even when that output contradicts other evidence.
Judgment and Humility in Crisis Management
In law and communications alike, experienced professionals bring their judgment to bear on risk management and mitigation. Under duress, rationality can desert us, leading to a series of compounding errors. That’s when it’s more important than ever to gather input and make a clear-headed assessment.
During a reputational crisis, businesses can experience their own version of “the leans,” reacting to misperceptions and hastily overcorrecting, which only exacerbates the situation. A trusted communications team can serve as the instrument panel, collating information to guide the way forward.
Respecting these aviation and avalanche tragedies means learning their hardest lesson: the most dangerous condition in a crisis is overconfidence. Staying receptive to candid counsel is what keeps us from creeping past the red line or spiraling out of control.
