When Volkswagen's emissions fraud was eventually traced back to its origins, the question everyone asked was how the engineers who built the defeat device had made such a decision. The answer was not that they were dishonest people. It was that they operated in a culture where the cost of reporting a problem that leadership did not want to hear was perceived as higher than the cost of engineering around it. The fraud was the rational choice given the incentive environment they inhabited. Twenty-eight billion dollars in fines, settlements, and recall costs later, the failure mode was clear: it was not a strategy problem, a technology problem, or an ethics problem in the abstract. It was an information flow problem produced by fear.

The same structure produced Wells Fargo's unauthorized account scandal. Frontline employees who could not meet sales quotas without creating accounts that customers had not requested made the same calculation — push back on the targets and face termination, or comply and participate in fraud. More than 5,300 employees were fired. The strategic information about what was actually happening inside the business existed throughout the organization. The climate made surfacing it irrational.

The mechanism that operates at lower stakes everywhere

These cases are dramatic enough to be examined as cautionary tales. But the mechanism they illustrate — the calculation that silence is safer than candor — operates continuously in every organization where the cost of speaking up is higher than the cost of staying quiet. It does not produce fraud at most organizations. It produces something less visible and equally destructive: the systematic suppression of the critical information that strategy depends on.

Execution problems that are known but not escalated. Assumptions in the strategic plan that implementation leads know are false but have not challenged. Stakeholder objections that circulate informally but never reach the people designing the solution. In each case, the knowledge exists. The climate makes its expression feel risky. The organization proceeds on information that the people closest to the problem have already invalidated in private.

A meta-analysis of 136 independent samples covering more than 22,000 individuals found a significant positive correlation between psychological safety and team innovation behavior, with a correlation of 0.43 at the individual level (M.T. Frazier, S. Fainshmidt, R.L. Klinger, A. Pezeshkan, and V. Vracheva, "Psychological Safety: A Meta-Analytic Review and Extension," Personnel Psychology, 2017).

The incentive audit most organizations avoid

The most reliable predictor of whether an organization has this problem is not its employee engagement score. It is whether its performance incentive structures contain mechanisms that inadvertently make concealment the rational choice. Wells Fargo's incentive structure was formal and explicit: it rewarded account creation directly. Volkswagen's was informal: a set of unwritten expectations about what engineers were required to achieve and what the consequences of failure would be.

In most organizations, the equivalent mechanisms are subtler. Leaders who visibly punish the messenger rather than the problem. Performance reviews that reward hitting targets without examining how. Escalation protocols that create career risk for the person who surfaces the bad news. Cultures where optimistic reporting is rewarded and honest assessment is characterized as "not a team player." None of these require malicious intent. They are the byproduct of organizational systems that were optimized for something else and produced a safety problem as a side effect.

Paul O'Neill's focus on worker safety at Alcoa — treating it as the one metric that could not be compromised — grew the company's market value from $3 billion to $27 billion during his tenure. Net income quintupled (Charles Duhigg, "The Power of Habit," Random House, 2012).

What Paul O'Neill understood that most leaders miss

When Paul O'Neill became CEO of Alcoa in 1987, investors were alarmed by his decision to make worker safety his single organizational priority. The company's profitability concerns seemed more urgent. O'Neill's reasoning was different: worker safety was a systems problem that required every person at every level of the organization to surface information quickly, accurately, and without fear of penalty for what that information revealed. Building the conditions for that honesty, he understood, would also build the conditions for the honest communication that financial performance required. He was right on both counts.

The relationship between psychological safety and organizational performance is not about creating a comfortable environment. It is about creating the conditions under which the information that strategy requires can actually reach the people who must act on it. Organizations that treat this as a cultural preference will continue to design strategies in one climate and execute them in another.