
Digital Thread, Safety Culture, and the Lessons of the 737 MAX

Michael Finocchiaro
Last updated: May 15, 2026

Key Takeaways

  • The digital thread is only as trustworthy as the organization maintaining it
  • Organizational safety culture is the biggest variable in PLM-enabled safety management
  • Formal change management systems only work if organizational incentives reward their honest use
  • Immutable audit trails in PLM are a necessary but not sufficient condition for product safety

Short Answer

The digital thread can connect product data from design through service — but organizational safety culture determines whether that thread is maintained honestly. The 737 MAX showed that technical PLM capability cannot compensate for an organization that suppresses safety signals, fragments communication, or prioritizes schedule over data integrity.


The Digital Thread and the Question It Cannot Answer

The digital thread promises something straightforward and enormously valuable: a connected record of a product's journey from concept through end of life, accessible to any authorized stakeholder at any point in the lifecycle.

It is a promise PLM systems can technically fulfill. The connective tissue exists — APIs, integration platforms, data models — to link design intent to engineering specifications, to manufacturing instructions, to service records, to field performance data.

What no technology can guarantee is whether the data flowing through that thread is honest.

That question — whether an organization is willing and able to maintain the digital thread with integrity — is determined by organizational safety culture. And the Boeing 737 MAX remains the most important case study we have in what happens when the answer is no.


What the 737 MAX Investigation Revealed

The 737 MAX accidents that killed 346 people in 2018 and 2019 produced investigations that documented organizational failures in remarkable detail. The Joint Authorities Technical Review, the House Transportation Committee report, and the Department of Justice settlement all point to the same pattern: an organizational culture that discouraged the honest communication of safety concerns through formal channels.

From a PLM and digital thread perspective, the failures were specific:

Incomplete engineering change documentation: The MCAS system was substantially redesigned late in the certification program, with its authority expanded far beyond what the original safety analysis had assessed. That change was not fully documented in a way that triggered re-evaluation of the safety case. The formal engineering change process existed; it was not fully used.

Siloed safety analysis: The safety analysis for MCAS was performed under assumptions — specifically, assumptions about pilot response time and system behavior — that were not connected to operational data or simulator testing that might have challenged them. The data existed in the enterprise; the thread connecting it to the safety case did not.

Informal resolution of safety concerns: Concerns raised by Boeing engineers about MCAS behavior were, according to the investigations, resolved through informal channels and management escalation rather than through the formal engineering issue tracking and resolution process. The formal process existed; it was routed around.

Certification record disconnected from operational reality: The FAA certification was based on documentation that, in critical respects, did not accurately represent the system that was delivered. The certification record and the engineering reality diverged — and the digital thread connecting them was incomplete.

Each of these is a digital thread failure. And each of them was enabled by organizational culture, not by technical capability.


Safety Culture as a PLM Variable

Safety culture is not typically in the scope of a PLM implementation. It should be.

An organization with strong safety culture uses PLM as it is designed to be used: as an authoritative record of what was decided, why, what concerns were raised, what alternatives were considered, and what was verified. Engineers log safety concerns as formal issues. Design reviews capture dissenting opinions. Change authorization records reflect actual impact analyses rather than schedule-optimized summaries.

An organization with weak safety culture uses PLM as a compliance artifact: a record that proves work was done, not a record that captures what actually happened. Concerns are raised informally and resolved without creating a record. Change records are created after decisions have already been made informally. Safety analyses are documented in the format required by regulation rather than in the depth required by reality.

The difference is invisible from the outside — both organizations have PLM systems, change management processes, and certification documentation. The difference becomes visible when something goes wrong and the investigation begins to ask: what did the people involved actually know, when did they know it, and what did they do with that knowledge?

An honest digital thread answers those questions. A compliance digital thread does not.


PLM Practices That Support Safety Culture

Organizations that want to use PLM to reinforce rather than merely document safety culture have a specific set of practices available:

Formal issue tracking for safety concerns: Require that any concern raised in a design review, safety analysis, or test result be logged as a formal issue in PLM with a tracked disposition. This creates an audit trail that makes it visible if concerns are being raised but not substantively addressed.

Requirements traceability as a gate condition: Require that every system requirement trace to a verified test result before a lifecycle gate can be passed. Requirements with open traces are visible and must be resolved or formally dispositioned — they cannot be ignored.

Immutable audit trails: Configure PLM so that engineering change records cannot be modified after approval, and so that every change to a safety-relevant record creates a new version rather than overwriting the old one. This makes the actual history of the design visible to investigators and auditors.
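One way to implement the new-version-per-change rule is an append-only store. This is a minimal sketch under that assumption, not a description of how any particular PLM product works:

```python
# Minimal append-only sketch: approved records are never edited in place;
# every modification appends a new version, preserving the full history.
class ChangeRecordStore:
    def __init__(self):
        self._versions: dict[str, list[dict]] = {}

    def save(self, record_id: str, content: dict) -> int:
        """Append a new version; return its 1-based version number."""
        history = self._versions.setdefault(record_id, [])
        history.append(dict(content))   # copy; earlier versions untouched
        return len(history)

    def history(self, record_id: str) -> list[dict]:
        """Every version, oldest first -- the record as it actually evolved."""
        return [dict(v) for v in self._versions[record_id]]

    def current(self, record_id: str) -> dict:
        return dict(self._versions[record_id][-1])
```

Because `save` only ever appends, no sequence of later edits can erase an earlier approved state; `history` gives an investigator the design's actual trajectory.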

Separation of as-designed and as-certified records: Maintain explicit linkage between the engineering record and the certification record, and require formal change authorization when they diverge. The 737 MAX failures in part resulted from a certification record that did not track late-stage engineering changes.

Deviation and waiver tracking as first-class records: Treat every deviation from requirements and every waiver of standards as a first-class PLM record — not a footnote. Organizations with weak safety culture tend to treat deviations as administrative formalities; organizations with strong safety culture treat them as signals that the safety case needs to be re-examined.
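The first-class-record idea can be made concrete by coupling deviation entry to safety-case review. Both record shapes below are hypothetical, meant only to show the coupling:

```python
# Sketch: logging a deviation automatically opens a safety-case review
# item, so a waiver can never be closed as a purely administrative act.
def record_deviation(deviations: list[dict], reviews: list[dict],
                     req_id: str, reason: str) -> dict:
    dev = {"req": req_id, "reason": reason, "status": "open"}
    deviations.append(dev)
    reviews.append({"trigger": req_id,
                    "task": "re-examine safety case for " + req_id})
    return dev
```

The design point is that the review item is created by the same transaction that records the deviation, so the signal cannot be skipped by an organization treating the waiver as a formality.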


What Organizational Culture Determines

Technology can make honesty easier. It cannot make it happen.

PLM systems can be configured to require documentation at every gate, flag concerns that go unaddressed, and create immutable records of every decision. All of that capability is rendered ineffective by an organizational culture that treats those requirements as obstacles to be minimized, finds workarounds for them, and generates the required compliance artifacts as efficiently as possible.

The 737 MAX investigation described engineers who were aware of concerns and did not escalate them through formal channels because they did not believe escalation would be safe or effective. That is a safety culture problem. No PLM configuration solves it.

What PLM can do is make it harder to hide. Organizations that configure their digital thread to be complete and honest — requiring formal records for concerns, deviations, and resolutions; making those records visible to safety oversight; creating audit trails that investigators can actually use — are building systems that reinforce the culture they want rather than accommodating the culture they have.

But the culture itself must be built by leadership, through the incentive structures and behavioral expectations that determine what engineers actually do when they face a tradeoff between schedule and safety.

See also: PLM Implementation and Organizational Change Management for the organizational disciplines that enable PLM to function as designed.


Implications for PLM Implementation

For organizations implementing or upgrading PLM in safety-critical industries, the 737 MAX case has direct implications:

Scope your change management to capture informal decisions: If your PLM change management process only captures formal ECOs, you are missing the informal decisions that often drive the most consequential changes. Design PLM workflows to capture the full decision landscape, not just the formally routed one.

Build safety concern visibility into the system: Issue tracking, open items lists, and safety action items should be first-class objects in PLM — not managed in separate systems where they are invisible to the PLM record and to downstream safety oversight.

Treat traceability gaps as safety findings: A requirements trace that terminates at an analysis rather than a test result is a gap. A safety analysis assumption that is not connected to the data that should validate it is a gap. Configure PLM to surface these gaps rather than let them pass silently.

Design for investigators, not just users: PLM records are eventually reviewed by people who were not in the room when the decisions were made — service engineers, quality investigators, regulators, legal teams. Design the information architecture so that the record is intelligible to someone who does not already know the story.


Summary

The digital thread is only as trustworthy as the organization that maintains it. The Boeing 737 MAX demonstrated, at enormous cost, that organizational safety culture is the variable that determines whether technically capable PLM systems function as intended.

PLM configuration can reinforce safety culture: by requiring formal records for concerns and deviations, creating immutable audit trails, enforcing requirements traceability, and making safety-relevant records visible to oversight. But configuration cannot substitute for the organizational commitment to using those systems honestly.

For product organizations in safety-critical industries, the most important PLM question is not which system to implement. It is whether the organization has the safety culture to use it with integrity — and what organizational work needs to happen before the technology investment can deliver its intended value.




Cite this article

Finocchiaro, Michael. “Digital Thread, Safety Culture, and the Lessons of the 737 MAX.” DemystifyingPLM, May 15, 2026, https://www.demystifyingplm.com/digital-thread-safety-culture.


Michael Finocchiaro

PLM industry analyst · 35+ years at IBM, HP, PTC, Dassault Systèmes

Firsthand knowledge of the evolution from early 3D modeling kernels to today's cloud-native platforms and agentic AI — the history, strategy, and future of PLM.