Key Takeaways
- Interoperability is an architecture decision, not a feature — it must be designed in before system selection, not retrofitted after go-live
- STEP AP242 and JT Open reduce format risk but do not eliminate the semantic gap between what vendors claim their exports contain and what downstream systems can actually consume
- Data fabric and API-first PLM platforms are reducing the cost of cross-system integration but require careful governance to avoid data drift
- Manufacturers that treat interoperability as a vendor problem will keep paying for it as a systems integration problem
Short Answer
Product data interoperability means ensuring that engineering, manufacturing, and supply chain systems can exchange accurate product data without manual re-entry or format conversion — and in 2026, failing to achieve it is a measurable competitive disadvantage.
- STEP AP242 has become the de facto neutral format for model-based engineering across aerospace, automotive, and defense programs
- Data fabric architectures treat interoperability as an infrastructure layer rather than a one-off integration project
- PLM integration platforms and middleware (Aras, Propel, and connectors from vendors such as Jitterbit) are absorbing what was previously custom connector work
- Vendor lock-in manifests most painfully at program boundaries — handoffs to suppliers, customers, and regulatory bodies
- Interoperability in the demo rarely survives contact with real multi-vendor production environments
The most expensive line item in most PLM programs is not the software license. It is the integration work required to make that software talk to everything else. PLM vendors have understood this for years, and their response has been a combination of increasingly sophisticated proprietary connectors, marketplace ecosystems, and interoperability claims that sound better on a slide than they perform in production. In 2026, the gap between what vendors promise and what manufacturers experience is closing — but not because the vendors fixed it. It is closing because the manufacturing industry ran out of patience and started building around the problem.
How We Got Here
Product data silos are not an accident. They are the natural result of two decades of best-of-breed tool selection without an interoperability strategy. A typical mid-market manufacturer in 2010 ran SolidWorks for mechanical design, a separate electrical CAD tool, an ERP system that predated PLM by a decade, and whatever the quality team had bought independently. Each system was selected for its functional merit. None of them were selected with the question "how will product data flow between all of these?" as a primary criterion.
The result was a patchwork of proprietary exports, manual re-entry steps, and one-off integration scripts maintained by engineers who had long since left the company. IGES files lost tolerance data. STEP exports missed assembly constraints. BOM exports required manual reformatting before ERP import. The cost of these gaps was real but diffuse — spread across hundreds of engineer-hours per quarter, never showing up as a single line item on a project postmortem.
The shift began in earnest with the aerospace and defense sector's adoption of model-based engineering (MBE) in the 2010s. When Boeing, Airbus, and their tier-1 suppliers began requiring that engineering deliverables include full PMI-annotated 3D models rather than 2D drawings, the interoperability problem became acute. A supplier with SolidWorks could not simply hand a CATIA-native model to a customer using NX. STEP AP242 — ratified in 2014 — emerged as the answer, encoding geometry, PMI, and assembly structure in a neutral format that any compliant system could consume.
The Current Landscape
By 2026, STEP AP242 has become the baseline requirement for new aerospace and defense programs. Automotive OEMs are adopting it for electric vehicle platform programs, where multi-tier supply chains involving hundreds of suppliers make proprietary format exchanges impractical. Industrial equipment manufacturers are following, driven by pressure from customers who need to maintain and operate equipment for 20–30 years after the original CAD tool has been superseded.
JT Open — the lightweight visualization format originally developed by Siemens and now an ISO standard (ISO 14306) — has become the parallel standard for visualization and digital mockup workflows. Where STEP AP242 carries the authoritative engineering data, JT carries the lightweight representation used for supply chain collaboration, digital twin visualization, and augmented reality applications. The two formats are complementary: STEP for accuracy, JT for accessibility.
Beyond neutral formats, the integration platform market has matured significantly. Middleware vendors like Jitterbit, MuleSoft, and Boomi now offer pre-built connectors for the major PLM platforms. Aras Innovator's open platform architecture has made it a preferred integration hub for manufacturers running multi-vendor PLM environments — its low-code configuration model lets IT teams build and maintain connectors without depending on vendor professional services. Propel PLM, built natively on Salesforce, inherits the Salesforce integration ecosystem, giving it connectivity to CRM, CPQ, and service cloud data that traditional PLM systems require custom work to access.
The emerging concept of a "Digital Thread Hub" — a central service that maintains the relationships between product data objects across systems without owning the data itself — is gaining traction in analyst discussions and early enterprise architectures. Rather than routing all product data through a single system of record, a Digital Thread Hub maintains a graph of what data exists where, enables cross-system queries, and enforces consistency rules at the relationship level. This is the data fabric pattern applied specifically to product lifecycle data.
For PLM integration practitioners, the practical implication is a shift from "which system owns the data?" to "how do we govern data relationships across systems?" That is a harder question to answer, but it is the right question.
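The hub pattern described above can be sketched in a few lines. Everything here is illustrative — `DigitalThreadHub`, the object keys, and the relation names are invented for this sketch, not any vendor's API — but it shows the essential property: the hub stores keys and relationships, never the CAD or BOM payloads themselves.

```python
from collections import defaultdict

class DigitalThreadHub:
    """Minimal relationship registry: records where product data lives
    and how objects relate, without owning the data itself."""

    def __init__(self):
        self.objects = {}              # key -> {"system": ..., "type": ...}
        self.links = defaultdict(set)  # key -> {(relation, other_key), ...}

    def register(self, key, system, obj_type):
        self.objects[key] = {"system": system, "type": obj_type}

    def link(self, src, relation, dst):
        # Consistency rule enforced at the relationship level: both
        # endpoints must be registered before they can be linked.
        if src not in self.objects or dst not in self.objects:
            raise ValueError("both objects must be registered before linking")
        self.links[src].add((relation, dst))

    def trace(self, key):
        """Cross-system query: what does this object point to, and in
        which system does each target live?"""
        return [(rel, dst, self.objects[dst]["system"])
                for rel, dst in sorted(self.links[key])]

hub = DigitalThreadHub()
hub.register("CAD-1042", system="NX",  obj_type="assembly")
hub.register("EBOM-77",  system="PLM", obj_type="bom")
hub.register("PO-9001",  system="ERP", obj_type="purchase_order")
hub.link("EBOM-77", "derived_from", "CAD-1042")
hub.link("PO-9001", "fulfills",     "EBOM-77")
```

A production hub would back this with a graph database and event-driven updates from each connected system; the design choice that matters is that consistency rules live at the relationship level, not inside any one system of record.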
Use Cases and Business Impact
Multi-Tier Supply Chain Handoffs. A European aerospace tier-1 supplier building structural assemblies for three different OEMs — one using CATIA, one using NX, one using Creo — previously maintained three separate CAD environments to satisfy customer format requirements. With STEP AP242 as the contractual delivery format and bidirectional translation workflows built into their PLM configuration management process, they collapsed to a single authoritative NX environment. Translation happens at the delivery boundary, not throughout the engineering process. Result: 40% reduction in rework caused by format-related data loss, and elimination of the "which version did we send them?" question that previously consumed hours of program management time per week.
Regulatory Submission and Long-Term Archiving. A medical device manufacturer required to maintain complete product records for the life of the device — potentially 30+ years — switched from proprietary PLM exports to PDF/A-3 with embedded STEP AP242 geometry for all design history records. When their PLM vendor discontinued a product line and forced a migration, the archived records remained fully readable in the new system without any conversion project. The data governance benefit was substantial: archival format stability removed an entire category of data migration risk from their platform roadmap.
M&A Integration Speed. A $2B industrial equipment manufacturer acquiring smaller competitors found that PLM data migration was consistently the longest-lead activity in post-merger integration — 12–18 months of mapping, conversion, and validation work per acquisition. By establishing a neutral-format intermediate layer (STEP AP242 for geometry, a standardized BOM JSON schema for structure) as the canonical form for acquired product data before any system migration, they reduced integration timelines to 4–6 months. The interoperability architecture became a repeatable M&A playbook asset.
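The neutral-layer idea can be illustrated with a toy BOM record. The field names and the `validate` helper below are invented for this sketch — they are not taken from the manufacturer's actual schema — but they show the mechanism: acquired product data is normalized into one canonical shape and rejected early if required fields are missing.

```python
import json

# Hypothetical minimal canonical BOM record; field names are
# illustrative, not a published standard.
bom = {
    "part_number": "ASM-4410",
    "revision": "C",
    "source_system": "legacy_plm",
    "children": [
        {"part_number": "BRK-100", "revision": "A", "quantity": 2},
        {"part_number": "PLT-220", "revision": "B", "quantity": 1},
    ],
}

REQUIRED = {"part_number", "revision"}

def validate(node):
    """Reject records missing the fields every downstream system needs,
    recursing through the assembly structure."""
    missing = REQUIRED - node.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    for child in node.get("children", []):
        validate(child)

validate(bom)                                # raises on malformed acquired data
canonical = json.dumps(bom, sort_keys=True)  # stable serialization for diffing
```

Serializing with sorted keys gives a stable canonical form, which is what makes the intermediate layer useful for comparing the same product structure across two acquired systems.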
Barriers to Adoption
The barriers to real interoperability are more organizational than technical. The technical standards exist and are mature. The tools to implement them are available and commercially supported. What is missing in most organizations is the governance structure to enforce their use.
Semantic gaps persist even with neutral formats. STEP AP242 can carry the geometry and the PMI annotations, but it cannot enforce that the sending system and the receiving system agree on what a tolerance zone means in the context of a specific manufacturing process. Interoperability at the format level does not guarantee interoperability at the meaning level.
Vendor incentives misalign with customer interoperability goals. PLM vendors profit from deep platform adoption. Every customer workflow that depends on a proprietary extension is a switching cost. Vendors participate in standards bodies and support neutral formats in their products, but they have no financial incentive to make their platforms easy to exit. Buyers must read vendor interoperability claims skeptically and test them against their own data, not the vendor's demonstration data.
Legacy product data is the hardest problem. New programs can be started with STEP AP242 as the delivery format. Existing product lines with 10–20 years of CAD history in proprietary formats represent a data archaeology problem that no standard solves automatically. Selective migration — identifying the active programs worth migrating and archiving the rest in read-only repositories — is usually the pragmatic answer, but it requires a governance decision that many organizations defer indefinitely.
Adoption Timeline
Phase 1 (Year 1): Standards baseline and audit. Establish which neutral formats your current PLM and CAD tools can import and export, and what fidelity is preserved. Run a real round-trip test: export a representative assembly to STEP AP242, import it into a second CAD tool, and document what was lost. This baseline is sobering but necessary.
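Because STEP Part 21 files are plain text, with each entity instance written as `#id=ENTITY_TYPE(...)`, a crude first-pass fidelity baseline can be scripted. The sketch below only counts simple entity instances by type and diffs the counts — it is no substitute for a proper geometry and PMI comparison, but it flags entity types that vanished in the round trip:

```python
import re
from collections import Counter

# Matches simple STEP Part 21 entity instances: "#12=CARTESIAN_POINT(...)"
ENTITY = re.compile(r"#\d+\s*=\s*([A-Z0-9_]+)")

def entity_counts(step_text):
    """Count STEP entity instances by type."""
    return Counter(ENTITY.findall(step_text))

def roundtrip_report(original, reimported):
    """Entity types present in the original export but missing or
    reduced after re-import: {type: (count_before, count_after)}."""
    before, after = entity_counts(original), entity_counts(reimported)
    return {etype: (n, after.get(etype, 0))
            for etype, n in before.items() if after.get(etype, 0) < n}

original = """#1=CARTESIAN_POINT('',(0.,0.,0.));
#2=DIMENSIONAL_SIZE(#1,'diameter');
#3=CARTESIAN_POINT('',(1.,0.,0.));"""
reimported = """#1=CARTESIAN_POINT('',(0.,0.,0.));
#2=CARTESIAN_POINT('',(1.,0.,0.));"""

lost = roundtrip_report(original, reimported)
# The PMI entity did not survive: {'DIMENSIONAL_SIZE': (1, 0)}
```

In practice you would run this over the real export and the file re-exported from the second CAD tool; a PMI-related entity type dropping to zero is exactly the kind of silent loss the baseline audit is meant to surface.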
Phase 2 (Years 2–3): Format governance and supply chain rollout. Establish neutral format delivery requirements for new supplier contracts. Build translation workflows into your PLM change release process so that neutral format exports are generated automatically at release, not on request. Evaluate integration platform options for the highest-friction system boundaries (PLM to ERP, PLM to MES).
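The "generated at release, not on request" rule amounts to a gate in the release transaction. Every name in this sketch (`export_step_ap242`, `export_jt`, `vault_store`) is a hypothetical stand-in for real PLM and translator APIs; the point is only the control flow — a failed neutral export blocks the release rather than being discovered later.

```python
class ReleaseError(Exception):
    pass

def export_step_ap242(item):
    return f"{item}.stp"       # stand-in for a real translator call

def export_jt(item):
    return f"{item}.jt"        # stand-in for a real translator call

VAULT = []                     # stand-in for the release record store

def vault_store(item, artifact):
    VAULT.append((item, artifact))

def release(item, exporters=(export_step_ap242, export_jt)):
    """Neutral-format exports run inside the release transaction;
    any export failure blocks the release outright."""
    artifacts = []
    for export in exporters:
        try:
            artifacts.append(export(item))
        except Exception as exc:
            raise ReleaseError(f"neutral export failed, release blocked: {exc}")
    for artifact in artifacts:
        vault_store(item, artifact)
    return artifacts

release("ASM-4410-C")
```

Attaching the exports to the release record means the "which version did we send them?" question answers itself: the delivered neutral files are versioned with the release, not generated ad hoc afterward.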
Phase 3 (Years 3–5): Data fabric architecture. For organizations with multi-system PLM environments — common after acquisitions or parallel product divisions — evaluate data fabric or Digital Thread Hub architectures that provide unified product data access without requiring all data to reside in a single system. This is a multi-year architectural investment, but the organizations that make it stop paying the integration tax on every new program.
Future Outlook
The 2–5 year horizon for PLM interoperability is shaped by three forces. First, regulatory pressure: the EU's Ecodesign for Sustainable Products Regulation and the Digital Product Passport mandate machine-readable product data that must be accessible to regulators, recyclers, and customers throughout a product's life — creating a legal requirement for interoperable data that did not exist five years ago.
Second, the maturation of AI in engineering: AI systems that assist with design, simulation, and manufacturing planning require access to complete, structured product data. Proprietary silos that block cross-system data access also block AI augmentation — creating a new business case for interoperability that is easier to quantify than the traditional "reduce integration cost" argument.
Third, the consolidation of the integration platform market: as Jitterbit, Boomi, MuleSoft, and emerging PLM-specific integration vendors compete for the integration layer, the cost of point-to-point connectivity is falling, and the quality of pre-built connectors is rising. The digital thread vision — a connected, traceable data chain from design through manufacturing through service — becomes more achievable as the integration infrastructure matures.
Manufacturers that treat interoperability as a technical detail owned by the IT department will continue to pay integration costs on every program. Manufacturers that treat it as a strategic architecture decision will compound the value of that investment across every acquisition, every supplier relationship, and every new AI tool they adopt.
Related Resources
- PLM Integration Architecture — understanding the layers of PLM connectivity
- Configuration Management in PLM — how interoperability and version control interact
- PLM Data Governance — the governance layer that makes interoperability sustainable
- Digital Thread Architecture — the end-to-end data connectivity vision interoperability enables
- Enterprise PLM Rollout Guide — where to address interoperability in a large-scale deployment
Cite this article
Finocchiaro, Michael. “Product Data Interoperability: Why PLM Silos Are Becoming a Competitive Liability.” DemystifyingPLM, May 16, 2026, https://www.demystifyingplm.com/plm-trend-interoperability
PLM industry analyst · 35+ years at IBM, HP, PTC, Dassault Systèmes
Firsthand knowledge of the evolution from early 3D modeling kernels to today's cloud-native platforms and agentic AI — the history, strategy, and future of PLM.