ProveIt! 2026 — Key Learnings
ProveIt! is 4.0 Solutions / Walker Reynolds' annual industrial operations conference. This year it drew 51 software vendor sponsors and hundreds of manufacturers to Dallas for five days of live demos, keynotes, and honest conversation about what's actually working on the factory floor. I stayed for 2½ days and already regret having to miss the last 2½!
The "ProveIt!" Philosophy: Stop Selling Features
Walker's keynote set the tone for the vendor community. The conference
format itself is designed to simulate real factory conditions — incomplete
data, delayed responses, messy reality — and vendors are expected to
demonstrate that they can solve problems under those conditions, not just
present polished slides.
His message to vendors was blunt: focus on end-user problems, not marketing.
His message to attendees: judge vendors not by what they demo on stage, but
by whether they can operate in your chaos.
On AI specifically, Walker was deliberately optimistic — a counter to
fear-based narratives: "Humanity is going to win AI. I am absolutely not a
doomsayer."
The Kepware Disruption: A Live Risk for Manufacturers
The most operationally urgent message of the conference was about Kepware.
With PTC's acquisition dynamics shifting, manufacturing connectivity costs
tied to Kepware are expected to increase significantly — potentially
doubling for some customers.
Walker's practical advice: document your current Kepware exposure, develop a
migration plan, and calculate conversion costs before you're forced to.
One vendor's positioning as an edge-first data platform — "connecting OT, IT, and
AI in weeks, not months" — directly targets this gap.
"You cannot digitally transform without connect. It's impossible — it's
where it starts."
This is worth watching closely. Kepware dependency is quietly embedded in
hundreds of industrial software stacks, and most organizations haven't
modeled the cost of replacing it.
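Walker's advice to model your exposure can be reduced to simple arithmetic. Here is a minimal sketch of such a model; every number in it (server counts, license prices, the doubling multiplier, migration cost) is an illustrative assumption, not a figure from the conference:

```python
# Minimal sketch of a Kepware exposure model. All numbers are
# illustrative assumptions; substitute your own contract figures.

def kepware_exposure(servers: int, annual_license_per_server: float,
                     price_multiplier: float, migration_cost: float) -> dict:
    """Compare staying on Kepware at a raised price vs. migrating."""
    current_annual = servers * annual_license_per_server
    projected_annual = current_annual * price_multiplier
    extra_per_year = projected_annual - current_annual
    # Years of price increases it takes for a migration to pay for itself.
    payback_years = migration_cost / extra_per_year if extra_per_year else float("inf")
    return {
        "current_annual": current_annual,
        "projected_annual": projected_annual,
        "payback_years": round(payback_years, 1),
    }

# Hypothetical plant: 10 servers at $2,500/server/year, price doubling,
# $60,000 one-time migration cost.
model = kepware_exposure(10, 2500.0, 2.0, 60000.0)
print(model)  # payback in 2.4 years under these assumptions
```

Even this crude model makes the decision concrete: once the payback period drops below your planning horizon, waiting is the expensive option.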
The Real Problem Isn't Data — It's Decision Latency
Jeff Winter's keynote on Day 1 led with a scenario most manufacturers will
recognize: three teams watching three related signals in isolation, each
rationally deciding to wait, until a line goes down and costs $268,000 in
three hours. No villains. Just systems forcing good people into bad
coordination.
His central argument: manufacturing generates more data than any other
industry — nearly double the next highest sector — yet IDC estimates only 3%
of enterprise data is ever analyzed. 90% of IoT data is never acted on at
all.
"The tragedy is not data scarcity, it's data invisibility."
The result is decision latency — the gap between when a problem becomes
detectable and when a coordinated response actually happens. Closing that
gap is the real business case for industrial AI.
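Decision latency is measurable: it's the clock time from the earliest detectable signal to the coordinated response. A sketch of Jeff Winter's scenario, with invented timestamps for the three teams' signals:

```python
from datetime import datetime

# Sketch of decision latency: the gap between the first detectable signal
# and the coordinated response. All timestamps are invented for illustration.

def decision_latency_minutes(signals: list[datetime], response: datetime) -> float:
    """Latency from the earliest related signal to the coordinated response."""
    first_detectable = min(signals)
    return (response - first_detectable).total_seconds() / 60

# Three teams each watched one related signal in isolation.
signals = [
    datetime(2026, 2, 18, 9, 5),   # vibration alert (maintenance)
    datetime(2026, 2, 18, 9, 20),  # scrap-rate drift (quality)
    datetime(2026, 2, 18, 9, 40),  # throughput dip (operations)
]
response = datetime(2026, 2, 18, 12, 5)  # line already down

print(decision_latency_minutes(signals, response))  # 180.0 minutes
```

The point of tracking the metric is that it puts a number on coordination failure, not on any one team's performance.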
Ignition 8.3: The Composable Factory Platform
Inductive Automation's Ignition 8.3 was the flagship product release at the
conference. The headline features:
- Composable architecture — platform configuration now handled through text files, enabling version control and DevOps-style lifecycle management
- MCP module — early access release allowing LLMs to integrate directly into Ignition, enabling engineers to use AI co-pilots and automate routine workflows via natural language
- Full support for OPC UA, MQTT, Sparkplug B, and CESMII i3x
The MCP integration is significant. It means engineers can now query and
control factory systems through Ignition using natural language. The
"agentic factory floor" is no longer theoretical — it's shipping.
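What MCP standardizes is the pattern of exposing operations as named, schema-described tools that an LLM client can discover and call. The plain-Python sketch below only illustrates that pattern; it is not the actual Ignition MCP module, and every tag path, tool name, and value is hypothetical:

```python
# Illustration of the MCP tool pattern: factory operations exposed as named
# tools that an LLM client can discover and invoke. This is plain Python,
# not the real Ignition MCP module; all names and data are hypothetical.

TOOLS = {}

def tool(fn):
    """Register a function as a callable tool, keyed by its name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def read_tag(path: str) -> float:
    """Return the current value of a tag (stubbed with fake data here)."""
    fake_plant = {"Line1/Press/Temperature": 74.2}
    return fake_plant[path]

@tool
def list_alarms(area: str) -> list[str]:
    """Return active alarms for an area (stubbed)."""
    return ["Press overtemp"] if area == "Line1" else []

def dispatch(name: str, **kwargs):
    """What an MCP server does when the model issues a tool call."""
    return TOOLS[name](**kwargs)

print(dispatch("read_tag", path="Line1/Press/Temperature"))  # 74.2
print(dispatch("list_alarms", area="Line1"))                 # ['Press overtemp']
```

The LLM never touches the PLC directly; it only emits tool calls, and the server decides what each call is allowed to do. That indirection is what makes the pattern tolerable on a factory floor.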
What AI Can (and Can't) Do on the Factory Floor
Every session touched on AI, and the honest consensus was consistent: it's
genuinely useful, but not in the ways the hype suggests.
What's working:
- FlowFuse reported a 250% increase in development speed in a single week using AI-assisted Node-RED flow building
- Fuuz demonstrated that AI analyzed pump jack pressure patterns better than software that had been in use for 20 years
- Tulip's no-code platform now lets quality engineers build apps by describing them in plain language — no developer required
- TDengine is shipping built-in AI agents that auto-generate dashboards and reports from time-series data
Where humans are still essential:
- Validating AI-generated code and outputs (the 80/20 problem — gets it mostly right, breaks at the edges)
- Anything requiring empirical certainty — sensor physics, process chemistry, safety decisions
- Contextual judgment under ambiguous or novel conditions
"LLMs are language reasoning tools. They are not empirical. They cannot
extrapolate. They can do some interpolation with the right rules."
The pattern across every session: AI as accelerator, not replacement. The
risk is over-trusting outputs in high-stakes manufacturing contexts without
human validation loops in place.
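A validation loop doesn't have to be elaborate: deterministic checks gate what the AI may auto-apply, and everything else is routed to a person. A minimal sketch, assuming an invented process window:

```python
# Sketch of a human-in-the-loop gate for AI-suggested setpoints. The AI
# suggestion is auto-applied only when it passes deterministic physics
# checks; anything outside bounds goes to a person. Limits are invented.

SAFE_LIMITS = {"furnace_temp_c": (650.0, 900.0)}  # assumed process window

def gate(parameter: str, ai_suggestion: float) -> str:
    """Return the action to take for an AI-suggested value."""
    low, high = SAFE_LIMITS[parameter]
    if low <= ai_suggestion <= high:
        return "auto-apply"
    return "human-review"  # the edge cases where the model breaks

print(gate("furnace_temp_c", 720.0))   # auto-apply
print(gate("furnace_temp_c", 1200.0))  # human-review
```

The design choice is that the bounds are empirical and deterministic — exactly the kind of knowledge the quoted speaker says LLMs don't have.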
The Execution Gap: Why Data Alone Doesn't Stop Downtime
MachineMetrics and MaintainX both addressed the same structural problem —
and it's one of the most underappreciated gaps in industrial digital
transformation.
MaintainX cited a striking stat: 78% of manufacturers have some level of
automation, yet 68% reported the same or more downtime last year despite
those investments. The problem isn't lack of data. It's that data doesn't
automatically trigger the right human action.
"The link is missing. That's why your data doesn't stop downtime."
MaintainX's pitch is to be the work execution layer for the Unified
Namespace — translating OT signals into maintenance work orders, connecting
machine health data with tribal knowledge held by technicians.
MachineMetrics approaches the same gap from the analytics side: AI-generated
shift summaries, automatic work instruction creation during changeovers,
and integrated scheduling — all at roughly $50,000/year for a small plant.
The insight here is architectural: closing the loop from sensor to human
action requires a dedicated execution layer, not just better dashboards.
Open Standards vs. Consolidation Risk
ThredCloud's Bob van der Kuilen put the ecosystem risk plainly:
"The danger is you can easily get bought. Prices go up. Open standards
become closed standards. Open, transparent things become black boxes."
This landed differently in the context of the Kepware discussion. The
conference's general ethos was strongly pro-open-standards — partly
commercial positioning against PTC and Siemens lock-in, partly a principled
stance about how industrial ecosystems should evolve.
The protocol stack the community is converging on: OPC UA + MQTT + Sparkplug
B + CESMII i3x. Inductive Automation supports all of them natively. Dados
announced a new MTT protocol capable of handling 3 million messages every 5
milliseconds, translating industrial messages into graph tables that LLMs
can query directly.
Open architecture isn't just a preference anymore — it's becoming a
strategic moat for vendors and a risk-management requirement for
manufacturers.
Five Things Worth Writing About
- The Kepware story is underreported. It's live enterprise risk for hundreds of manufacturers right now, and most haven't modeled their exposure.
- MCP is becoming the industrial integration standard. Both Ignition 8.3 and Fuuz are shipping it already. The agentic factory thesis is materializing ahead of schedule.
- The execution gap is the real ROI. Not more sensors or dashboards — the value is in closing the loop from data to human action. MaintainX and MachineMetrics are building exactly this.
- AI validation is an under-addressed product design problem. Every session acknowledged it. Nobody has fully solved it. There's an article — maybe a product — waiting in that gap.
- ProveIt! is building something rare: vendor accountability culture. Walker's model of forcing vendors to demonstrate solutions under realistic factory conditions, not trade-show polish, is worth a standalone piece.
Coverage from ProveIt! 2026 — Dallas, TX, February 18–19, 2026. Finocchiaro
Consulting.