PTC: Organising The Noise
And other musings on AI
Friends of NFTBC.
This year, it seems, investors have collectively asked whether enterprise software is being rendered obsolete by new generative AI tools. Generally speaking, it's not a topic I'm keen to weigh in on, nor one where I would have much of use to contribute. That said, there is currently one software stock that we do talk about here at NFTBC - PTC. So let's talk about it.
At present, software businesses are widely perceived to face two principal threats. The first comes from generative coding tools ("vibe coding") - specifically, if software can now be written by anyone at low cost, why would businesses still need to buy expensive enterprise applications? The second is the risk of displacement by "AI agents" - if an increasing proportion of traditional business functions can be carried out more efficiently by autonomous agents, won't businesses require fewer software seats?
Notably, PTC stock doesn't appear to be dropping as hard as many of the previously popular SaaS names. It seems abundantly clear that some species of software could be more at risk than others, and I suspect investors are in the process of trying to sniff out which ones those are - rightly so. The questions involved are difficult to wrestle with, and I wouldn't be surprised if we're still a good while away from greater clarity - possibly years, in some cases. Equally, it seems to me that some software businesses are likely to prove quite robust, or may even emerge as beneficiaries of the wave of developments to come. I'm inclined to place PTC in this latter camp.
2026 should be an illuminating year for PTC followers, as we hear progressively more about an AI strategy that is already well underway. The cadence of AI feature releases has been steady over the last 12+ months, with Codebeamer AI launching just last week (demo here).
In recent days I listened to a conversation with PTC's AI Product Lead, Ayora Berry, and thought I would share a few thoughts with those of you who follow the story. Berry was talking to one of the co-founders of Leo AI - an engineering design co-pilot that has officially partnered with PTC's cloud CAD product, Onshape. The title of this post ("Organising the Noise") derives from that conversation. It's certainly a bewildering time, and this makes it all the more important to present customers with a coherent strategy and roadmap - and PTC has been doing exactly that. In broad brush strokes, this has entailed conveying to customers what to expect and on which approximate timeframes. Rather than overpromising and under-delivering, PTC is releasing AI tools into its existing products as and when they are sufficiently robust and genuinely useful. As you might expect, this means simpler deliverables today, and more complex and challenging ones in the months and years to come. Simply being organised and dependable at a time like this should not, perhaps, be underestimated.
[NFTBC does not give advice - please do your own research. I currently own shares in PTC].
Just to kick things off, I don't think there is any serious suggestion that businesses would ever seek to vibe code replacements for their CAD or PLM systems - you quickly get into a minefield of issues around complexity, compatibility, robustness, and so on. The idea is barely worth entertaining. Recall that in the deep dive we looked at a long list of case studies in which companies suffered negative consequences from switching CAD/PLM systems (or even versions). So rather than system replacement, the threats relate more to workflows - how users interact with the existing underlying systems, and who captures the economics. For example, on the vibe side of things, there's "prompt-to-CAD" or "vibe CADing" - where you use a (potentially third-party) interface to describe a part, which is then executed by a CAD system.1 Then on the agent side, the threat is that a third party comes along and controls a new surface for interacting with the agents, while the PLM system is effectively relegated to an invisible layer and disappears from the user's mind.
I have highlighted these risks for understanding and completeness. But it must be said that in the near to medium term, AI looks to be highly complementary to PTC's business - more opportunity than threat. Over the medium to longer term, PTC is starting from a particularly favourable position - I would argue AI is theirs to lose, even if success is yet to be secured. Let's now turn to some of Ayora Berry's recent comments (lightly edited):
It's very clear for us: [the key to driving AI adoption is] to get your product data foundation in order - your data house in order. In the enterprise, or in any software system, AI is only as good as the context in which it operates. So the data in your systems needs to be quality, it needs to be accessible, and it needs to be structured, so that the AI has that substrate from which to deliver insights and automation. In terms of being ready for AI, the clear message is that there's good news: you already have a good start. You're using PDM, PLM, ALM - all these systems of record - to manage your business processes, and you've been on those journeys for decades. Now you can tap into all that work you've been doing by embedding AI into those systems to deliver those insights and automation. So that's foundational: get that product data established, and continue your journey in using enterprise software.



