It happened about two months ago. Not on purpose. I didn’t make a decision or announce anything. I just realized one day that I hadn’t opened my task manager in over a week because I’d been adding, updating, and completing tasks entirely through Claude.
Then the same thing happened with my notes app. And I started using my email app less and less. And the SmartThings app I use to control my house became more of a dashboard. Four applications I used every single day, and I’d replaced most of their functions without really trying. How? I built a set of MCP integrations that let Claude talk to each of those systems directly, and once that worked, the apps themselves became unnecessary for a lot of basic tasks. Not obsolete in some abstract future sense. Unnecessary right now, on my actual computer, in my actual workflow.
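To make that concrete: an MCP server is, at its core, a small process that advertises named tools and answers JSON-RPC-style `tools/call` requests from the model. Here’s a minimal sketch of that dispatch loop. The tool names (`add_task`, `set_light`) and their behavior are hypothetical stand-ins for real integrations, and this hand-rolls the request handling rather than using the official MCP SDK.

```python
import json

# Hypothetical local handlers standing in for real integrations
# (a task manager, a smart-home hub). Names are illustrative only.
def add_task(title: str) -> dict:
    return {"status": "ok", "task": title}

def set_light(room: str, on: bool) -> dict:
    return {"status": "ok", "room": room, "on": on}

# The server's tool registry: what the model is allowed to call.
TOOLS = {"add_task": add_task, "set_light": set_light}

def handle_tool_call(request_json: str) -> str:
    """Dispatch one JSON-RPC-style tools/call request to a local handler."""
    req = json.loads(request_json)
    name = req["params"]["name"]
    args = req["params"].get("arguments", {})
    result = TOOLS[name](**args)  # run the tool with the model's arguments
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})
```

A request like `{"method": "tools/call", "params": {"name": "set_light", "arguments": {"room": "office", "on": true}}}` comes in, the matching handler runs locally against your own systems, and a result goes back. That’s the entire trick: once the model can issue these calls, the app UI on top of each system stops being the only door in.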
The weird part is how unremarkable it felt. I didn’t build some complex platform. I wired up a few connections over a weekend, and suddenly I had a single interface for things that used to require four separate apps, four separate subscriptions, four separate places to look for information. And all my data stayed within my own infrastructure (think local + private cloud).
I keep thinking about that. Not the technical part, but the economics. I replaced four products with something I built myself in a couple of days. But I’m not the only one thinking this way. A Retool survey from late 2025 found that 35% of companies have already replaced at least one SaaS tool with something they built themselves. At AppDirect, a couple of marketers (not engineers) built 11 different internal tools using AI and saved over $120,000 in software costs. These aren’t edge cases anymore.
The Stock Market Is Figuring It Out
Wall Street tends to be late to the party, but when they finally show up they don’t mess around. In January 2026, the S&P North American software index posted its biggest monthly decline since October 2008. The software ETF (IGV) dropped around 25% from its September 2025 peak. Traders started calling it the “SaaSpocalypse,” which is dramatic but not entirely wrong.
Salesforce is down 40% over the past twelve months. ServiceNow dropped 50% from its high. Adobe is off 35%. HubSpot fell 51% in 2025. These are not speculative startups. These are the companies that defined enterprise software for the past decade.
Part of the trigger was Anthropic’s Claude Cowork launch in January 2026, which showed non-technical workers offloading enterprise-grade tasks to AI. The result was nearly a trillion dollars in losses across global software and professional services stocks. But the trigger isn’t really the point. The market was pricing in something people on the ground had already been feeling: SaaS costs too much, does too little of what you actually need, and keeps getting more expensive.
How much more expensive? SaaS inflation hit 12.2% in 2026, which is roughly five times the general inflation rate. The average SaaS cost per employee went from $7,900 in 2023 to $9,100 in 2025. And vendors kept raising prices while the alternative was getting cheaper by the month.
Even the people building SaaS see the shift coming. Satya Nadella said on a podcast that business applications are “essentially CRUD databases with a bunch of business logic. The business logic is all going to these agents.” He wasn’t being pessimistic. He was being honest about where the industry is headed, and Microsoft is positioning accordingly. But when the CEO of one of the largest enterprise software companies in the world describes the category that way, it’s worth paying attention.
Building Software Just Got Ridiculously Cheap
So what changed? The short answer is that AI made building software dramatically faster and cheaper, and the tools got good enough that you don’t need to be a developer to use them.
Forty-one percent of all code written globally in 2025 was AI-generated. That number would have been unthinkable three years ago. Andrej Karpathy (co-founder of OpenAI, former AI lead at Tesla) coined the term “vibe coding” in February 2025, describing a new way of building software where “you fully give in to the vibes, embrace exponentials, and forget that the code even exists.” The post got 4.5 million views because it described something a lot of people were already doing but didn’t have a name for.
The YC Winter 2025 batch was telling: a quarter of the startups had codebases that were 95% AI-generated. These weren’t people who couldn’t code. They were highly technical founders who chose AI because it was faster. That batch turned out to be the fastest-growing and most profitable in YC’s history.
And it’s not just developers anymore. Sabrine Matos, a growth marketer with no engineering degree, built Plinq, a women’s safety app, entirely using AI tools. Result: 10,000 users in three months and $456,000 in annual recurring revenue. She didn’t hire a developer. She didn’t license a platform. She just built it.
The demand for these tools is real. Cursor went from $1 million in ARR in 2023 to $1.2 billion in 2025, making it arguably the fastest-growing SaaS company of all time (the irony of an AI coding tool being a SaaS product is not lost on me). McKinsey estimates AI reduces software development costs by 20 to 45 percent. GitHub’s own study found developers completed tasks 55% faster with Copilot. The cost of building custom software hasn’t just decreased. It’s collapsed.
This Is How Every Big Shift Starts
If this feels familiar, it should. Every major technology shift in the past two decades has followed the same pattern: it starts with individuals, spreads to small businesses, and eventually reaches the enterprise.
The iPhone did this. People brought their personal phones to work, IT departments fought it, and eventually “BYOD” became policy because the alternative was pretending it wasn’t happening. Dropbox did it. People started syncing files to the cloud because the corporate file server was terrible, and IT eventually had to formalize it. Slack did it. One team adopted it, then another, then suddenly 80% of the Fortune 100 was using it and Salesforce paid $27.7 billion to acquire it.
AI is following the same playbook. Right now, 84% of developers are using or planning to use AI tools in their work. That’s the “individuals” wave, and it’s already past the tipping point. Then you look at small businesses: 68% are using AI regularly, up from 48% just a year and a half ago. More than half of active AI users are on teams of ten or fewer. These are small teams building exactly what they need instead of licensing tools built for someone else’s workflow.
The enterprise wave is starting. Klarna terminated its contracts with Salesforce and Workday and rebuilt internally using AI. Revenue per employee went from $400,000 to $700,000. That’s not a marginal improvement. That’s a fundamentally different cost structure. And Klarna isn’t some scrappy startup. They’re a publicly traded fintech company making a calculated bet that building is cheaper than buying.
Gartner projects that 70% of new enterprise applications will use low-code or no-code tools by 2026. The friction that used to protect SaaS vendors (it was just too hard and too expensive to build your own) is disappearing.
And You Get to Keep Your Data
The cost argument is compelling, but there’s a second reason this shift is happening that might matter even more: when you build your own tools, your data stays where you want it. The question is shifting from “can we afford to build?” to “can we afford to keep handing our data to someone else?”
A Barclays CIO survey found that 86% of CIOs plan to move some workloads back from the public cloud. That’s the highest rate on record. Among mid-market organizations, 97% plan to move workloads off public clouds specifically for sovereignty reasons. This isn’t a fringe position anymore. It’s the mainstream view.
37signals, the company behind Basecamp and HEY, provides the clearest proof point. Their AWS bill hit $3.2 million a year. They spent $700,000 on Dell hardware and moved off the cloud. Result: $2 million a year in savings, with projected savings of over $10 million over five years. They also moved nearly 10 petabytes of data from S3 to on-prem storage. AWS actually waived $250,000 in egress fees to let them go, which tells you something about the data gravity problem.
Sovereign cloud spending is projected to hit $80 billion in 2026, a 35% increase from 2025. Between tightening regulations (the EU Data Act, GDPR transfer restrictions, the European Health Data Space) and plain old common sense about where your most valuable asset should live, the trend is clear. Keeping your data close isn’t paranoia. It’s strategy.
What Do the Hyperscalers Do Now?
Here’s where it gets interesting. The hyperscalers are not going away. They’re spending over $600 billion on capital expenditure in 2026, with 75% of it going to AI infrastructure. That’s a staggering bet. But the nature of what they’re providing is changing.
AWS is already adapting. They launched AI Factories, which are managed AI infrastructure deployed inside customer data centers. Google is offering Gemini on-prem through Google Distributed Cloud. Microsoft has Azure Arc for managing hybrid and multicloud environments from a single pane of glass. The hyperscalers see the shift and they’re following their customers on-prem rather than waiting for the customers to come back to the cloud.
AWS CEO Matt Garman pointed out something that gets overlooked in the “everything is moving to the cloud” narrative: more than 85% of global IT spend is still on premises. Cloud never actually won as completely as we thought it did. It captured a real share, especially among startups and digital-native companies, but the bulk of enterprise computing never left the building.
Meanwhile, a new class of cloud providers, the neoclouds, are eating into the hyperscaler advantage on the pure compute side. CoreWeave posted $5.1 billion in revenue in 2025 after being essentially a nothing company three years earlier. Their revenue backlog is $56 billion. Lambda is planning an IPO. These companies exist because AI workloads have different requirements than traditional cloud, and many of the hyperscalers were slow to adapt.
Arthur Mensch, the CEO of Mistral AI, put it plainly: “More than half of what’s currently being bought by IT in terms of SaaS is going to shift to AI.” That might be aggressive, but the direction is right. The application layer is what’s up for grabs. The infrastructure layer still has a role, but it needs to provide something you genuinely can’t build yourself. Massive GPU clusters for training models, global network reach, compliance frameworks across dozens of jurisdictions. That’s real value. But “here’s a CRM for $300 per seat per month” is not going to cut it anymore.
There’s an irony in all of this that I keep coming back to. The cloud made AI possible. These models run on exactly the kind of massive distributed compute that the hyperscalers built. And now that AI is here, it’s making the cloud’s application layer optional. The thing they built is the thing that’s unbundling them.
I don’t know exactly how this plays out. But I know that last week I asked Claude to check my calendar, draft an email, turn on my office lights, and add a note to my project file. And it did all of that without opening a single app. It used tools I built, running on my hardware, with my data. That’s the whole argument.