For the past two years, the dominant narrative around artificial intelligence has been remarkably consistent: AI will replace jobs. Recent headlines have reinforced that idea over and over again, with companies announcing layoffs in the same breath as billion-dollar AI investments. The story writes itself.

However, the reality I am watching unfold inside organizations looks very different from that headline.

AI is absolutely automating work. That part is true. What the narrative gets wrong is the assumption that automating work and eliminating the need for people are the same thing. In many cases, they are not. What AI is actually doing is shifting where work happens inside an organization. Tasks that humans used to perform are now generated by AI, but that output still requires review, correction, governance, and operational coordination. AI is solving some problems while quietly creating new ones downstream.

This dynamic is especially visible in SaaS companies, particularly within the teams responsible for customer enablement, customer success, and product adoption. Having spent years working in and around those teams, I find the pattern hard to miss.

The Original Promise Was Real, But Incomplete

The early promise of generative AI tools was centered on productivity. AI writes content. It answers support tickets. It generates documentation. It analyzes data and automates workflows. If AI could handle those tasks, the thinking went, organizations would need fewer people to do them.

To some degree, that prediction has proven accurate. AI has dramatically reduced the time required to produce first drafts of content, code, and analysis. I have seen teams accomplish in an afternoon what used to take a week. Here is what many organizations underestimated: generating work faster does not mean the work is finished faster. Often it just moves the effort somewhere else in the process.

I see this a lot in workflow automation. Teams adopt workflow tools to automate a process without optimizing it first. They rush to the end state, and then realize all they have done is increase the volume of (potentially bad) output.

AI Is Compressing Production While Expanding Coordination

One of the clearest patterns emerging from organizations actively adopting AI is simple: production time goes down, but coordination and oversight go up.

Consider a typical SaaS enablement pipeline. Before AI, a content team writes documentation and onboarding guides, product marketing reviews it, enablement converts it into training assets, and customer teams deliver it to customers. With AI in place, that first step becomes dramatically faster. Teams can generate large volumes of documentation and training outlines almost instantly.

However, this acceleration creates new problems. Documentation may contain inaccuracies or feature hallucinations (such as when a sales rep over-promises a feature that is a bullet point on a roadmap). AI-generated workflows may not match the actual product experience. Training materials may lack the nuance required for real customer scenarios (all of my fellow SEs know the tricks that just make it work). As a result, downstream teams spend more time reviewing, correcting, and adapting the content than they did before.

The work has not disappeared. It has simply moved.

Customer-Facing Teams Are Feeling This Shift Acutely

Customer-facing teams are experiencing this shift in a particularly sharp way. AI-powered support tools embedded in platforms like Zendesk or Intercom can now resolve many common issues automatically: password resets, basic product questions, and routine troubleshooting. On the surface, this appears to be a major efficiency gain, but the problem is what those tools leave behind.

The tickets that remain after AI filters out the easy ones are quite different from those handled by AI. Customer success teams are increasingly dealing with complex product configuration issues, multi-system integrations, and adoption challenges that span entire business units. They are also dealing with frustrated customers whose experience with automated support failed them before a human ever got involved.

AI is filtering out the easiest problems and leaving human teams responsible for the most challenging ones. The number of tickets may decrease, but the difficulty and time required per ticket increase significantly. If you are using raw ticket volume to measure team capacity, you are already measuring the wrong thing.
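To make the measurement point concrete, here is a toy sketch of why raw ticket counts mislead. All category names, volumes, and effort estimates are hypothetical, invented purely for illustration:

```python
# Hypothetical ticket data: each entry is (monthly count, average human hours).
# AI deflects the easy categories, so the human queue shrinks in count
# far faster than it shrinks in effort.
before_ai = {
    "password_reset":      (120, 0.1),
    "basic_question":      (200, 0.2),
    "integration_issue":   (30, 3.0),
    "adoption_escalation": (10, 5.0),
}
# After AI deflection, only the hard categories remain.
after_ai = {
    "integration_issue":   (30, 3.0),
    "adoption_escalation": (10, 5.0),
}

def ticket_count(tickets):
    """Raw volume: what a naive capacity metric would track."""
    return sum(count for count, _ in tickets.values())

def workload_hours(tickets):
    """Effort-weighted volume: count times average hours per ticket."""
    return sum(count * hours for count, hours in tickets.values())

print(ticket_count(before_ai), workload_hours(before_ai))  # 360 tickets, 192.0 hours
print(ticket_count(after_ai), workload_hours(after_ai))    # 40 tickets, 140.0 hours
```

In this made-up example, ticket volume drops by almost 90 percent while the hours of human work drop by less than 30 percent, which is exactly why a team that looks "mostly automated" on a volume dashboard can still be drowning.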

The Content Flood Is A Real Issue

Another downstream effect of AI adoption that does not receive enough attention is content saturation. AI tools make it trivial to generate onboarding guides, feature walkthroughs, help center articles, knowledge base entries, and customer training scripts. This sounds like a gift for enablement and product teams. In practice, it often creates a new problem: suddenly there are hundreds of pieces of content that need to be validated against the product roadmap, aligned with real customer use cases, structured within knowledge systems, and maintained as the product evolves. The last one is the biggest of them all.

Without deliberate governance, AI-generated knowledge bases become cluttered and inconsistent quickly. Ironically, the tools designed to make knowledge more accessible can make it harder to find the right information when you actually need it.

I have seen this happen. A team generates a year’s worth of documentation in a quarter and then spends the next two quarters trying to figure out what is accurate, what is outdated, and what never should have been published in the first place.
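One lightweight way to get ahead of that cleanup cycle is to record when each article was last validated against the live product and flag anything past a review window. A minimal sketch, where the field names, dates, and 180-day threshold are all assumptions for illustration rather than any particular tool's schema:

```python
from datetime import date, timedelta

# Hypothetical knowledge-base records: when each article was last
# checked against the shipping product.
articles = [
    {"title": "SSO setup guide",            "last_validated": date(2025, 9, 1)},
    {"title": "Legacy import walkthrough",  "last_validated": date(2024, 11, 15)},
    {"title": "Webhooks overview",          "last_validated": date(2025, 6, 20)},
]

REVIEW_WINDOW = timedelta(days=180)  # assumed policy: re-validate twice a year

def is_stale(article, today):
    """True if the article has gone unreviewed longer than the window."""
    return today - article["last_validated"] > REVIEW_WINDOW

today = date(2025, 12, 1)
for a in articles:
    if is_stale(a, today):
        print(f"NEEDS REVIEW: {a['title']} (last validated {a['last_validated']})")
```

The point is not the script itself but the discipline it encodes: every AI-generated article carries a validation date from day one, so "what is outdated?" becomes a query instead of a two-quarter archaeology project.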

AI Creates Work Even While It Saves Work

One of the more surprising outcomes I keep hearing about from people inside organizations that have adopted AI aggressively is that workloads have not necessarily decreased. In some cases, they have increased.

A few factors contribute to this. First, AI raises expectations around productivity. If content can be generated quickly, organizations often expect more output overall, not the same output with less effort. Second, AI introduces new layers of operational complexity: someone has to maintain prompts, review outputs, and correct errors, and that work does not assign itself. Third, automation can quietly remove institutional knowledge from workflows. When experienced employees are replaced by automated systems, the remaining team members absorb that lost expertise. The system keeps running, but the people left behind carry more weight.

The result is a paradox where companies reduce headcount while the remaining employees manage larger volumes of AI-generated work. The efficiency gains are real in some places and invisible in others.

The Roles That Are Actually Emerging

What I find genuinely interesting is not just which roles AI is changing, but which roles it is creating. Rather than eliminating positions entirely, many companies are hiring people specifically to manage AI systems. Reviewing AI-generated customer responses. Validating documentation. Maintaining prompt libraries and automation workflows. Monitoring outputs for compliance and brand alignment. In many cases, these responsibilities land on junior employees or operational specialists, while more experienced team members focus on strategy and customer engagement.

The organizational model is shifting toward a hybrid structure where AI produces work, and humans supervise the system producing it. That is a real change in how teams are built and what roles look like, but it is not the “humans replaced by robots” story that tends to dominate the conversation.

What This Actually Means Going Forward

I do not think these dynamics will reverse anytime soon. If anything, they will accelerate as AI tools continue improving and organizations build more infrastructure around them.

What I believe becomes increasingly valuable is the work AI genuinely cannot do: understanding customer context, building operational frameworks that hold up in the real world, identifying the gaps between how software is designed and how customers actually use it, and navigating the organizational dynamics that determine whether any adoption initiative succeeds or fails.

The teams responsible for customer enablement, customer success, and product adoption sit in the middle of all those challenges. AI may change how those teams work, and in many ways, it already has, but the need for the function does not go away. If anything, the volume of AI-generated work flowing through organizations makes the people who can structure, govern, and make sense of that work more valuable, not less.

The future of AI in the workplace may not be about replacing humans at all. It may be about making clear what kind of work humans are uniquely suited to do and then finding out whether organizations are willing to invest in it.
