Companies Rehire Workers After Failed AI Replacement

As we dive into the evolving landscape of AI in the workplace, I’m thrilled to sit down with Ling-Yi Tsai, a seasoned HRTech expert with decades of experience helping organizations navigate change through technology. With a deep focus on HR analytics and the integration of tech in recruitment, onboarding, and talent management, Ling-Yi offers unparalleled insights into how companies are balancing AI adoption with human talent. Today, we’ll explore the surprising trend of rehiring workers replaced by AI, the strategic missteps behind these decisions, and the broader implications for leaders and employees alike.

Can you share your perspective on the recent trend of companies rehiring workers they previously replaced with AI, as highlighted in Visier’s 2025 report?

Absolutely. The data from Visier’s 2025 report is quite telling: it shows a noticeable uptick in rehiring, with about 5.3% of terminated employees being brought back by their former employers, the highest rate since 2018. This suggests that many companies are reconsidering their initial decisions to replace human workers with AI. It’s a sign that the rush to automate may not have delivered the expected outcomes, and organizations are recognizing the unique value that human employees bring to certain roles.

What do you believe is driving this shift back toward rehiring human workers after investing in AI?

I think it comes down to a realization that AI, while powerful, isn’t a one-size-fits-all solution. Many companies jumped on the AI bandwagon expecting instant cost savings and efficiency, but they’re finding that the technology can’t fully replicate the creativity, adaptability, and nuanced decision-making of humans in many roles. There’s also likely a gap in how these tools were implemented—without a clear strategy, AI can underperform, leaving companies to fall back on the talent they let go.

Visier’s research mentions ‘strategic or planning gaps’ in organizations adopting AI. Can you unpack what that means and why it’s happening?

Sure. Strategic or planning gaps refer to a lack of foresight in how AI is integrated into a company’s operations. Many organizations adopt AI without fully understanding its capabilities and limitations, or without aligning it with their broader business goals. They might automate roles without considering the downstream effects on team dynamics or customer experience. This often happens because there’s a rush to keep up with industry trends, rather than a thoughtful plan to ensure AI complements human workers rather than replaces them outright.

How do you think senior executives’ grasp of AI influences their decisions to replace workers with technology?

It’s a huge factor. Many senior leaders haven’t had the time or opportunity to deeply understand what AI can and can’t do. There’s a lot of hype around the technology, and without a solid grounding, executives might overestimate its potential to cut costs or boost productivity. This can lead to premature layoffs or over-reliance on AI for tasks that still require a human touch. It’s not just about knowing the tech—it’s about understanding how it fits into the organization’s unique context.

Visier suggests leaders should ask critical questions before replacing roles with AI. Can you walk us through some of those key considerations?

Definitely. First, leaders need to identify which roles can realistically be automated—tasks that are repetitive and rule-based are often good candidates, but anything requiring emotional intelligence or complex problem-solving might not be. Then, they should assess the cost and risk of implementing AI, including training the system and potential disruptions. Finally, there needs to be a plan for the remaining workforce—how will their skills be utilized, and how can they be supported through the transition? It’s about striking a balance between tech and talent.

There’s a point made about layoffs not being free, even if they save on salaries. Can you elaborate on the hidden costs of replacing workers with AI?

Oh, absolutely. While cutting payroll might seem like an immediate win, the costs of integrating AI can add up quickly. Training AI systems to handle specific tasks often requires significant time, resources, and expertise, sometimes more than anticipated. There’s also the frustration factor for staff who aren’t used to working with these tools and may struggle to use them effectively. Beyond that, you lose the institutional knowledge and versatility that human workers bring, which can impact innovation and adaptability in ways that are hard to quantify until it’s too late.

What’s your forecast for how AI and human talent will coexist in the workplace over the next few years?

I believe we’re heading toward a hybrid model where AI and human talent complement each other more intentionally. As companies learn from these early missteps, I expect to see more focus on using AI for repetitive, data-heavy tasks while preserving human roles for creativity, strategy, and relationship-building. The key will be education—equipping leaders and employees alike with the skills to work alongside AI. If done right, this could lead to more fulfilling roles for workers and smarter, more sustainable growth for organizations.
