AI turned job applications into a losing game for everyone

Hiring has always been imperfect, but it was at least personal. A resume hinted at experience. A cover letter revealed intent. An interview, however flawed, allowed two humans to size each other up.
That fragile social contract is now eroding.
As artificial intelligence moves deeper into hiring, the process of finding work in America is becoming faster, colder, and paradoxically less effective. Employers complain they cannot identify good candidates. Job seekers feel reduced to data points competing against algorithms. Both are right.
AI hiring was supposed to fix inefficiency. Instead, it is producing a system that exhausts everyone involved.
A Labor Market Flooded by Automation
More than half of U.S. organizations used AI tools in recruitment in 2025, according to the Society for Human Resource Management. At the same time, roughly one-third of job seekers reportedly used ChatGPT or similar tools to generate resumes, cover letters, and application materials.
The result is not better matching; it is application inflation.
AI allows candidates to apply to hundreds of jobs in minutes. Employers, overwhelmed by volume, respond by automating screening and even interviews. Each side escalates automation to cope with the other. What emerges is a feedback loop that rewards neither effort nor authenticity.
Daniel Chait, CEO of recruiting software firm Greenhouse, calls it a “doom loop.” Everyone feels the process is getting worse, yet no one feels able to stop.
When Cover Letters Stop Meaning Anything
Recent academic research underscores the problem. Anaïs Galdin of Dartmouth and Jesse Silbert of Princeton analyzed tens of thousands of job applications on Freelancer.com. After the introduction of ChatGPT in late 2022, cover letters became longer, clearer, and more polished.
They also became meaningless.
Because nearly everyone suddenly sounded competent, employers began discounting cover letters altogether. Hiring rates fell. Starting wages declined. Information that once helped differentiate candidates lost its signal value.
“The ability to select the best worker today may be worse due to AI,” Galdin observed.
This is a classic economic problem: when a signal becomes cheap, it stops working. AI didn't make applicants better; it made them indistinguishable.
Interviews Without Humans
To cope with volume, many employers now automate the interview itself. More than half of U.S. job seekers surveyed by Greenhouse said they had participated in an AI-led interview, often asynchronous, recorded, and scored by algorithms.
These systems promise consistency, but consistency is not neutrality.
Algorithms learn from historical data, and history is biased. Researchers warn that automated interviews can amplify discrimination based on speech patterns, facial expressions, accents, disabilities, or cultural differences. What looks like objectivity can be bias at scale.
“Algorithms can copy and even magnify human biases,” said Djurre Holtrop, who studies AI-driven hiring systems.
The danger is subtle: biased outcomes without biased intent.
The Human Cost of a Machine-Mediated Process
For job seekers, the experience is increasingly alienating.
Jared Looper, an IT project manager in Utah, describes his AI interview as “cold.” He hung up the first time, assuming it was a scam. He now worries about candidates who lack the time, technical literacy, or confidence to optimize themselves for algorithmic judgment.
“Some great people are going to be left behind,” he says.
That concern is not hypothetical. Labor groups argue AI hiring systems disproportionately harm workers from marginalized communities. Liz Shuler, president of the AFL-CIO, calls their use in hiring "unacceptable," warning that qualified workers are filtered out for arbitrary reasons: names, zip codes, or behavioral proxies unrelated to job performance.
Hiring has always been exclusionary. AI risks making that exclusion invisible.
Regulation Arrives, Unevenly
States are beginning to respond. California, Colorado, and Illinois have enacted rules governing AI use in hiring, focusing on transparency, consent, and bias mitigation. Lawsuits are already emerging.
In one high-profile case backed by the ACLU, a deaf woman is suing HireVue, alleging its automated interview system failed to meet accessibility standards. HireVue denies the claim, saying its tools are grounded in behavioral science and designed to reduce bias.
At the federal level, however, the regulatory picture is murky. A recent executive order signed by President Donald Trump threatens to weaken state-level AI oversight, adding uncertainty for employers and workers alike.
Existing anti-discrimination laws still apply, but enforcement struggles to keep pace with technical complexity.
Why Companies Aren’t Winning Either
Employers are not villains in this story. They are overwhelmed.
The market for recruiting technology is projected to reach $3.1 billion this year, driven by promises of efficiency. But many companies quietly admit the tools are not delivering better hires, just faster rejections.
When everyone automates, no one gains advantage. Hiring managers still struggle to assess motivation, adaptability, and cultural fit, the qualities that matter most after onboarding.
AI excels at pattern recognition. Hiring, at its best, is about judgment.
What AI Should and Shouldn’t Do in Hiring
AI is not inherently harmful to recruitment. Used carefully, it can surface overlooked talent, reduce clerical work, and help candidates navigate opaque systems.
The problem is not AI itself. It is substitution without redesign.
When automation replaces human judgment instead of supporting it, hiring becomes brittle. When speed becomes the dominant metric, trust collapses.
The future of hiring should not be humans versus machines but humans with machines, operating under clear rules, accountability, and restraint.
Rebuilding the Hiring Social Contract
Hiring is not merely a transaction. It is how societies allocate opportunity.
If AI continues to dominate recruitment without recalibration, the labor market risks becoming more opaque, less fair, and less effective, all at once.
The irony is hard to miss: in trying to remove friction, we have removed meaning.
Efficiency is valuable. But without empathy, transparency, and accountability, it is hollow. AI hiring is here to stay, but unless companies rethink how and why they use it, it may cost them exactly what they are trying to find: good people.

