Colleges, corporations, and the credentialing cataclysm
How AI is razing higher ed’s already fracturing edifice…and making way for a whole new “co-intelligent” species
Forget about robots snatching away your prized white collar job.
That’s yesterday’s anxiety, the sort of AI panic you see recycled endlessly on cable news by consultants hawking their “future of work” Substack scribblings.
The real story, the one that should be keeping university presidents, accreditation paper-pushers, and admissions directors up at night gulping Ambien like Tic Tacs, is far more significant – and far more ominous.
Artificial intelligence isn’t primarily threatening jobs. It’s in the process of razing the entire higher education ecosystem that for generations has served as the chief gatekeeper and credentialing apparatus for professional employment.
In other words, AI is poised to do to colleges and universities what Hurricane Katrina did to New Orleans.
Or the asteroid to the dinosaurs.
Let me explain, starting with what the data itself shows.
According to Microsoft’s 2025 analysis, approximately five million white-collar positions - management analysts, customer service representatives, sales engineers - face extinction as AI creates what some economists are calling “the greatest deflationary force in human history.”
Ford CEO Jim Farley warned that AI “will replace literally half of all white-collar workers.” Salesforce CEO Marc Benioff claimed it is already shouldering up to 50 percent of his company’s workload. JPMorgan and Goldman Sachs are harnessing AI so they can employ even fewer people than they do now.
This may or may not be overheated speculative futurism.
Barton Swaim, writing in the Wall Street Journal, compares AI alarmism to the climate-change alarmism of an earlier era:
“A confederation of specialists—climate scientists in one version, Silicon Valley geniuses in another—joins with liberal politicians and nonprofit heads to warn of an impending catastrophe. The only moral response to this new situation, these Olympians tell us, is to transfer authority over large parts of the economy to people like themselves. That they would favor such a transfer under any circumstances, with or without a coming disaster, doesn’t bother the mainstream press, which reports their predictions with credulity and fervor. Meanwhile ordinary people, lacking the specialized knowledge to draw their own conclusions, feel cowed into going along with it all.”
At the same time, the Journal itself reports that America’s largest employers are paring white-collar positions “at an alarming rate.” Data from Challenger, Gray & Christmas show that U.S. employers announced 696,396 layoffs in just the first five months of 2025 - an 80 percent jump from the previous year.
But here’s the punchline, which our entire class of punch-drunk punditry seems not to get.
The problem isn’t that AI is going to plunder people’s livelihoods. The problem is that the entire machinery of higher education, which is designed to prepare people for those livelihoods, has become its own kind of dinosaur, and AI is the asteroid that’s about to strike and ensure its speedy extinction.
Let’s take a look at what the prevailing business model of higher education has evolved into over the past half-century.
Universities are not so much in the business of educating students as of churning out credentials – specimens of parchment that signal to employers that so-and-so sat through so many semester hours of classes and coursework, all the while amassing enough debt to testify to an undying commitment to a career.
The business model was not developed by universities. It was forged and fine-tuned over the decades by employers themselves and their political enablers who pushed higher education in that direction from the outset.
As a recent analysis in the journal Frontiers in Education makes evident, three pillars have long sustained higher education - information transmission through lectures, standardized assessment as the demonstration of mastery, and degrees as a monopoly on the legitimation that comes with credentialing.
Each of these pillars is now destabilized by AI.
Consider the most adamantine of these three pillars - information transmission. AI tutors and multimodal platforms have made one-way teaching “rapidly redundant.” Students in certain Middle Eastern countries even report preferring AI-driven simulations to passive note-taking.
Standardized assessment? Students openly exploit AI to bypass multiple-choice quizzes and formulaic problem sets. As one professor from Arizona State University has warned bluntly: “if students learn how to use AI to complete assignments and faculty use AI to design courses, assignments, and grade student work, then what is the value of higher education?”
Credentialing monopolies? According to Samar Ahmed, commenting on the sea change currently underway in the United Arab Emirates (UAE), employers “increasingly recognize modular certifications” and “blockchain-based verification systems”, whereas universities “experiment cautiously” with competency-based recognition.
Translation? The seemingly impregnable walls of Fortress Academia have been breached on all sides.
Yet the professoriate – and of course the administrators, the overseers, the recruitment consultants, and the rest – continue to pretend that if instructors just tweak their syllabi to incorporate a few AI-ethics or supervised-research modules, or roll out new branding strategies that sprinkle AI willy-nilly across the curriculum, post-secondary learning will somehow have “adapted” to the brave new world we find ourselves in.
It won’t have.
American higher education is thoroughly and fatefully configured around scarcity - scarcity of knowledge, scarcity of access, scarcity of credentials.
Universities burgeoned and flourished for so long because they controlled privileged access to those scarce resources. Scarcity is what guarantees elevated price points. And the overwhelming majority of Americans have clung, since days of yore, to the conviction that if you want to succeed financially over a lifetime, you go to college, because then you will possess something that others, in varying measure, do not.
AI liquidates these scarcities virtually overnight.
As Luke Lango, writing in InvestorPlace, quips, “when a company can replace a $120,000-a-year manager with a $20-a-month AI subscription, it’s not a choice; it’s fiduciary duty.”
Americans have already lost confidence in higher education, and they will not regain it any time soon. An NBC News poll from November 2025 found that perceptions of the value of a college degree are “plummeting.” Only 34 percent of Americans now believe colleges have a salutary effect on the country, down from 57 percent a decade ago.
Employer trust in higher education has declined for five consecutive years. With a few exceptions, the professional “career pathways” majors to which students stampeded for generations are turning obsolescent before our eyes, as AI algorithms supplant the very jobs those majors were carefully crafted to train people for.
Furthermore, the accreditation system, which was designed to guarantee that higher ed delivers real value to students and, of course, to ensure that federal student loans did not prop up predatory proprietary schools, is rapidly losing credibility. It has become a Hieronymus Boschian hellscape of bureaucratic self-dealing and conniving camouflaged as quality control.
As the Century Foundation notes, “the public trusts accreditors to be both effective and consistent in their decision-making, but in reality, weak accreditors have permitted many underperforming institutions and programs to slip through the cracks.”
But here’s the plot twist that the apocalypse-mongers fail to see.
The presumed cataclysmic impact on higher education is not necessarily going to leave post-secondary learning in smoking ruins.
Quite the opposite.
What the emerging literature is showing - and what Wharton professor Ethan Mollick has articulated better than anyone else - is that managing AI effectively and productively demands precisely the kinds of skills that traditional liberal arts education has always cultivated.
These are what might be considered the “holy trinity” of genuine AI competency, not just literacy. They come down to critical thinking, contextual reasoning, and ethical judgment.
Moreover, they consist in the capacity to ask the right questions rather than simply accepting the first plausible answer.
The German philosopher Martin Heidegger drove home this point with his famous but cryptic line: “questioning is the piety of thinking”.
Mollick’s concept of “co-intelligence” – the collaborative partnership between human and artificial intelligence – does not scotch the need for human cognition. It raises the premium on distinctly human capabilities.
As Mollick explains, “the best way to work with [AI] is to treat it like a person…at the same time, you have to remember you are dealing with a software process.” Navigating the shoals of sophisticated AI depends on an equally sophisticated capacity for critical judgment.
Of course, that is what higher education could be teaching, if it weren’t so busy defending its credentialing racket.
According to an August 2025 study by the American Association of Colleges and Universities, 93 percent of hiring managers rate “written and oral communications, critical thinking and ethical judgment/decision-making” as the most important abilities they look for in recent college graduates.
Let’s ponder that. Not coding. Not data science.
Critical thinking!
McKinsey projects that by 2030 the demand for social and emotional intelligence in the United States will rise by 14 percent, as employers seek graduates who can “think critically and bring a human touch to complex challenges.”
The 2023 Workplace Learning Report from LinkedIn confirms that management, communication, leadership, and research and analysis remain among the most in-demand competencies across industries. These are precisely the abilities that a liberal arts education develops.
Even tech executives have climbed on the bandwagon.
Many CEOs at major technology companies - the very people building AI systems - hold degrees in history, literature, or the social sciences rather than in technical fields. Their liberal arts backgrounds have afforded them “the tools to guide teams, anticipate social consequences and make decisions that reached well beyond the boundaries of technical expertise.”
The AI revolution will inaugurate, as one education researcher puts it, “a real renaissance for the humanities.”
Why?
Because AI cannot really think. It can only respond.
AI can generate complex answers, but it cannot assess their truthfulness. AI can obey ethical rules, but it remains bereft of values or intentions.
AI can mimic empathy, but it has no lived experience. It can synthesize information, but it cannot determine which problems are worth solving in the first place.
As a recent academic study that examined 640 documents on AI concluded, “critical thinking is conceptualized as a purposeful, evaluative, and self-regulated process that must be preserved despite increasing reliance on AI tools.”
The study identified key challenges including “uncritical dependence on AI, digital literacy disparities, lack of system transparency.” Its solution was both surprising and unsurprising: “inclusive and adaptive instructional frameworks that integrate AI in ways that support critical thinking.”
This is where liberal arts education, done properly and not weighted down with “woke” advocacy, becomes indispensable.
Courses that teach students to critique AI “outputs” and “strengthen their own reasoning and critical thinking skills” are the wave of the future, according to an article about “AI in the Core” at Gonzaga University. Students realize that “AI technologies don’t ‘think’ the way we do and that they don’t have a serious concern for the truth.”
Gonzaga faculty stress that “students need to know when and how to use AI critically and responsibly, how to challenge its destruction of our environment, and how to interrogate and transform its flaws.”
The irony for credential-obsessed administrators and HR managers is that those institutions that will survive the AI transformation are precisely those which have the courage to abandon the pretense of job training and return to their original mission - teaching young minds how to think.
They will no longer focus on teaching students to memorize information that can be googled, or to perform tasks that machines can do, but on cultivating in their charges the talent for exercising sound judgment. It will simply become a matter of teaching them to distinguish truth from automated or meme-themed fabrications.
It will come down to asking questions that AI would, or could, never think to pose.
As David Meerman Scott observes, “AI can summarize data, but it can’t yet decide which problems are worth solving. It can write code, but it can’t imagine entirely new categories of products and services. It can analyze financial markets, but it doesn’t understand the human motivations behind economic decisions.”
Universities that figure this out, and that pivot from credentialing to curating co-intelligence, will not only survive but thrive.
The dinosaurs munching on their swamp grass and watching the asteroid descend have a choice - evolve or perish.
Some will figure out they’re not actually dinosaurs at all.
They’re the new species in “academe” waiting to inherit the earth.


