
Is AI making humanity a premium feature?

Writer: Thomas Thurston

A friend asked me this question recently, and I had no answer.


"What if artificial intelligence could replace university professors in 90% of college classrooms?"


I wasn't sure what I thought. Could be good, could be bad. Why did this question feel so difficult?


The economics seemed straightforward enough. Picture an AI that understands each student's learning style, pace and preferences in ways no human instructor managing thirty undergrads ever could. It could personalize every explanation, repeat itself without exhaustion, stay current with the latest research and do all of this at a fraction of the cost that's currently bankrupting American students and their families. Tuition has increased by more than 200% since 1980, adjusted for inflation.[¹] Students now graduate with an average of $43,000 in debt.[²] If AI could deliver better learning outcomes at lower cost, the answer should be obvious.


It wasn't obvious to me.


What Are We Really Buying?


Let me tell you about a different kind of purchase.


In 2010, a writer named Michael Lewis published a book about the 2008 financial crisis called "The Big Short."[³] It told the story of a handful of investors who saw the housing collapse coming and bet against the entire American economy. The book became a phenomenon, then it became a movie.


If you bought that book, you paid something like $28.95 for what seemed like a simple transaction: one price, one product. Except it wasn't one product at all.


You got the facts about the crisis (the timeline, the mechanics of mortgage-backed securities and credit default swaps, the names and dates). You got Lewis's analysis of why it happened, his theory of the case. You got his storytelling, and he has this gift for making bond traders interesting, which is not easy to do. You got his particular sensibility, his eye for the absurd detail, his way of seeing the world.


All of this came bundled together. You couldn't buy just the facts or just the storytelling or just Lewis's perspective. The market didn't offer those options because Michael Lewis was a human being, and human beings produce all of these things at once.


Now imagine something different. Imagine you could get a clear, accurate summary of the 2008 financial crisis from an AI. All the essential facts, well organized, perfectly current, delivered in minutes, free or nearly free.


This raises a question: if you can get all the data for free elsewhere, what are you actually paying Michael Lewis for?


The Milkshake Problem


In the late 1990s, a fast food chain came to Harvard Business School professor Clayton Christensen with a puzzle. They wanted to sell more milkshakes and had tried everything: making them thicker, sweeter, cheaper. Nothing worked.


Christensen's colleague sat in the restaurant for hours watching people buy milkshakes and noticed something odd. Most purchases happened before 8 a.m., and the customers were alone. They bought the shake and left immediately.


When the researchers interviewed these early morning customers, they discovered the milkshake wasn't competing with other milkshakes at all. It was competing with bagels, bananas and boredom. People were hiring it to make their commute less tedious. They needed something that would last the whole drive, something they could consume with one hand while navigating traffic, something more interesting than nothing.[⁴]


The same milkshake sold at 3 p.m. was doing a completely different job: helping a parent pacify a restless child. Same product, different purpose.


Christensen called this "jobs to be done." People don't buy products, he argued. They hire them to solve problems in specific contexts. Seen through that lens, a single product or service could be doing many jobs for you at once, and you may value each of those jobs differently.


When The Bundle Comes Apart


Now apply this lens to my friend's question about professors.


A college education is doing at least five separate jobs simultaneously. There's content delivery (learning facts, theories, formulas). There's credentialing (getting a degree that signals competence to employers). There's socialization (building networks). There's identity formation (figuring out who you are). And then there's something else, something harder to articulate.


Imagine two students. The first reads every economics textbook, completes every problem set, masters every concept entirely on their own over four years. They qualify for an economics degree. The second studies under the close guidance of a Nobel laureate economist. Daily conversations, constant feedback, four years of apprenticeship. They also earn an economics degree.


Same credential, same content mastered, yet we'd value these people differently. Why? The apprentice gained something that can't be written down: judgment about which questions matter, intuition about when theories break down, ways of thinking about problems that resist codification. The tacit knowledge that passes between humans when they work closely together over time.


Right now, these jobs are bundled together. You pay tuition and get professors, content, credential, campus experience. Everything together because the market offers no alternatives. AI changes this in a fundamental way.


AI can deliver content brilliantly, personalized to each student's pace and learning style, adapted in real time, current and comprehensive, at a fraction of the cost. That job can be unbundled. What AI cannot do, or what we sense it cannot do in the same way, is the human transmission. The ineffable thing that happens when one person teaches another person how to think.


This is why my friend's question felt impossible to answer. Replacing professors sounds efficient until you realize that professors were doing jobs you didn't know you were paying for.


Just An Old Fashioned Love Song


This raises a more fundamental question about what AI can and cannot replace. Here's a way to think about it.


You need a user manual for your car's infotainment system. Which button does what, how to pair your phone. Do you care who wrote it? Do you care whether they brought their unique human sensibility to explaining touchscreen controls? You don't. You want information, and an AI-generated manual would be fine, probably better actually.


Now imagine it's your anniversary. You're making dinner and you want to play a love song. An AI could generate something that sounds note-for-note identical to any classic you can name. The melody could be perfect, the harmonies flawless, the production indistinguishable from anything recorded by a human artist.


Would you play it?


Many people wouldn't, and the reason reveals something important. What you're hiring that love song to do isn't to deliver information about what love is or what love feels like. You don't need data about romance. What you need on your anniversary is connection to another human being's emotional experience. You need to know that someone else felt what you've felt. The ache, the longing, the joy of being seen by another person. They felt it deeply enough to write it down, to share it, to make themselves vulnerable by admitting these feelings exist. When you hear that song, you're connecting to their experience because you've had that experience too. That shared humanity is what gives the song its meaning, what makes it valuable to you and your partner on your anniversary.


The AI-generated version might be technically identical, but it's not a human experience translated into music. It's a prediction of what notes should come next based on patterns in training data. There's no felt experience behind it, no vulnerability, no human being who actually lived through something and tried to make sense of it. An AI hasn't loved someone and worried about losing them. It hasn't felt its heart race when someone walked into a room. The source isn't just part of the product. The source is the product.


This distinction matters everywhere AI touches. Sometimes you're hiring content to deliver information (the user manual). The source is irrelevant. Sometimes you're hiring content to connect you to human experience (the love song). The source is everything.


What Unbundled Education Looks Like


So here's how you might reimagine a university. Students master technical content through AI. Economic theory, mathematical proofs, historical timelines. The AI adapts to each student's needs, tests comprehension, fills gaps, never gets tired or frustrated. This costs relatively little.


Simultaneously, students pay separately for intensive human mentorship, not in content delivery but in something else entirely: rhetoric. The word sounds ancient because it is. Aristotle defined it as "the faculty of observing in any given case the available means of persuasion."[⁵] It's the deeply human skill of figuring out how to communicate ideas effectively to other humans.


How to construct an argument, how to organize ideas for different audiences, how to anticipate objections, how to choose words that move people, how to deliver them with conviction. These skills resist automation, and more importantly, they're what make knowledge useful. Understanding economic theory means nothing if you can't explain to a skeptical board why their strategy will fail, if you can't frame data to persuade someone who disagrees, if you can't communicate complex ideas to non-experts.


All those professors who currently spend their time explaining supply and demand curves (which AI now does better) could focus entirely on what machines cannot do: running seminars where students debate ideas, coaching them on presenting to hostile audiences, teaching them to think critically, argue persuasively, communicate clearly.


The result could be excellent at both jobs: content mastery through AI's tireless personalization, human formation through intensive mentorship focused on irreducibly human skills. The total cost drops while the quality of human interaction improves because experts focus entirely on what only they can provide.


This same logic applies everywhere. Organizations could let employees master training content through AI, then pay for human coaching only when applying it to specific contexts. You'd pay for what you actually need instead of paying for a bundle that does everything adequately but nothing excellently.


The Invisible Made Visible


There's something larger happening here. For all of human history, human creation was invisible because everything was made by humans. We didn't value "human authorship" as a separate feature because there was no alternative. You couldn't choose between a human-written book and an AI-written book because there was only one kind.


AI makes human origin visible by providing a contrast. It creates a market where "made by humans" becomes a feature you can select or reject, and this is profoundly uncomfortable. It forces questions we never had to ask before. Which parts of this education require a human? Which parts of this book am I really paying for? Which parts of this job could a machine do equally well?


The discomfort is necessary because we're discovering what we truly value about human work. For the first time, we have to choose it consciously.


For this to work, though, we need transparency. If you buy music thinking a person wrote it about their lived experience, then discover an algorithm generated it, you've been deceived. You paid for one type of value and received another. Several states have started requiring AI to identify itself,[⁶] and the instinct is correct. This is truth in advertising for the age of AI. When people can distinguish the source, they can make informed choices. Some will choose AI for speed and cost, others will choose human creation for its ineffable qualities, and many will choose differently depending on context.


Finding Our Worth


The implications depend on where you sit. Universities that bundled weak teaching with strong credentials will face pressure to justify both separately. Authors who relied on having information before others will find AI-distributed information has less value than their unique perspective. Teachers who primarily delivered content will need to become mentors.


For individuals, the shift is personal. Content knowledge no longer differentiates you when AI has read everything humans ever wrote. What remains valuable are capabilities that resist codification: judgment in ambiguous situations, emotional intelligence, creative synthesis, the ability to build trust with other humans.


So when my friend asked whether AI could replace 90% of professors, the answer is both yes and no. Yes, AI can replace the content delivery part, which is maybe 90% of what happens in some lecture halls right now. No, AI cannot replace what we actually value most about human teachers, at their best. The problem isn't just that we've been paying for both things together. It's that we've been overpaying for content delivery that AI now does better and cheaper, while underpaying for human mentorship that deserves a premium when done well. The bundle forces both into a compromise. Neither gets optimized for what it should actually do.


The pieces are about to be priced separately. Bundles are coming apart, and we're learning what we're willing to pay for and why. It's an uncomfortable process, but it's also clarifying. AI isn't destroying the value of human work. It's forcing us to articulate what that value actually is.


We're about to find out.




Endnotes


[¹] After adjusting for inflation, college tuition at four-year institutions increased 197.4% between 1963 and 2024. Since 1980 specifically, tuition and fees increased over 200% when adjusted for inflation. Source: Education Data Initiative, "College Tuition Inflation [2025]: Rate Increase Statistics," September 9, 2024.


[²] The average student loan debt per borrower is approximately $42,673 as of 2025. Sources: Education Data Initiative, "Student Loan Debt Statistics [2025]: Average + Total Debt," August 8, 2025; BestColleges, "Average U.S. Student Loan Debt: 2025 Statistics," August 29, 2025.


[³] Michael Lewis, The Big Short: Inside the Doomsday Machine (New York: W. W. Norton & Company, 2010).


[⁴] Clayton Christensen's Jobs to Be Done theory, first articulated in his 2003 book The Innovator's Solution and elaborated in his 2016 Harvard Business Review article "Know Your Customers' Jobs to Be Done," posits that customers don't buy products or services; they "hire" them to accomplish specific progress in particular circumstances.


[⁵] Aristotle, Rhetoric, Book I, Part 2. Translation: W. Rhys Roberts. The Internet Classics Archive, MIT.


[⁶] California passed the AI Transparency Act in September 2024, requiring AI providers to include watermarks and make detection tools available starting in 2026. Utah enacted the first comprehensive AI disclosure law in 2024. Several other states have introduced similar legislation. Social media platforms including Meta, YouTube and TikTok have established their own AI disclosure requirements. Sources: California AI Transparency Act (signed September 2024); Utah Artificial Intelligence Policy Act (effective May 2024); The Blacklist Alliance analysis of state AI disclosure laws, May 2025.

