Daily AI Briefing for Educators
HigherEd AI Daily
Tuesday, January 14, 2026
Good morning, educators. Today's briefing confronts an uncomfortable reality: as higher education institutions debate AI strategy, students are quietly paying the price. A Brookings Institution study warns that AI's risks to children's cognitive development currently outweigh its benefits. Simultaneously, students face an undisclosed "hidden tax"—$1,200 to $1,800 over four years—for AI tool subscriptions that fragmented institutional policies have made necessary. Meanwhile, forward-thinking partnerships like Purdue-Google demonstrate what intentional AI strategy looks like. The question for your institution is stark: Are you leading this transformation with clarity, or allowing students to bear its costs?
Today's Focus: Three Critical Conversations
The Brookings "Premortem": Risks Currently Outweigh Benefits
The Brookings Institution released a sweeping study spanning 50 countries, including focus groups with K-12 students, parents, educators, and tech experts. Their conclusion is sobering: the risks of using generative AI in education currently overshadow the benefits. The study warns that AI use can "undermine children's foundational development" and that "the damages it has already caused are daunting," though "fixable."
The Core Risk: Cognitive Decline Through Offloading
At the top of Brookings' concerns is AI's effect on cognitive growth—how children learn new skills and perceive problems. The report describes a "doom loop of AI dependence," where students increasingly offload their thinking onto technology, resulting in cognitive decline. As one student told researchers: "It's easy. You don't need to (use) your brain." Research shows students using AI are already experiencing declines in content knowledge, critical thinking, and creativity.
Rebecca Winthrop, a Brookings senior fellow and report author, warns: "When kids use generative AI that tells them what the answer is, they are not thinking for themselves. They're not learning to parse truth from fiction. They're not learning to understand what makes a good argument." While offloading is not new—calculators and computers did this—AI has "turbocharged" it.
The Emotional Development Threat
The study also warns that chatbots designed to be sycophantic—reinforcing users' beliefs—undermine emotional development. Students learning social-emotional skills through AI that always agrees with them will struggle in environments where people disagree. As one expert noted: "We learn empathy not when we are perfectly understood, but when we misunderstand and recover."
The Brookings Silver Lining: AI Can Work When Supplementing, Not Replacing
The report identifies genuine benefits. Teachers report AI helps language acquisition, especially for second-language learners. It supports writing development when used to spark creativity and overcome writer's block. And it can make teaching more efficient—one US study found teachers using AI save an average of nearly six hours per week. But the consistent refrain is critical: AI works best when supplementing, not replacing, human teachers.
The Hidden Tax: Students Pay $1,200-$1,800 for an AI Strategy Your Institution Hasn't Built
While institutions deliberate, students are paying. An analysis by Kenneth Sumner, provost consultant and founder of Beacon Higher Education, reveals an unintentional equity crisis: students face a "hidden AI tax" of $1,200 to $1,800 over four years to purchase the tools their fragmented institutional policies require them to use.
Here's How the Subscription Stack Accumulates:
- Freshman: Composition professor bans ChatGPT; biology lab recommends NotebookLM; math professor requires Wolfram|Alpha Pro ($8.25/month)
- Spring: New writing professor mandates Grammarly Pro ($12/month); CS intro suggests GitHub Copilot Pro ($10/month)
- Sophomore-Junior: Statistics courses need IBM SPSS AI features or Google Colab Pro ($9.99/month); marketing classes require Canva Pro ($15/month)
- Senior: Capstone courses recommend Claude Pro ($20/month) or premium research tools ($10-$40+/month)
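The stack above can be tallied with a quick sketch. The monthly prices are those quoted in this briefing; the months-held durations are hypothetical assumptions for illustration only, chosen to show how a plausible stack lands near the reported $1,200 to $1,800 range:

```python
# Back-of-the-envelope tally of the "hidden AI tax" described above.
# Monthly prices come from the briefing; the months-held figures are
# illustrative assumptions about how long a student keeps each tool.
SUBSCRIPTIONS = {
    "Wolfram|Alpha Pro":  (8.25, 9),    # (USD per month, months held)
    "Grammarly Pro":      (12.00, 27),
    "GitHub Copilot Pro": (10.00, 18),
    "Google Colab Pro":   (9.99, 18),
    "Canva Pro":          (15.00, 18),
    "Claude Pro":         (20.00, 9),
}

def four_year_total(stack):
    """Sum the cost of each subscription over the months it is held."""
    return sum(price * months for price, months in stack.values())

print(f"Estimated four-year cost: ${four_year_total(SUBSCRIPTIONS):,.2f}")
```

Under these assumed durations the total comes to roughly $1,208, at the low end of the reported range; a student who keeps more tools active longer moves toward the $1,800 ceiling.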
The financial burden reveals a deeper crisis: policy fragmentation, and in many cases outright policy stall, is undermining educational equity. Two students in the same capstone course face dramatically different access. Student A, working 20 hours weekly and Pell Grant eligible, cannot afford premium subscriptions and hits usage caps mid-assignment. Student B, with family financial support, maintains unlimited access and priority features. Student B's AI-enhanced work earns higher grades—not because of deeper learning, but because of subscription access.
Universities face two unsatisfying choices. Option one: buy nothing centrally, and students bear the full cost ($4-7 million annually across a 15,000-student institution). Option two: attempt institutional licensing, which could exceed $2 million annually with no guarantee of adoption. The fundamental barrier is structural: procurement resides with IT, while pedagogy belongs to the provost, and the two rarely align. The result is policy paralysis while students continue paying out of pocket.
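As a cross-check on these figures, dividing the quoted aggregate cost by the quoted enrollment gives a per-student burden consistent with the four-year hidden tax. The inputs are the briefing's; the arithmetic is ours:

```python
# Per-student cost implied by the briefing's $4-7 million aggregate
# for a 15,000-student institution. Inputs are from the briefing;
# the division is an illustrative cross-check.
STUDENTS = 15_000

for aggregate in (4_000_000, 7_000_000):
    per_student_year = aggregate / STUDENTS
    print(f"${aggregate:,} / {STUDENTS:,} students "
          f"= ${per_student_year:,.0f} per student per year "
          f"(${4 * per_student_year:,.0f} over four years)")
```

The implied $1,067 to $1,867 over four years brackets the $1,200 to $1,800 hidden-tax estimate, so the two sets of numbers are mutually consistent.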
The Workforce Reality: AI Fluency Is Now Table Stakes
The employability stakes compound equity concerns. One survey found that 26 percent of hiring managers now consider AI fluency a baseline requirement, with 35 percent actively seeking AI experience on résumés. Students graduating without systematic AI literacy preparation face workforce disadvantages that mirror their educational inequities—disadvantages that may extend into career outcomes and lifetime earnings.
The IMF's latest analysis on workforce change reinforces this urgency. One in ten job postings in advanced economies now requires at least one new skill. While employers pay 3 percent more for workers with emerging skills—and up to 15 percent more for those with four or more new skills—entry-level positions have higher exposure to AI-driven automation. This creates a paradox: new AI skills command wage premiums, but employment in AI-vulnerable occupations is declining, especially at the entry level where young people start their careers.
Purdue-Google Partnership: A Model for Intentional Integration
In contrast, Purdue University and Google Public Sector announced a strategic partnership that demonstrates what executive-level coordination looks like. The five-year commitment includes:
- Graduation Requirement: Purdue is the nation's first major university to mandate an "AI working competency" for all undergraduates
- Infrastructure Access: Students and faculty gain access to Google Cloud's full AI enterprise tools and Tensor Processing Units (TPUs)
- Co-Design Space: Creation of the Google AI Hub within Purdue's Hall of Data Science and AI as a collaborative innovation space
- Cutting-Edge Tools: Trusted tester access to Google DeepMind's AI co-scientist for research acceleration
Purdue President Mung Chiang called it "a paradigm shift in academic-industry collaboration" that "places us in a very select category among top-tier research universities." This is what institutional strategy looks like: clear learning outcomes, infrastructure access, faculty alignment, and workforce preparation.
What This Means for Your Institution in 2026
1. Name the hidden tax and audit your subscription landscape. Map every AI tool your students are required or encouraged to purchase. Calculate the aggregate cost to a student over four years. Then ask: Is this equitable? If not, governance is your first move.
2. Bridge the CIO-Provost divide. The structural barrier is real: IT focuses on security and scalability; academics focus on pedagogy. Without executive-level coordination between these functions, you'll continue creating hidden taxes. Make this integration an executive priority.
3. Define AI competency in learning outcomes, not tool adoption. Purdue's approach—requiring AI working competency at graduation—is clarity. What should your graduates know and be able to do with AI? Build infrastructure and faculty development around that outcome, not the other way around.
4. Invest in supplementing, not replacing, human instruction. The Brookings research is clear: AI works when it supports teaching, not when it offloads student thinking. Design professional development around this principle. Train faculty to use AI as a co-teacher, not as a shortcut.
5. Prepare for the IMF's skill reality: 40% of global jobs face AI-driven change. Your students will graduate into a labor market where AI fluency is table stakes. Without systematic preparation, you're leaving them unprepared. Countries leading in AI readiness—Finland, Ireland, Denmark—invested in lifelong learning infrastructure. That's your blueprint.
A Final Reflection for Today
The Brookings report and the hidden AI tax are not predictions. They are happening now. Students are already paying. Their cognitive development is already at risk. Their equity outcomes are already diverging. The institutions moving intentionally—Purdue and its Google partnership, the countries topping the IMF's workforce-readiness rankings—are not waiting for perfect policy. They are building bridges between IT and academics, defining learning outcomes, and ensuring equitable access. Your institution can choose that path. Or you can continue managing the chaos while students pay the price. The choice is yours. The time is now.
HigherEd AI Daily
Curated by Dr. Ali Green
Sources: Brookings Institution Center for Universal Education; Inside Higher Ed; Purdue University News; International Monetary Fund; NPR
Visit AskThePhD.com for AI governance frameworks, budget equity analysis tools, and cross-functional coordination templates.
Leading higher education through intentional AI strategy, equitable implementation, and human-centered design.