Daily AI Briefing for Educators
HigherEd AI Daily
Wednesday, January 8, 2026
Good morning, educators. Today's briefing marks a significant inflection point: AI has moved from a tool institutions debate about to foundational infrastructure reshaping how teaching, learning, and campus operations function. The shift is particularly visible in two major developments: Google's expansion of Gemini capabilities in classrooms, and emerging research suggesting that emotionally intelligent systems are what comes after the AI boom. Both signal that the era of asking "should we use AI?" has ended. The question is now "how do we use it responsibly?"
Today's Focus: From Tools to Infrastructure
Google Scales Gemini in Education: Podcast Creation Now Available
Google launched a new Gemini-powered feature in Google Classroom that allows educators to generate podcast-style audio lessons directly from course materials. The tool became broadly available on January 6, 2026, with full rollout expected within one to three days. Teachers can specify grade level, topic, learning objectives, and audio format (interview-style, conversational discussion, etc.) to customize content for their students.
This expansion matters because it targets a documented shift in how younger learners consume content. Data shows 42% of Gen Z and Millennials' screen-free time is spent listening to podcasts, with 35 million Gen Z listeners each month in the US alone. By embedding this capability into Google Classroom, Google is meeting students where their attention naturally goes: audio-first learning.
However, Google also issued a clear warning: educators must carefully review all AI-generated content before sharing it with students, since errors are possible, and they should adjust content to fit classroom needs and local policies. Access is controlled by account administrators and limited to users aged 18 and above, based on institutional settings. The feature is available on the Google Workspace Education Fundamentals, Standard, and Plus plans.
Worth Noting
Many teachers remain cautious about AI in education despite these new capabilities. Concerns center on student over-reliance on AI for completing schoolwork rather than engaging in deep learning. This tension between capability expansion and pedagogical concern is the defining challenge of 2026.
WGU Labs: The Next Wave Is Emotionally Intelligent AI
Research from WGU Labs, Western Governors University's research arm, reveals that AI has transitioned from novelty to foundational infrastructure for student success, operational management, and institutional competitiveness. More significantly, WGU Labs predicts four major shifts in 2026.
First: Emotionally Intelligent Technology
Institutions will prioritize AI systems designed to deepen human connection rather than replace it. Research from MIT, University of Pittsburgh, and other institutions found that classroom use of AI lowered brain activity and increased student anxiety and confusion. Teachers reported concerns about losing instructional autonomy and human connection. In response, WGU Labs predicts the push for ethically designed, emotionally aware tech will gain momentum in 2026. "The next generation of technology will aim to rebuild what the last era of digital tools too often eroded," said Betheny Gross, director of research at WGU Labs.
Second: More Sophisticated AI-Powered Assistants
Provosts and administrative leaders are using AI to train specialized virtual assistants for specific campus functions. Some institutions are managing extensive operations through these systems; others are focused on narrower functions. The capability gap will widen in 2026, creating competitive advantages for institutions that build sophisticated, well-governed AI support systems.
Third: Rise in Private Education Providers
Alternative program providers and employers are using AI to compete directly with colleges by offering personalized pathways for non-degree credentials. Students—who have embraced large language models more enthusiastically than faculty—may increasingly choose platforms like Coursera's ChatGPT integration. Coursera's recent decision to increase university partner fees signals the company is operating with confidence as a standalone education provider.
Fourth: Deeper Focus on Non-Credentialed Adult Learners
As AI disrupts more jobs, working adults will seek to upskill with new credentials. Barriers to traditional higher education enrollment remain significant. Colleges are responding by building networks of nonprofit partners to enhance education pipelines. Organizations like Jobs for the Future can help learners identify emerging career opportunities and navigate pathways.
Also Today: State-Level AI Education Policy Accelerates
In Tennessee, leaders are actively shaping AI policy across education levels. Local universities continue adopting and amending AI-use policies for students, faculty, and staff. At the K-12 level, the state legislature passed the Teen Social Media and Internet Safety Act with bipartisan support, requiring the Tennessee Department of Education to create middle and high school curricula on evaluating AI-generated content. State Rep. Andrew Farmer has introduced legislation requiring schools to provide instruction on AI access, utilization, and evaluation.
Vanderbilt University researchers are examining how children interact with AI in school, advocating for AI-related curriculum starting in elementary school. At the administrative level, state Rep. Scott Cepicky suggested using AI to speed grading for the Tennessee Comprehensive Assessment Program, while also recommending state audits of the proposed system—a pragmatic acknowledgment that AI automation requires oversight.
The Tennessee Artificial Intelligence Advisory Council released an action plan in November stating: "AI is no longer theoretical; it is already transforming how government delivers services, how businesses operate, and how Tennesseans work and learn. Delay risks ceding competitive advantage and missing opportunities to increase efficiency and improve lives." The plan prioritizes AI-related scholarships, apprenticeships, and training through community colleges and recommends a statewide AI literacy framework.
What This Means for Your Institution in 2026
1. Audit your current AI systems for emotional intelligence and human connection. Does your AI implementation deepen or erode meaningful human interaction? If your chatbots, tutoring systems, or administrative tools are creating anxiety or confusion rather than support, they are working against your institutional mission.
2. Plan for sophisticated AI-powered assistants in priority administrative areas. Rather than adopting AI broadly, select one or two high-friction areas where AI can meaningfully improve operations. Train the system well. Govern it carefully. Measure outcomes. Then expand strategically.
3. Prepare for competition from alternative providers. Students will increasingly have options beyond traditional degree programs. If you offer strong non-degree credentials aligned with labor market demand, you can compete. If your credentials feel disconnected from employment realities, students will go elsewhere.
4. Build partnerships to serve non-traditional adult learners. Do not assume your traditional infrastructure serves working adults seeking to upskill. Partner with community organizations, nonprofits, and employers to create accessible pathways. This is where growth lives in 2026.
5. Get ahead of policy changes. States are moving quickly on AI education policy. Review Tennessee's approach and the federal guidance from NTIA. Your state will follow. Having a governance framework in place before legislation mandates it puts you ahead.
A Final Reflection for Today
The conversation about AI in education has matured dramatically in just three years. We have moved from "Will students cheat with ChatGPT?" to "How do we ensure AI systems strengthen human connection rather than erode it?" That is genuine progress. It signals that institutions are thinking beyond adoption and into impact. The research showing that AI can lower brain activity and increase anxiety is not a reason to abandon AI. It is a reason to be intentional about how we design and deploy it. "Emotionally intelligent AI" sounds like corporate jargon, but it really means this: Does this system make learning better for actual humans, or does it just make operations cheaper for institutions? In 2026, that distinction will matter more than ever.
HigherEd AI Daily
Curated by Dr. Ali Green
Sources: University Business, Economic Times, WGU Labs, Nashville Scene, Benton Institute, Inside Higher Ed, Google Education
Visit AskThePhD.com for governance frameworks, implementation guides, and faculty development resources.
Helping higher education leaders implement AI thoughtfully, govern responsibly, and lead with confidence.