Schools have a role in helping students navigate AI and fake news, Stanford panelists say
As AI increasingly blurs reality online — generating lifelike images and believable misinformation — it’s essential that internet users learn to distinguish fact from fiction and spot reliable sources.
Schools have a unique chance to equip students with the skills to handle today’s digital landscape, said Sam Wineburg, PhD ’89, Margaret Jacks Professor, Emeritus, of Education at Stanford Graduate School of Education (GSE), during the 87th annual Cubberley Lecture on May 21 at Hauck Auditorium.
“Our students are living digital lives,” Wineburg said. “And it's our responsibility to help them navigate that terrain where they're spending so much time.”
A crowd of nearly 300 teachers, students, and community members gathered for the event, which included the first on-campus live recording of the GSE’s School’s In podcast, hosted by Dean Dan Schwartz and Senior Lecturer Denise Pope, ’88, PhD ’99, on the topic of digital literacy. Wineburg served as the expert guest.
“Education must play a role in helping students become critical thinkers who can sort fact from fiction, or engage in honest debate using trustworthy sources,” Schwartz said. “This is something that is now more urgent than ever.”
The live recording was followed by a panel discussion moderated by Pope, and included panelists Alvin Hong Lee ’25, a Stanford political science student; Janine Zacharia, a journalist and lecturer in Stanford’s communication department; Valerie Ziegler, a teacher of history, economics, and politics at Abraham Lincoln High School in San Francisco; and Wineburg.
Pope helped guide the conversation by elevating the concerns and challenges of students, parents, and educators.
“There are parents (and) teachers out there who just want to know, ‘What should we do?’” said Pope, who is also co-founder of Challenge Success, emphasizing the importance of responsibility, ethics, and understanding in navigating the internet.
The need for digital discernment
With attention spans shrinking and online information vast and varied, Wineburg said internet users should spend time verifying a source’s authority before consuming its information, rather than reading first and only later trying to determine whether it’s true.
“We're in an environment of overabundance,” Wineburg said. “The first step of critical thinking is to determine whether the object of thinking is worthy of thinking about.”
Wineburg suggests a source can often be verified in less than 30 seconds by leaving the page or site in question and checking other websites — a process called lateral reading, commonly used by professional fact checkers.
“If you don't know what an organization is, or where its information is coming from,” he explained, “use the power of the internet to gain some context.”
This ability to determine trustworthy sources extends to news, especially since young people are increasingly turning to social media platforms like TikTok and Instagram for information.
“I think that it's an urgent national priority that people understand how credible, fact-based news works and how to identify what you're seeing in the news,” said Zacharia during the panel. Zacharia teaches journalism skills and techniques for making sense of the changing news environment.
“In terms of identifying credible information, if you look at the sources, as opposed to reading everything that captures your attention, or echoes your confirmation bias without caring who they are, we’d all be in a better place,” she said.
Bridging the gap between students’ digital and educational lives
With American teenagers spending an average of eight hours per day on the internet outside of school, Wineburg says that it’s incumbent on educators to create a link between student experiences in and out of the classroom.
“There's no place in the school curriculum for essentially bridging this gap between the lived experience of our students and what we teach in school,” said Wineburg, who is also the co-founder of Digital Inquiry Group, which conducts research and designs lessons for educators.
“What we’re trying to do with the Digital Inquiry Group is to create curricula that can be infused in all of the subjects of the school curriculum,” said Wineburg, whose research has been used to create classroom materials used by over half of American history and social studies teachers.
“This is the only way that we're going to build a bridge between students' experience and how we want them to become informed citizens in a digital society.”
Another way educators can equip students is by becoming familiar with AI, creating discussions around its use, and incorporating it into classrooms.
“I think the first thing is that, as educators, we have to practice what we preach, and use these tools,” said Ziegler, who was a California Teacher of the Year in 2010.
“What's great is that when you put a bunch of students together to ask AI a question and they're all doing it … and they all get different answers on the same topic, at the same time, then you have a conversation,” she said. “You then have this ability to say, why are we all getting different answers? What does that mean? How do we go look at these sources? How do we ask the questions in the right way to get where we want to get?”
Panelists offered another tip for teachers: tap into students’ existing knowledge base.
“I think Gen Z is better equipped than other generations in navigating the complexity of disinformation online,” said Lee, who is founder and executive director of GENup, California’s largest youth-led education policy organization. “But I think by and large, we still really need to hammer in the importance of digital literacy very early on in our public education systems to make sure that we're really addressing this crisis.”
Panelists also suggested that parents and other caretakers have a role.
“Model good behavior,” Zacharia said. “Ask [your children] how they know things … Or ask them to show you the TikTok [they're watching]. ‘Oh, what is that?’ ‘Oh, who is that person?’”
Using AI as a learning tool, not a tool in place of learning
One of the biggest concerns with using AI in schools has been the temptation for students to use AI chatbots to generate answers for them, or to write essays that don’t reflect their actual understanding or skill.
During the Q&A, Ziegler shared a colleague’s story about a student who had been excelling in class assignments until they were prompted to write an essay as part of an exam that did not allow them to use the internet. The student “bombed” the assignment and admitted to previously using AI-generated materials.
“She told the student, well here we are — you can’t write because you haven’t had to,” Ziegler said. “There’s content that’s learned in high school, but there are also skills. And our job as educators is really to equip students with the skills to be successful outside of the classroom.”
Educators can also encourage the thoughtful use of AI without neglecting the act of learning, Wineburg said. "I do not want to convey the sense that we should ban large language models."
Wineburg emphasized that these new technologies have the potential to sharpen our critical reasoning — but only with the right guidance. "The internet allows us, if we know how to use it, to become much smarter than we actually are, but we have to learn how to use this incredible tool that's at our disposal."