
Digital literacy in the AI era (Part 1)
As AI increasingly blurs reality online — generating lifelike images and believable misinformation — it’s essential that internet users learn to distinguish fact from fiction and spot reliable sources.
Sam Wineburg, PhD ’89, Margaret Jacks Professor of Education, Emeritus, at Stanford Graduate School of Education, says that it is incumbent on parents and educators to help students learn to be savvy consumers of information.
“Our students are living digital lives,” said Wineburg. “And it's our responsibility to help them navigate that terrain where they're spending so much time.”
Wineburg joins hosts GSE Dean Dan Schwartz and Senior Lecturer Denise Pope on the first live recording of School’s In as they discuss digital literacy in the school curriculum and the challenges and potential of AI in education.
“There's no place in the school curriculum for essentially bridging this gap between the lived experience of our students and what we teach in school,” said Wineburg, who is also the co-founder of Digital Inquiry Group, which conducts research and designs lessons for educators.
“What we’re trying to do with the Digital Inquiry Group is to create curricula that can be infused in all of the subjects of the school curriculum,” said Wineburg. “This is the only way that we're going to build a bridge between students' experience and how we want them to become informed citizens in a digital society.”
Learn more about our LIVE event and view the event recording on the Cubberley Lecture/School's In LIVE event page.
Sam Wineburg (00:00):
Our students are living digital lives, and it's our responsibility to help them navigate that terrain where they're spending so much time.
Denise Pope (00:12):
Welcome to School's In, your go-to podcast for cutting-edge insights in learning.
(00:18):
From early education to lifelong development, we dive into trends, innovations, and challenges facing learners of all ages.
(00:28):
I'm Denise Pope, Senior Lecturer at Stanford's Graduate School of Education and co-founder of Challenge Success.
Dan Schwartz (00:36):
And I'm Dan Schwartz. I'm the Dean of the Graduate School of Education and the Faculty Director of the Stanford Accelerator for Learning.
Denise Pope (00:45):
Together, we bring you expert perspectives and conversations to help you stay curious, inspired, and informed.
(00:55):
Hi, Dan.
Dan Schwartz (00:56):
Dr. Pope, it's good to see you.
Denise Pope (00:58):
It's good to see you too. I'm really excited.
Dan Schwartz (01:00):
Yeah. This is our first live recording in, in front of, uh, Stanford, so it's pretty exciting.
Denise Pope (01:05):
Yes. It is.
Dan Schwartz (01:07):
So, uh, it's digital literacy.
Denise Pope (01:08):
Ooh.
Dan Schwartz (01:09):
And, uh, I don't, I don't know what it is, so...
Denise Pope (01:13):
(laughs)
Dan Schwartz (01:13):
So...
Denise Pope (01:13):
Well, you're in the... You've come to the right place.
Dan Schwartz (01:14):
W- so here's my, here, here's my guess. I listen to Jane Austen digitally in my car, and so it's like, it's literature, which must have something to do with literacy, and it's digital.
Denise Pope (01:25):
Yeah. A-
Dan Schwartz (01:26):
Am I, am I there?
Denise Pope (01:27):
No. No.
Dan Schwartz (01:27):
No?
Denise Pope (01:28):
I... No. Uh, first of all, do you really listen to Jane Austen in your car? I don't buy it. I don't buy it. Do you?
Dan Schwartz (01:35):
I was a good humanities student.
Denise Pope (01:36):
Okay. Yeah, yeah, yeah.
Dan Schwartz (01:38):
I was good. But my, my minor was English in college.
Denise Pope (01:40):
You-
Dan Schwartz (01:40):
You didn't know that?
Denise Pope (01:40):
I... You know what? I did know that your minor was... And your major was?
Dan Schwartz (01:45):
God, I forget. It j- it must have been important. I was, it was philos-
Denise Pope (01:46):
It was philosophy.
Dan Schwartz (01:46):
It was philosophy.
Denise Pope (01:48):
I know.
Dan Schwartz (01:49):
Yes, it was philosophy. Okay. Okay.
Denise Pope (01:50):
This is a very literate person.
Dan Schwartz (01:51):
Okay. Okay.
Denise Pope (01:51):
Although not digitally literate, I think, as we've just proven.
Dan Schwartz (01:53):
Yeah.
Denise Pope (01:53):
So it's a really good thing that we have an expert here today. So Sam, thank you so much for returning to the pod and especially in front of this live audience. And I wanted to get us started with just, what is digital literacy and how do we, how do we know when someone is actually digitally literate?
Sam Wineburg (02:13):
Well, thank you. First of all, thank you for having me. It's an honor to be back, be, to be back on the campus. So what is digital literacy? On, on, on the most basic level, it's the ability to go online and discern what is true from what is sham. So a couple ex- a couple of examples were, uh, in Springfield, Ohio, were Haitians, uh, grabbing their neighbor's cats and stringing them up and barbecuing them? Um, this is a claim that ultimately made its way to 67 million Americans. And so the ability to look at something like that and decide, no, I think I better check that out.
(02:55):
Now, let me say at the, in the same breath, more recently what's gone viral is the claim that Elon Musk turned the election using Starlink for Trump. And that actually, actually the winner was Harris. So again, this is the kind of thing that can be established if you know what to do. But if you don't know what to do, you become part of spread- spreading digital pollution. And so that's j- at a, at a basic level, given how much time and how dependent we are on our digital devices, it's the ability to have confidence in information about whether to believe it or whether to kind of hold off and say, "Mm, probably not."
Dan Schwartz (03:40):
So this, so this is really tangential. I love the expression digital pollution. Do we know, like, what percentage of the internet is digital pollution?
Sam Wineburg (03:49):
An awful lot.
Denise Pope (03:50):
(laughs)
Dan Schwartz (03:50):
Like, like, 45%? Like, my, the chances that I find something false are, like, really high?
Sam Wineburg (03:56):
Well, again, we can think of widely believed views that millions and millions of Americans are holding. And so clearly this is not a minor problem. This is a problem that affects us. It affects all of the subjects that we teach in school. There is no area of the curriculum that is not infected, if you will, by the kind of misinformation that is being spewed on a daily, daily basis. Now, let's just think about our youth. Our youth are spending... Today's teenagers are spending on average, stay in your seats, outside of school, eight hours a day online. 39% of teenagers in a recent Pew study said that they are on social media almost constantly. 39%. So our students are living digital lives, and it's our responsibility to help them navigate that terrain where they're spending so much time.
Denise Pope (05:04):
So last time you were on the pod, you did a little test with Dan and I, and we failed. And it was, how do you know... You know, if you, if you're looking at something, what's the first thing you should do to sort of make sure that it's true or make sure that the facts are correct? And the answer, and I took this with me and I do it, is to look at Wikipedia. Is that still the same answer? Is that one of the things that you would say to do? Or have we, are we beyond that now?
Sam Wineburg (05:30):
That's not quite... If you check the podcast, that's not quite the response that I had.
Denise Pope (05:33):
Okay.
(05:35):
See? See? This is what I'm saying. We failed. We failed. We're still failing.
Sam Wineburg (05:37):
So, so, so let me, let me-
Dan Schwartz (05:37):
It's, it's funny. You know, all, all, all, all I, all I remember is that I got the wrong answer. I have no idea what he said.
Denise Pope (05:42):
You didn't even know he... Y- okay, okay.
Dan Schwartz (05:42):
I have no idea what he said.
Denise Pope (05:42):
That's so funny.
Sam Wineburg (05:44):
Let, let, let me, let me provide a small gentle correction.
Denise Pope (05:47):
Thank you. Thank you.
Sam Wineburg (05:48):
What... I asked you a question, if you come across a website... And let me give an example, and you can, folks, take this one at home, theinternationallifescienceinstitute.org, it claims to provide vetted health information that has been approved by a scientific board. You look at this website, you see that there are scientific reports. You see that there's a peer-reviewed journal. And I said, "What do you do when you come to a website like this?" And you started to go into critical thinking and you said, "I will look at the about page. I will look at the scientific advisors." And I said, "Wrong answer."
Denise Pope (06:27):
Yes, I recall.
Sam Wineburg (06:27):
Unless you have all of the time in the day, if you are not already a specialist and you are not up on nutrition information... You have a limited amount of attention. And so what you should do is you should engage in critically ignoring. Rather than critical thinking, you should engage in critical ignoring. If you don't know what something is, what you should do is to use this incredible device that we have, an internet in which sources are electronically linked, and leave that and use the power of the internet to gain some context of what that organization is.
(07:04):
And within a few seconds, if you leave it and open up multiple tabs, a process that we've observed with professional fact-checkers, which we call lateral reading, you can learn through context that this actually is an organization that is funded by big soda, by big candy, by the agribusiness. And so Wikipedia is one among several sites... Particularly, you can go to Wikipedia... Now, again, what does Wikipedia provide? If it's a well-trafficked site, there are references at the bottom. And if you recognize something that there's... To the Wall Street Journal or to AP or to the New York Times, whatever vetted news organization that you can kind of corroborate, that's your first move rather than devoting all of this time on a site for which you are not an expert about the topic.
Denise Pope (07:53):
So true. So true. And I have used that, by the way. I, I... It's a, it's a great tool. And I have done this, I've done my lateral reading, so thank you.
Sam Wineburg (08:00):
You get an A.
Denise Pope (08:01):
Thank you, Sam. Thank you.
Dan Schwartz (08:02):
So, so, but I, I-
Denise Pope (08:02):
I try for A's.
Dan Schwartz (08:03):
I d- I think I do exactly the opposite.
(08:06):
Right? So I, I look up something... I get, I get something from the doctor, and they, some result, and I started looking it up, and I'm basically searching for anything that confirms my belief that I'm about to die.
Audience (08:18):
(laughs)
Dan Schwartz (08:19):
Right? And, and as opposed to, like, saying the, choosing the credibility of the source.
Sam Wineburg (08:25):
Well, I mean, it, it...
Dan Schwartz (08:26):
Sorry. Sorry, Sam. (laughs)
Denise Pope (08:27):
There's, there's no help. There's-
Sam Wineburg (08:27):
Dan, Dan, that's, that's dark. I'm not going there.
Denise Pope (08:29):
No, there's no help for that. There's no help for that. Okay. But no, so now we have this little wrinkle, folks. We have AI. And I have fallen prey to believing a made-up video. I have listened to a voice on the phone that I thought was a famous person, turns out it wasn't. And I like to think that I'm sort of... You know, I'd like to think that I have a brain and can critically think my way outta this, but AI is really, really convincing. So how has this changed what you teach people to do, if at all?
Sam Wineburg (09:02):
Well, first of all, let's recognize that the, the ground is shifting under our feet at this very moment. So anything that I say and respond at this moment probably will not be valid in a month from now.
Denise Pope (09:12):
Fair.
Sam Wineburg (09:13):
And it might even be less time than that. Sure. I mean, our lives are about to be radically transformed by AI, if not already. And the teachers among us and the students among us know that you are using it whether it is approved or not. It has implicated itself into our lives. Any student, uh, certainly is using it. And there is an enormous crisis right now, a crisis of confidence in education. And so one of the things that's happening, particularly with the development of our ability to express ourselves in writing.
(09:49):
Now, this is a basic cognitive ability, right? The, the, the famous polymath, uh, Nobel Prize winner Herbert Simon said that what creates learning is what a student does or a student thinks. And if you use AI and you put a list of points into ChatGPT and it writes the essay for you, then what actually has happened? It's like, if we could come up with a machine that bench presses 300 pounds, um, why not sit in the corner of the gym and drink a smoothie while we're watching that?
Dan Schwartz (10:21):
(laughs)
Audience (10:21):
(laughs)
Sam Wineburg (10:22):
Um, but nothing really accrues if we have something doing the cognitive work for us, so that's clearly a problem. But in the same breath, there are huge, huge potentials of AI, and we have to be extremely cautious. Now, again, before we leave some of the, some of the challenges, let's just talk about the point that you raise, which is how convincing it is. So recently I asked, uh, I asked Chat about a topic that I know something about and that I've written about, which is this whole kind of question of did the atomic bomb have to be dropped in order to end World War II? This is a deep moral value, hard historiographic issue that's hard to contend with. But there is supposedly a quote saying that the Japanese were ready to surrender. And I asked Chat, "Please verify this quote." And it came back with this exact quote and said, "You can find this in the Japanese communications." And actually, I know that not to be the case. And so I confronted Chat and I, I actually, I had to print it out because its response is so precious.
Audience (11:38):
(laughs)
Sam Wineburg (11:38):
So, so indulge me for a second while I read it to you. "You raise a very valid, important point. My demonstrated tendency to misrepresent information, particularly through inappropriate quotations, raises serious concerns about my suitability as a tool for high school history education."
Audience (11:58):
(laughs)
Sam Wineburg (12:00):
Now, we can use anecdotes like this, and yes, it is prone to errors. It can make up things. Is it better to go AI, to go to, to ask a chatbot about a, a historical fact or to consult a textbook? At this point, probably, I'm gonna go with a textbook. But in the same breath, there are unbelievable potentials for harnessing AI to develop the kind of intellectual capabilities that school since time immemorial has been trying to develop.
Dan Schwartz (12:30):
So I, so I have, uh, uh, an honest question. So a, a number of my colleagues say, uh, the fact that ChatGPT writes essays is gonna destroy critical thinking 'cause the way to learn to do critical thinking is to write essays.
Denise Pope (12:46):
Wait, can we define critical thinking before we even answer that question? Can I ask you to do... Because people always say, "What do we mean by critical thinking?" Sam?
Sam Wineburg (12:53):
Well, I think everyone in the audience can talk about the development of being able to take, uh, disparate ideas and to form them into a coherent whole that is convincing, that's persuasive, that uses evidence. The problem when we start to talk about technology is this automatic tendency to say, "Well, if kids are believing things that aren't true, we need to teach critical thinking." And, you know, we need to... Forget about 21st century skills. Let's go to 4th century B skill-
Denise Pope (13:26):
(laughs)
Sam Wineburg (13:26):
... BC skills with Socrates in the Agora. Now, the thing is that, that Socrates didn't understand search engine optimization, didn't understand-
Dan Schwartz (13:35):
(laughs)
Audience (13:35):
(laughs)
Sam Wineburg (13:35):
... metalanguage, didn't understand algorithmic bias, didn't understand the way that Google arrays search results. There's a great deal of information about this tool that we use, this internet, that all of us are driving on the information highway, and none of us has read the informa- the driver's manual, and we still are going crazy with it. And so there's a lot of knowledge about how the information that we consume comes to us that is, has to be added to our traditional notions of just being able to critically think through a problem. We need to know about the information environment in order to make thoughtful decisions about what to believe.
Denise Pope (14:13):
So teaching students where this... How it works, where it comes from, about hallucinations. What else would you add to that list?
Sam Wineburg (14:23):
Well, again, you know, one of the things that we've seen... So we, we did a large study in 2021 where we had 3,446 high school students across this, across the United States hooked up to the internet and we gave them a series of tasks. And so here's one of the tasks. The students saw a grainy Facebook video that claimed to say that there was widespread vote cheating in the 2016 Democratic primaries. Now, this video was actually shot in Russia, a fact that if you know the right keywords and you open up another tab, you can find. Three students in 3,000 actually made it to the right place where that video was created. Now, there's clearly an issue here.
Denise Pope (15:13):
That's so depressing, Sam.
Sam Wineburg (15:16):
Three students in 3,000. So, you know, we have to do something, and the ways that we're going about it are not the right ways to go about it. Let's go back to the beginning of, of the, of the questions. You asked about digital literacy. Where is digital literacy in the school curriculum? Right now, if you find it, it might be in a single couple lectures by the school librarian, if there is one.
(15:41):
I wanna give you a statistic. We, with the Digital Inquiry Group, we count our downloads. So for our history curriculum last year, we had over a million downloads of our curriculum. We had 67,000 downloads of our digital literacy curriculum. Why? Because there's no place in the school curriculum for essentially bridging this gap between the lived experience of our students and what we teach in school. So what we're trying to do with the, with the Digital Inquiry Group is to create curricula that can be infused in all of the subjects of the school curriculum. This is the only way that we're going to build a bridge between students' experience and what we want them, how we want them to become informed citizens in a digital society.
Denise Pope (16:28):
Love it.
Dan Schwartz (16:30):
Good proposal. Good solution.
Denise Pope (16:35):
Sir, you have a question?
Live Audience Member (16:36):
Question is that you've, you've cited the study from 2021. As fast as all of these technologies are developing, do you worry that the relevance of that study diminishes over time as a result of the changes in the technologies and the speed with which people are beginning to get used to using them in different ways?
Sam Wineburg (16:59):
I would worry about it if the subsequent studies showed a fundamentally different result.
Denise Pope (17:04):
Mm.
Sam Wineburg (17:04):
Unfortunately, to my, to our great dismay, they don't. And so there's m- more recent studies, but I think that that leaves a kind of pessimistic sense that I don't want to hover above this room. What we've also done since 2021 is we have done studies where, uh, in... We did a treatment control, randomized control study in Lincoln, Nebraska public schools. And we showed that in less than six hours of instruction, students grew in, at, at a rate of 40% in their ability to make thoughtful choices about what to believe. Six hours is two hours less than the average amount of time a teenager spends online in one day.
Denise Pope (17:47):
Yeah.
Sam Wineburg (17:47):
So just imagine what could happen if this was put into the warp and woof of the way that, of, of the curricula tha- that we have. So again, we can move the needle. These things are not intransigent, they're not things that can't be taught. But they need to be taught.
Dan Schwartz (18:07):
So here, here's one I think people need to learn.
Denise Pope (18:09):
Okay.
Dan Schwartz (18:09):
Uh, so the new thing about the AI is it's really easy to make stuff, and it'll be very easy for students to generate fake news, so to speak. And so you need to teach them why you might not want to do that. Teach 'em to think about responsibility, ethics, right? So that, that seems different to me, the ability to, for the students themselves to produce things that would show up on the internet. Could...
Sam Wineburg (18:32):
I... No, I, I agree. I think that's, I think that's, that's part and parcel of it, of course. Absolutely.
Denise Pope (18:36):
Because we do such a great job teaching ethics already.
Dan Schwartz (18:41):
(laughs)
Denise Pope (18:41):
Right?
Dan Schwartz (18:42):
Right.
Denise Pope (18:43):
I mean, now we need a whole other curriculum.
Dan Schwartz (18:45):
So, so you guys are proposing we get rid of trigonometry so we can make space for this? Is that the...
Denise Pope (18:50):
No, I don't, I don't think it's an either or at all. But I do, I d-... You know, there are parents out there, there are teachers out there who just wanna know what should we do? I think downloading Sam's curriculum, great idea, right? I think teaching responsibility and making ri- good choices and not posting deep fakes and pornography and blah, blah, blah, whatever that's called now with the cyber sex. Um, great idea. Don't do that. Right? But I think we need more. Like I get... I'm getting duped.
Dan Schwartz (19:16):
No.
Denise Pope (19:16):
(laughs) Help me. Help me.
Dan Schwartz (19:16):
No.
Sam Wineburg (19:20):
So, so-
Dan Schwartz (19:20):
Wikipedia.
Sam Wineburg (19:22):
Dan, Dan, you said eliminate trigonometry. No, that's exactly not the way to think about it.
Dan Schwartz (19:27):
Okay.
Sam Wineburg (19:27):
The way to think about it is that we c-... It's not either or, it's both and. So, a, a, a, a student is, comes across, uh, the, uh, uh... W- what are the browsers of choice for gen, for this, for students in high school at this point?
Denise Pope (19:42):
Oh I know the answer to that.
Sam Wineburg (19:42):
They-
Denise Pope (19:44):
I asked that to, to a bunch of kids.
Sam Wineburg (19:46):
Okay.
Denise Pope (19:47):
Uh, the majority of this generation gets their news from, uh, TikTok and, um, uh, Instagram.
Sam Wineburg (19:54):
And one more.
Denise Pope (19:56):
Uh...
Sam Wineburg (19:57):
YouTube.
Denise Pope (19:58):
YouTube.
Sam Wineburg (19:58):
These are visual media. And so you come across a TikTok video claiming, um... Well, again, claiming, uh, that the Holocaust didn't happen. Claiming that there were 20,000 African Americans who were put into concentration camps by Union soldiers in Natchez, Mississippi in 1865. These are heartfelt claims. Now, there's lots of things that happened, particularly in our history, that textbooks don't report. So how do you know what's true? You're teaching the Civil War, you're, and you come across, a student says, "Did you know that hundreds of thousands of African Americans suited up in Confederate Greys and fought on the side of the Confederacy, fought for their own continued enslavement? Um, here's evidence."
(20:50):
Now, this, when you're teaching the Civil War, these are the kin-... How- if you are interested as a, as, as an educator in preparing students not for more school, but for the society in which they live, the information environment that surrounds them, then this is not an either or. That TikTok video has to be brought into the classroom. It has to be interrogated. And we have to teach students how to be able to discern whether that's something to belie- be believed, or whether in that particular instance, that is a lost cause piece of propaganda.
Denise Pope (21:25):
One hundred percent. One hundred percent. Why are you looking at me like that, Dan?
Dan Schwartz (21:30):
Uh, I'm, I'm-
(21:31):
... still trying to figure out what I'm supposed to do when that TikTok video shows up. I think I don't quite have lateral reading. So, like, so I see this, a student brings it, or I see it, I'm not quite sure. You know, my, my intuition is, well, let me rely on my common sense, and then I'll make up some facts that help me believe what I wanna believe. And, and... But you have a technique here that I don't quite know.
Sam Wineburg (21:54):
So, I mean, here's what's, here's... Let's, let's begin with a common thing not to do that is often, often used. A TikTok video of a guy who says, "You know, we ought to ban high school football, or we certainly need to regulate it, because there's all these kinds of problems with concussions." And you listen to it and you say, "Wait a second. I mean, this is fearmongering. This is, you know, kind of raising our heartbeat for something that's really overblown. Who is this dude?" And his name happens to be, uh, Chris Nowinski. And what do you do with that? You say, "Well, I r- it, it, it doesn't seem that convincing."
(22:37):
That's not digital literacy. Digital literacy is to be- is knowing how to use this incredible tool, really, that's at your fingertips, literally at your fingertips, and saying, "Let's find out who this dude is." Now, it happens to be a Harvard-educated Boston University PhD in neuroscience who wrote a book called Head Games that became the basis for the NFL fundamentally rethinking its whole policy toward concussions. And so, as opposed to some random person, like the video I referred to in Natchez, Mississippi, who just sees some kind of things... Actually, that particular claim goes back to a person who says they're an expert who is the head of a society of paranormal activity in Mississippi.
Dan Schwartz (23:23):
(laughs)
Denise Pope (23:23):
There you go.
Sam Wineburg (23:24):
So again, should we believe her? She kind of conjures spirits to know what happened in the past.
Denise Pope (23:30):
Okay, wait, I have a question. This might, this is, could potentially be very embarrassing in front of all these people. I thought that people don't necessarily use their real name. So if I see this person on TikTok, all I'm seeing is a handle, like a made up, like, Miss Piggy kind of handle. And then how do I google to see who the person is? Am I, am I wrong?
Sam Wineburg (23:49):
Well, again, if there's, if it's anonymous and they're talking about Natchez, Mississippi, and they're talking about the Devil's Punchbowl, and they're talking about eight- uh, 1865, then you've got some keywords.
Denise Pope (24:00):
So that's what you meant by keywords.
Sam Wineburg (24:01):
And now, now let's, let's-
Denise Pope (24:02):
I gotcha.
Sam Wineburg (24:03):
... let's talk about AI for a second.
Denise Pope (24:04):
Okay.
Sam Wineburg (24:05):
Let's put this into Chat. Let's put this into Claude. Let's put this into Perplexity. Let's put this into Gemini. And if you s- and you ask for sources, you ask and you give the kind of prompts... Not just say to your chatbot, "Is this true?" But in addition to that prompt, and this is where we have to start to think about the new kind of education that our students need. How do we kind of think like the large language model and prompt it for the kind of solid answer that we want it to produce? So we ask not just, "Is this true?" But provide the sources for your information and preferably provide academic and scholarly sources. You get a very different answer and a much more thorough answer than if you just say, "Is this true?" Now, if you wanna then look at a source, then you go back to the source and say, "Is Breitbart a, uh, uh, uh, a solid source I should believe? Or is, um, something that is published by Stanford University Press more of solid ground to stand on?"
Denise Pope (25:11):
I'm not gonna answer that question. That sounds like a trick question. All right. So, um, tell... You, you're, you're a mom, you're a dad, you're a parent. What do you want the parents to know? Do you want the parents to be sitting with them and saying, "Hey, don't be spreading around TikToks that aren't true"? Do you want the parents to be teaching? Like, what, what's, what's the parent's role in all this?
Sam Wineburg (25:36):
So let's think about a hierarchy of harm.
Denise Pope (25:42):
Okay.
Sam Wineburg (25:44):
So what are the things that are most potentially harmful that our children are doing that we ought to know about? And there was a report, an exposé, in the April 26th Wall Street Journal about, uh, sex bots on Meta, uh, Meta AI, where there are user-generated sex bots. One of them is named Submissive Schoolgirl. And the Wall Street Journal reporter Jeff Horwitz, over a period of months, created scenarios where they, where he played the role of adolescents. And essentially what this chatbot engaged in was providing a menu of sexual and bondage fantasies and drawing adolescents into this kind of world.
(26:38):
So let's just be clear. There i- there are ever present dangers out there that if we don't know what our children are doing, it is, it's, it's not an easy environment to be in. So let's talk about that for a second with AI. Now, if our student is circumventing all of the assignments in school and using ChatGPT for them, then we need a kind of heart-to-heart, come-to-Jesus talk about what's the purpose of school and how do you develop that muscle? And do you wanna kinda send a robotic chat ch- chat into, into the gym to do the, the, the bench presses for you? And what will happen if you do that? Will you grow? And so there are a lot of things that we need to do as parents. Uh-
Dan Schwartz (27:28):
Sam. Can, could you make a TikTok video of that speech?
Denise Pope (27:31):
(laughs)
Dan Schwartz (27:32):
I think it would go viral to everybody who wants to be able to make that speech to their kids.
Sam Wineburg (27:37):
Onl- only if people then look up who I am-
Dan Schwartz (27:40):
(laughs)
Denise Pope (27:40):
(laughs) Ah.
Audience (27:40):
(laughs)
Sam Wineburg (27:40):
... and whether I am qualified-
Denise Pope (27:41):
That's so good. That's so right.
Sam Wineburg (27:43):
... to give this particular opinion.
Denise Pope (27:44):
That's so right. That's so right.
Dan Schwartz (27:45):
Very good.
Denise Pope (27:45):
Oh my god. Well, we could go on forever, uh, with this kind of talk, but we do have a panel, uh, coming up after this, so we could go on forever. Dan, I always put you on the spot for some takeaways for our audience.
Dan Schwartz (28:00):
So it, it's a little surprising. Digital literacy, if I've got it right, is knowing to track the source, figure out where it came from, which is quite different than sort of understanding different genres of digital media.
Denise Pope (28:12):
Yes. Hundred percent. Sam?
Sam Wineburg (28:16):
The takeaway that I want all of us to have is that our eyes deceive us. That there are, there are forces out there that wanna dupe us, that, that have a lot of money behind them, and to... If something looks good, the classic, if it quacks like a duck, if it walks like a duck, if it smells like a duck, it might be a duck, but it might not be. So the idea... We are called upon to make all kinds of decisions as a citizen, for things for which we lack the requisite background knowledge. Friedrich Hayek, the economist, said that the mark of a developed situation is being able to benefit from knowledge we don't possess. Now, the internet allows us, if we know how to use it, to be much smarter than we actually are. But we have to know how to use this incredible tool that's at our disposal.
Denise Pope (29:15):
I... Sam, it's so important. Thank you so much for that. I, I'm, I'm gonna leave here walking away not just with my Wikipedia from last time, but with this knowledge that I am- I can't possibly be an expert in everything I'm going to, uh, see. And so I've gotta do my homework, legit, to figure out what's right and what isn't right, and to really go deep and check the sources. And that means, that's time and energy. But my gosh, if we don't do that, we're gonna end up being duped as, as, uh, you know, as we see many, many people being.
Sam Wineburg (29:47):
Can I alla- allay your fears for a second?
Denise Pope (29:49):
Yes, please.
Sam Wineburg (29:49):
And go back to the example that I started off with, the International Life Science Institute. Denise, 30 seconds. 30 seconds. I mean, we tend to think of sources as in an environment of scarcity. We're in a very different environment. We're in an environment of overabundance. And so our critical thinking ability, the first step of critically thinking is to determine whether the object of thinking is worthy of thinking about. And in the case of, of so many things, establishing whether you should stay on that website or whether it's better to find, quickly find a- another source is a half a minute of your time if you know what to do.
Denise Pope (30:34):
I love that. Sam, that gives me hope. You've just given me hope in this crazy world. I thought this might be a depressing topic, but actually I leave here very hopeful. So Sam, thank you. Thank you for being on the show again, uh, I would love to have you back. Thank all of you for joining in this episode of School's In. It was so much fun to do. Be sure to subscribe to the show on Apple Podcasts, on Spotify, or wherever you tune in. I'm Denise Pope.
Dan Schwartz (31:00):
I'm Dan who does not have enough time Schwartz.
Denise Pope (31:02):
(laughs) But that was allayed, Dan.
Dan Schwartz (31:03):
(laughs)
Denise Pope (31:04):
We allayed your fears. 30 seconds. 30 seconds for the truth. Thanks again, folks. Really appreciate it.