NEPC Talks Education: An Interview With Faith Boninger and Andrew Liddell About Edtech in K-12 Schools

University of Wisconsin‑Madison Assistant Professor Christopher Saldaña interviews Faith Boninger and Andrew Liddell about the widespread adoption of education technology in schools and its unintended consequences.

Transcript

Please note: This transcript was automatically generated. We have reviewed it to ensure it reflects the original conversation, but we may not have caught every transcription error.

Christopher Saldaña: Hi everyone. I'm Chris Saldaña and this is the National Education Policy Center's Talks Education Podcast. In this month's podcast, we're speaking with Dr. Faith Boninger and Andrew Liddell. Dr. Boninger is the publications manager and co-director of NEPC's Commercialism in Education Research Unit. 

Andrew Liddell is a career federal courts litigator and technology attorney. So for listeners who aren't familiar with how technology is used in schools, what kinds of questions do each of you ask when you do your legal work or research about education technology?

Andrew Liddell: I think a lot of parents really don't understand the degree to which education technology is now being used in schools.

Certainly if you don't have young kids, or if your kids were able to go through school before the pandemic. But now so much of school has moved to one-to-one devices, where every kid gets a computer, and this is starting at an earlier and earlier age. And so it's not uncommon for children as young as kindergarten to be given their own iPad.

Usually an iPad, or sometimes it's a Google Chromebook. And some significant portion of their schooling is done on that device. And so from a legal perspective, what I always want to know is: what programs are you using? What is the evidence that they're safe? What is the evidence that they're effective?

Faith Boninger: Those are excellent questions. And those are the kinds of questions that I ask. I'm curious about what technology is being used in schools, how schools decide what products to adopt, and what the implications are of using ed tech products, in general or in particular, for pedagogy, for school processes and governance, and for student and teacher data privacy.

Christopher Saldaña: If we think about what you both have found in asking these kinds of questions: Andy, in terms of your legal work, what has been the approach taken by school districts and corporations to put forward this use of technology in schools?

Andrew Liddell: It has been far more haphazard than I think most people would be comfortable with. It's not a situation where schools have the capacity to sit down with every single potential vendor and understand their practices, their data practices, their safety practices, and actually have objective, vetted evidence that these companies' representations are true. A lot of times it is just adopt first and ask questions later, or troubleshoot down the line. And schools don't have the resources to do that. Part of the reason that schools are eager to adopt educational technology is because they lack resources.

And so these companies come in and oftentimes sell schools on: oh, this is a great way to save labor and save time, this is a way to personalize education for each student. And schools are so vulnerable to these claims that the same reason they're trying to adopt this technology in the first place is also the reason they're not able to properly vet it.

Christopher Saldaña: Faith, have you found similar claims in your research?

Faith Boninger: Yes, absolutely. And that haphazardness is really a problem and that's why my colleagues and I stress that there needs to be policy that supports schools and districts, because they don't have the power, the manpower, the expertise to do the kind of vetting that's necessary.

Christopher Saldaña: Faith, when you've looked at some of the instances of adoption of technology in schools, what has that looked like? What kinds of claims are being made, and how do they run up against the actual outcomes?

Faith Boninger: Oh, that's a very big question. So the thing is that ed tech now is assumed to be necessary, right?

So there's that underlying piece, that this is what's supposed to be, and the claims have to do with being able to personalize education, being able to save teachers' time, being able to provide a personalized tutor for every child. And it doesn't turn out that way.

There's no real evidence that personalizing education is helpful. Personalizing is a great word, but it doesn't mean making education more personal. It means making it digital, and separating kids so that each one is working on their own device, often a Chromebook, with an algorithm determining what they learn and how they're evaluated.

Teachers devote time to other things, right? They'll devote time to making sure that the kids are focused on the task on the computer. They'll devote time to making sure that everybody is online appropriately, that there aren't any technical problems. It's just a shifting of what they do.

And the notion that tech can provide an individual tutor is also pretty silly. It doesn't work out that way. The claims are much more expansive than the reality.

Christopher Saldaña: So given that these claims are being made in ways that aren't necessarily backed up by the outcomes and what we're actually seeing in schools, Andy, what would you say are the underlying concerns that come with implementing the technology? For example, Google's coming in and saying, we're gonna do all of this for you, and at the same time they're acknowledging that, yeah, there may be some costs, right? There might be some things that we have to deal with while we're going through this implementation.

But given that we know the technology hasn't been as impactful as it claims to be, what are some of the concerns that schools and school districts and parents are raising about the implementation of different types of technologies, like Chromebooks or different kinds of Google apps, in schools?

Andrew Liddell: From the parent perspective, because I think that right now most parents and most school districts are not necessarily on the same page, the concerns relate to privacy and safety. We have these companies that are observing children beginning in kindergarten or sometimes even earlier, and they observe them throughout their entire life on their school computer, which is oftentimes used at home.

They collect an enormous amount of data, which they sometimes disclose in their privacy policies. They share it with countless third parties, which they also often disclose. And so if you think about every single thing you do on a computer, whether it's an intentional, observable action or something that the computer can observe but a person can't, all of that information is being collected, and it's being used in ways that we just don't know. And so there is this level of opacity there that is really concerning. We do know that a lot of these companies purport to make these prediction products that are used for schools, and maybe for others, we don't know.

And you think about products that predict whether a kid is gonna have trouble in school, whether they're at risk of not graduating or not advancing. One of the things that concerns me, and I'm sure Faith can speak more to this, is how those predictions encode existing levels of bias. And so we have a prediction that these kids are unlikely to graduate, but what the algorithm is actually saying is, these kids are probably poor and probably from a racial minority. And so it just continues this cycle of disenfranchisement and exclusion from positive intervention, and that is concerning.

We also have kind of less heady, more immediate concerns about safety. Without speaking to any specific provider: Apple, Google, Microsoft all put their products into schools, and we've had parents reach out about all of them, where their young kids are getting hurt in immediate ways through a school computer that parents thought was safe.

And so we have young children being victims of online enticement, of online sexual abuse. We have kids that are being cyberbullied relentlessly by their peers into self-harm. We have kids that are falling victim to sextortion because their school computer is the only computer their families give them; they don't have a phone, they don't have their own computer. They connect with somebody online, are encouraged to take naked pictures of themselves, which they then send, and then they are immediately sextorted. And we know, and in many cases I've worked with these families as well, not necessarily through the school context, that when a young person is a victim of sextortion, they're very likely to commit suicide.

And so we have these dangers that are just coming into people's houses, and they have no idea, because never before in history has school given you something that's so crazily dangerous, or if it was, you would know. But right now people just don't know how potentially dangerous the computers coming home from school are, and of course administrators are gonna say, we try and lock it down.

We try and make it safe. But the companies themselves have an obligation here. They know they're selling to kids. They know this is going to kids as young as five. They know they have an obligation to support their educational attainment. And they're not designing to do any of that. These products are the same things you could walk down the street and buy from Best Buy. And that's just developmentally not appropriate for most kids until really late in high school, 17 or 18, and even then I would say those kids deserve a little more protection from the internet than they're currently getting.

Faith Boninger: It's really pretty impressive, the extent to which schools now lead children into this unsafe environment. And I wanna underscore what Andy said about the opacity and the lack of accountability. It's really a problem. You have no idea what's going on behind the curtain of any ed tech product that your kids are using.

You don't know what data's being collected, you don't know where it's going, you don't know how it's being used, you don't know how the decisions are made. Going back to this idea of personalizing learning or making predictions about the likelihood of success: these things are algorithmically driven.

Teachers, parents, administrators have no idea how it works. And you get to a point where a parent can go to a teacher and say, why did my kid get this grade? And the teacher doesn't know. The teacher can't explain it, and you can't have a conversation about why the child is recommended to one track as opposed to another.

And that's just the educational piece. It's not even dealing with these life or death things that Andy's talking about. Who would've ever thought that we'd be talking about life or death and child abuse and all these kinds of things based on school curriculum?

Christopher Saldaña: You've both mentioned already that there really is a lot of unknown that exists when technology is introduced to a classroom. What does the process look like for school districts to communicate about technology to parents? How do they tell parents, this Chromebook is coming into your student's classroom, we are gonna use these apps?

What sort of consent is being asked for when a child is being exposed to these kinds of new technologies?

Andrew Liddell: The disclosures are pretty thin, and the consent that's being asked for is illusory and legally ineffectual. Usually what happens is a district will have an acceptable use policy, and that essentially is a contract that they make the kids and their parents sign about how you will treat this computer: you will not go to bad places on the internet, you will arrive with your computer fully charged, and you won't stick paperclips in the USB drive to make it catch on fire. Things like that. As part of that, they say, and also, you consent to all the data practices of anything we might contemplate using.

And so I have yet to see a school district say, this is the total and exclusive list of EdTech products that we use. Here are all their privacy policies. Here's what they're allowed to take from your kid, and you're free to take your leave. You're free to consent or not. I've not seen that.

Because that's just extremely burdensome, right? It's just not realistic given how this has already been adopted. But I do think that should be the standard. I think that we absolutely should be disclosing all of these things, and we're just not.

Faith Boninger: Lately I've been trying to understand the law in the state of Colorado and how it's implemented.

So Colorado passed one of the most extensive ed tech laws, I think it was in 2016, but it is so full of holes. Theoretically, DPAs, excuse me, data privacy agreements, for ed tech products legally have to be posted on school and district websites.

But that's only the case if there's actually a contract that the district signed. Districts can use what they call click-through agreements to adopt products, and in that case there is no DPA, and there's no oversight by the state. The state is completely prevented from overseeing what's happening.

Going back to the business about click-through agreements: if you're a district that works with Google, that's a click-through agreement. So that information isn't public in the way that Andy was talking about, and it's not regulated. It's really problematic.

And this is a state that theoretically has a law to address this.

Christopher Saldaña: So you've both already mentioned some big implications from tech, some really dangerous ones, life or death ones, and some implications for the classroom: just in general, how it changes teaching, how it changes how students learn. When these kinds of concerns are brought to the technology providers, and we won't name anyone in particular, when folks start to realize there are some real problems that come with this kind of technology, what sort of responses have you seen from the providers? Do they typically say, oh yeah, you're right, let's correct this? What does that response typically look like?

Andrew Liddell: If it was easy to bring these concerns to the providers, or even the school decision-makers, I would not have started this law firm. So they're really unresponsive. Faith and I have a mutual friend, Emily Cherkin, who frequently says that ed tech is just big tech in a sweater vest.

And I think that's really true, right on the nose. And that sweater vest doesn't even provide much cover, 'cause frankly, they're the same companies. Google is YouTube is Google Classroom. And the Bill and Melinda Gates Foundation gave a ton of money to this, the Chan Zuckerberg Initiative gave a lot of money to this.

Apple is in classrooms, especially in elementary schools. And we've seen through whistleblowers and congressional hearings and the social media victims lawsuits that these companies don't care about their young users. They just don't. They have an obligation to their shareholders to make money, and there aren't any laws stopping them from doing what they're doing, and so they're gonna do what makes them money, and if people get hurt or die in the process...

So what? And so that same callousness is really present when it comes to schools. It's not as though they're sitting down and carefully looking at, oh, what does a five-year-old need and how can we design a computer from the ground up for that child to support their development, keep them safe and help them learn?

And how is that different from somebody who's eight or nine or 10, or somebody who's 13, or someone who's a senior in high school? None of that is happening. None of that is happening. And I think that's such a stark departure from how school normally is. If you think about it: go into a kindergarten classroom and then go into an AP chemistry lab, and look how different the physical environments of those two classrooms are, and how much that reflects the different needs of those students for learning and also for safety.

We're not doing that with technology at school. It's basically, we're letting everybody run amok in the chemistry lab, and it's very dangerous.

Christopher Saldaña: I wanna ask you specifically about AI or artificial intelligence. Are the claims being made about AI anything new, or is this more of the same old sort of sales pitch?

Faith Boninger: Both are true, right? It's the same old sales pitch. There were teaching machines back in the 1930s; they've been pushed on schools for a really long time. There's been ed tech pushed on schools for the last two or three decades. AI turbocharges the whole thing, right? It's more opaque.

Even the developers don't know what's going on with generative AI, right? It's being pushed now in a way that's really incredible. It comes from the tech industry, right? But policymakers are on board, media seems to be on board, everybody's on board, and critical voices are, extinguished isn't the word I'm looking for, but certainly ignored and downplayed.

Because there are real concerns here, and we have to take it seriously. The thing that I hear often is, AI is not going away, we all have to get with the program, pretty much, right? It's not going away. So we really have to think carefully about this.

We have to establish policy and regulation and ask hard questions, so that if it's brought into a school, we understand very well what the implications are and we can control those implications. But the idea right now is just, let it go, because it's happening and we have to compete with China and our kids need to be competitive.

That's really bad. In the Big Beautiful Bill, there's a line that prohibits states and other jurisdictions from regulating AI, or from enforcing any regulation that they've already passed. If that goes through, that's really dangerous.

For 10 years, right? So on the one hand it's the same old thing, and on the other hand it's very different. And we live in an environment now where data is gold, right? Where everybody is motivated to collect as much data as they can.

Andrew Liddell: And AI changes the nature of that collection. Essentially, right now, so much of our behavior is inferred through our searching and our interactions with our computers and our phones and browsers and stuff, our location.

But when you change the interface from one of "I want to find this thing, I want to find this information" to this AI interface, it's much more conversational. It draws out things that used to be presumed about you. Spotify and Facebook could infer when you were depressed and send you an ad that's gonna make you more likely to buy something; those are real examples.

But now with generative AI, it's trying to draw you in and have that conversation, so you just tell it, hey, I'm depressed. And the idea that we are just letting kids pour out their inner lives to these machines, when we know the only purpose of this is to collect information about you that can be used against you later, is just mind-boggling to me.

And the same people who say that AI is here, it's not going away: I ask them, what happened to MOOCs, massive open online courses, or NFTs, or all of this other BS coming outta Silicon Valley? I think we can push back and say it's not here to stay. There are real physical limits on the power that's available on the planet to actually fuel this stuff.

The minerals that are needed to create chips. The unpolluted information that just exists in the world that can serve as a clean training data set. There are physical limits to this that we're already seeing AI run up against. And so I would say if you wanna prepare your kids for the future, teach them to understand that stuff.

Teach 'em to understand the environmental implications of AI. Teach 'em to understand what copyright is, how AI is just trampling people's rights, and why copyright exists in the first place: so people can make a living, and then all of us can benefit from their creativity. And so there are opportunities to educate people, children and adults too, frankly, about AI as a political artifact and as a social object, and we're just totally missing that in this brain-dead rush to put more kids in front of ChatGPT and make them not do their schoolwork.

Christopher Saldaña: You've both already alluded to this, but there's this massive effort to drive the narrative that education technology is an imperative for schools.

That it has to be there for X, Y, and Z reasons. From the technology providers' perspective, why is it so important to get into schools? Why is there such heavy competition to get access to kids? I think for some folks it may be obvious, but for other folks it may not be so obvious why kids are so valuable relative to others. Faith, you mentioned data is gold. Is kids' data different from adults' data, for example?

Faith Boninger: There's just so much money in schools. There's money to be made in providing the technology, and then there's the money to be made with the data. Kids' data, everybody wants it, 'cause kids are the future. So, perhaps Andy would disagree with me, but asked off the cuff, I would say that the data isn't different in any particular way, except that these are young people with a whole buying life ahead of them. These are young people who can be influenced in their whole buying life, right?

You can influence me, I got a few years left to be influenced, and he's got a few more. But these kids, they're little. We know that the kinds of decisions that they make at a young age, and the kinds of influences that they have at a young age, influence them for the rest of their lives.

Andrew Liddell: I completely agree with everything Faith just said, but to add to that: you think about just building brand awareness and brand loyalty. What does it do to you to just be looking at the Apple logo every day, or the Chrome logo every day? There are these, they call 'em, network effects, but essentially, when you get looped into a system, it's hard to extricate yourself. If you have a Google email address that you use for everything, there are switching costs, right? It makes it hard to switch from that to other services. And I really just can't emphasize enough the opportunity to collect information and influence behavior from a very young age, and to really know, in exquisite detail, a child's psyche. That's really what all of this is about. Our whole surveillance-capitalist internet is just about trying to build a perfect picture of the inside of your mind and heart, so that they can understand what you are going to do and then sell that prediction to willing buyers. That's really the heart and soul of our internet economy right now.

And I think that people don't always appreciate that that's really the logic behind everything, and that not only is this the desire, but companies have ways of subtly influencing your behavior to make their predictions come true. And so the more time we put kids in front of this extractive, manipulative internet, the worse it is for them.

Christopher Saldaña: We've reached the point of recommendations. There are so many individuals involved in this process of implementing and operationalizing technology in schools, so I want to ask you about two groups in particular. What recommendations would you have for parents?

Andrew Liddell: I'll start, because we work mostly with parents. So if you're concerned about ed tech, or you're asking questions of your school, I think it's important to come at these conversations with some community.

And so if there are other parents in your kids' class, make friendships with those people and start getting on the same page with them. The stories I told earlier about the harms that befall young people at school, those are very common. What's not common is that people speak out about them, rather than hiding in shame or being bullied into silence by their districts.

And so there are a lot of kids who have been pushed into things that are inappropriate for them to be seeing because of their school computer. And so it's okay, it's okay to speak out and speak up about that. And understand that this is not your fault as a parent, or even the fault of your child, because these products are designed in a certain way such that your choice is not free. You are not choosing where to go and what to see. And so when something bad happens to your kid on their school computer, it's not your fault, it's not their fault. There are these other actors there that are pushing them in that direction. And so the first thing I would say is, find your people, find your community. When you approach your school, do so with an open mind, an open heart, as a partner, to say: school, you've been taken advantage of here too. You're a victim here as well. You were sold this as a miracle cure-all.

But look at what's happening. Our kids' academic performance is falling, their mental health is suffering. Teachers are now having to basically play internet whack-a-mole to make sure that kids aren't on Snapchat and Fortnite when they should be paying attention in class. Do we really need this much?

Isn't there a better way that we could do this, one that would also allow kids to be taught explicitly about technology, which is a really important thing to do and a really important conversation to have? And try to be a partner with your district. I'll tell you, it's gonna be difficult; from my experience, districts aren't always willing and ready to listen to parents who object.

But it's okay to go ahead and start. And then, if you're a parent who's looking for other school options, my last piece of advice is: look at what the rich guys do. Look at what the Silicon Valley tech titans do, the type of educations they prioritize for their kids, or the type of educations that they had the benefit of as young people.

And so when Mark Zuckerberg was at Phillips Exeter, he had these roundtable classes of 12 kids where they just got to pontificate and discuss and had this Socratic instruction. Bill Gates, up there in Redmond as well, same thing. You've got people in Silicon Valley who are sending their kids explicitly to Waldorf schools that have no technology at all for their kids.

Because they know what this stuff is. They know what's under the hood. And so try and make your choices as though you're being guided by them, because I think the people who make this are in the best position to know what the harms and the risks are, and they're trying to keep their own children as far away from it as possible.

That's a huge red flag.

Faith Boninger: There is power that parents have when they work together. I'm thinking back to when the Gates Foundation was pushing InBloom for interoperability purposes. It was parents who basically united and created a movement, and eventually the districts and states that had signed on to InBloom signed off.

And InBloom didn't happen. I just wanna second everything that Andy was saying: it's important for parents to realize that as a community they can have influence, and creating that community where they share what they experience is really important. Going further, to teachers and schools and districts: my colleagues and I, including you, Chris, have encouraged restraint in adopting programs, in bringing products into schools.

We encourage schools and districts to really do some soul-searching about what their vision for education is, what their goals are, and to think about how technology could advance those goals, and how it could fail to advance them. Then they can shop, as a general rule or for particular products, with that understanding of how these products could help them advance their goals, or whether they should stay a hundred percent away from this stuff. It requires a lot of advance work to do that, but I know districts do it, and they can, and they should. And to be honest, if they don't have the bandwidth to be proactive and to evaluate products, then they shouldn't use them. It's not, "we don't have the bandwidth, we don't have the expertise, we don't have the power, so we're just gonna do it anyway." No, if you don't have that, then don't do it. And schools and districts and teachers need support from policy at a higher level.

They need their states to back them up, and not make every school, every teacher, be the arbiter. So they need accountability and transparency and oversight.

Christopher Saldaña: I want to ask you about a second group, who are often parents and grandparents too: policymakers. What recommendations would you have for policymakers?

Andrew Liddell: I would say, first, just pay attention. Just understand what has happened and what has been allowed to proliferate in schools. And so if you're a representative, or on the school board, or whatever it is, you've got kids or grandkids, just understand that you're letting your kids into a completely unregulated environment.

And all of the limitations that surround textbook adoption and curriculum adoption are completely absent when it comes to technology adoption. There is literally nobody in charge. You have companies whose only job is to make money finding a captive market, pushing their stuff into schools with no regulation and no oversight.

And so it's no wonder that people are getting hurt and performance is falling. So I would just beg policymakers to just please start paying attention.

Faith Boninger: Yeah, absolutely. And provide the regulation that requires transparency, requires oversight, requires accountability. That's the role of policymakers.

Christopher Saldaña: Thank you Andrew and Faith for being on this month's podcast. As always, we hope you're safe and healthy. And remember, for the latest analysis on education policy, you should subscribe to the NEPC newsletter at nepc.colorado.edu.