NEPC Talks Education: An Interview With Andrew Ho About NAEP and the Federal Role in Education Assessment
University of Wisconsin‑Madison Assistant Professor Christopher Saldaña interviews Andrew Ho about the National Assessment of Educational Progress (NAEP), educational measurement, and the federal role in education and assessment.

Transcript
Please note: This transcript was automatically generated. We have reviewed it to ensure it reflects the original conversation, but we may not have caught every transcription error.
Christopher Saldaña: Hi everyone, I'm Chris Saldaña and this is the National Education Policy Center's Talks Education podcast. This month we're speaking with Andrew Ho, who is the Charles William Eliot Professor of Education at the Harvard Graduate School of Education. He's a psychometrician whose research aims to improve the design, use, and interpretation of test scores in education policy and practice.
In this month's podcast, we talk with Dr. Ho about the National Assessment of Educational Progress, or NAEP, also known as the Nation's Report Card. We also discuss with him the federal role in education, both in assessment and broadly in K-12 public schools. So for listeners who read or maybe hear about NAEP in the news and aren't too familiar with what NAEP is or how it works, what is NAEP?
And what were your reactions to this year's results?
Andrew Ho: Yeah. So we often like to talk about NAEP as the most important test you've never heard of. Those in the educational research community tend to know it well, but those out there in the general public tend not to, and I think that's mostly okay.
Most parents and citizens are concerned about the educational status and progress of their kids and their neighbors and their communities. But every once in a while, they might want to know whether the kids in their community are comparable in proficiency and growth to kids in other communities throughout the country and to previous generations of kids, which is to say, have we as a country advanced in our educational progress at all over the past 50 years?
And that's an important, I think it's an important question for us to be able to answer as a country and as citizens. And that is why NAEP exists. As you say, it's the Nation's Report Card. It is a survey, not a census. So imagine two to three thousand kids in each state, every other year or thereabouts, are sampled to take this test.
And it's in reading and math in grades four and eight. And it gives us a national-level, state-level, and large-district-level answer, for some 27 or 28 odd large districts, to the question: where do our kids stand in these districts, in these states, and in the country, every two or so years?
There is what we call NAEP Day, which is a wonky holiday for people who are like, hey, we should know if we've again made educational progress, if educational inequality is increasing or decreasing. And of course, during the pandemic this has taken on particular urgency, because we're asking, wait a second, have we lost a generation of educational progress in the past two to three years?
And are we advancing toward recovery? Not to mention, what could accelerate our recovery? What should we be doing better? What could states be doing better? What could districts be doing better? So NAEP holds the answers to all of those questions.
And it's something that I've dedicated a lot of my career and service commitment to. I served on the governing board that oversees NAEP for eight years, from 2012 to 2020, and I continue to be very much involved.
Christopher Saldaña: You mentioned two terms, proficiency and growth. What do those mean and how do they differ?
Andrew Ho: Yeah. For anyone who has kids or anyone who knows kids, you can ask how tall they are and you can ask how much they've grown, and the same goes for education. So my girls are in fifth grade and eighth grade, and I can ask them, how well are they reading right now?
What kind of math are they able to do? And that's what we call proficiency. What is their snapshot-in-time status, right now or last year or the year before? Now, very obviously, as educators and as parents, we're not just interested in where they are. We're interested in how fast they're growing and what we can do about their future growth.
And if you think about education, right? What education is about is growth and progress. That is the point: that we could be doing more so that our kids could be learning faster. And so there are, I think, two important questions. The first leads into the second, right? Where do we stand?
That's great. But where do we come from? And where can we go? That's growth. And most of us, because we remember grade school and getting a single grade on a single exam and getting compared to other kids, we're like, oh, how do I stand now? How do I stand now? But the educator's perspective and the policymaker's perspective should be less about where one stands than how one can grow.
What can we do to accelerate your learning in the future? So it's the National Assessment of Educational Progress. That's what NAEP stands for: N-A-E-P. And the P is really important. Have we changed as a country? Our kids, can they read better now than they could before the pandemic?
The answer is no. So that change is important. It's not just where they are. It's, is that an increase or a decrease over prior years and prior generations?
Christopher Saldaña: So how did students do this year?
Andrew Ho: So again, what the National Assessment of Educational Progress, what NAEP does well is answer questions about progress, right?
So again, it's not just, hey, our school's good or bad. It's, our school's getting better or our school's getting worse. And it's not just about schools, is the other secret to it, right? What NAEP really measures is educational opportunity writ large. It's not just schools. One of my favorite statistics is what percent of kids' waking hours are spent in school.
So I want everyone to listen to that and think about an answer. Of all the time that kids are awake through a school year, what percent of those waking hours are spent in school? And the answer is 20 percent. Twenty percent of their waking hours are spent in school. So if we're talking about where learning comes from, it's not just school, and any parent, anyone who knows kids, knows that they are learning all sorts of things outside of school, not just communication and collaboration and social skills, but academic skills as well.
Kids are reading. They're talking to their parents. They're engaging with others. They're on YouTube and any number of social media outlets that I'm sure I'm not familiar with, and they're reading and they're engaging and they're learning to write. And all of that is happening outside of school hours.
So what NAEP measures, I think of as not just schools but educational opportunities. And to answer your question, right? No, we're not doing as well as we used to. And more importantly, educational inequality is growing. So the story that NAEP helps to tell over decades, and not just decades but generations, is one of general increasing educational opportunity, particularly through the aughts and the early 2010s, followed by stagnation and increasing inequality that was then exacerbated.
There were declines and increasing inequality through the pandemic. So the pandemic is actually less to me about test score declines, though that's true and absolutely a concern. It's about the increased inequality that we see thanks to NAEP. And if we didn't have NAEP, we wouldn't be able, shockingly, we wouldn't be able to answer that question on a national scale.
Christopher Saldaña: So you've mentioned opportunity as something that we should be paying attention to when we think about results. What other kinds of things do you think about when you see, for example, headlines like, our schools are failing, NAEP results are down? How would you encourage folks to make sense of these results?
Andrew Ho: I think we can hold sort of two truths, right? So first, what NAEP measures is the sum total of educational opportunity that our kids have, and again, 20 percent of it comes in school and 80 percent of it comes out of school. Let's not forget, too, that a lot of educational opportunity is afforded to us in our early childhood, when there is no formal schooling, right?
So all of that builds and sums and accumulates over time and over age. So it's definitely not just about schools. But the second truth is, what do we have control over as citizens and as policymakers at a local and state and federal level? It's schools. Schools are where we have a fairly strong lever and where you see us able to shift the trajectories of students in positive ways.
So yes, schools can't be the be-all end-all, but yes, too, they are where we might be able to make a difference, which is why it's this double-edged sword where we hold schools to the highest standards, even though they're only 20 percent. And we do that because that's where we have the opportunity to actually make a difference.
So there are going to be absolutely cynical actors who use this to pillory their local schools and local school teachers. And there are also people who recognize that schools are where we have an opportunity to make a difference. And both of those camps, or those perspectives, are what lead to NAEP Day being sort of an opportunity for us to craft an evidence-based narrative about what we should be doing better and more of as a country when it comes to educational progress.
Christopher Saldaña: So there's an additional wrinkle that I think folks find confusing sometimes, which is the presence of state tests.
Andrew Ho: Yes.
Christopher Saldaña: So how do state tests differ from NAEP?
Andrew Ho: So there, I'd say the most important two differences to highlight are, first, that state tests have individual student test scores.
My girls get scores on their state tests every year. And I look at those scores, and those scores have stakes. Thereby they are made meaningful. We make them meaningful. Teachers make them meaningful. Sometimes they have stakes at the individual level. They can also have stakes, and do have stakes, at the school level when it comes to school classification and reporting school status and growth and progress.
So state tests have individual student scores. And the second piece is that, again, they have high stakes. NAEP has no student scores. And you're like, wait a second, the kids took a test. Surely they have a score. And actually, no, they just took a survey. They just took a few questions out of a massive number of questions they could have had.
So my daughter actually got to take NAEP last year as a fourth grader. She was one of the lucky people sampled, and it was fascinating how she perceived NAEP, right? Because it's almost like you get lucky enough to take this assessment, and this assessment doesn't count. You just get to answer questions, right?
Questions that are interesting and unique. And she got some NAEP headphones that she still shows off. She's like, hey, look at my NAEP headphones. And so the vibe around NAEP is very different than the vibe around state tests, which, because of the individual student scores and the actual stakes on those scores, those two features make state tests very different.
As you know well from your work too, we actually can classify NAEP as a different kind of assessment entirely from state tests. We call it a monitoring test, right? A monitoring assessment versus an accountability assessment. And so it is a different kind of assessment, entirely survey based, no individual student scores, and no high stakes.
Christopher Saldaña: So I want to shift a little bit just to ask you about your work with the Educational Opportunity Project at Stanford. So for folks who aren't familiar with the Educational Opportunity Project, what is it and what's it trying to accomplish?
Andrew Ho: Yeah. So it started over 15 or 16 years ago.
I was a visiting scholar at Stanford, between my time at Iowa and my time now at the Harvard Graduate School of Education. And I was there at Stanford, and Sean Reardon, my colleague who leads the Educational Opportunity Project with me and Erin Fahle and Ben Shear, we had a conversation in his office about solving a problem of assessment.
It started off as a methodological, very quantitative project. It's built over time as we've realized just how much we can achieve when it comes to understanding educational opportunity over time and across place throughout the United States. So I describe my work, at least on the psychometric and assessment side, in the Educational Opportunity Project as a stitching project.
I might sort of mix metaphors. It relates to your previous question about NAEP versus state tests. So think of state tests as a microscope, right? In each state you have this sort of microscopic level of detail about each student's proficiency in each school across the state.
So think of that as a microscope. But there are 50 microscopes. Each state has its own microscope, and it's looking at its own state, but the microscope is different from Massachusetts to Colorado to California, across all 50 states. Now, what is NAEP? NAEP is like a telescope, right?
You can zoom out and see the big picture of the entire country, but it's blurry. It's really blurry. The microscope is in each state, seeing the fine details, and you've got 50 of them. And then you've got NAEP, which is like this telescope, which is really blurry, but at least I can see the big picture.
So now what do you want? You obviously want to be able to see, in fine-grained, clear detail, everything in the United States. And so what we did through the Educational Opportunity Project, and what we call the Stanford Education Data Archive, or SEDA, is create a stitching, a way to link the pictures that each of the 50 microscopes takes with the map that NAEP provides, to give us, from 2009 to 2024, so 15 years of data over time and across place, the scope and spread of educational opportunity, when it comes to test scores, to be clear. So it's not the be-all end-all, but the scope and spread of educational opportunity in the United States.
It's been a very rewarding project because it's allowed me to harness some of my more quantitative and narrow skills to offer a data product to everybody. So all of you can go to edopportunity.org right now, download the data yourself, play with the data, and look at what your status and growth, we were talking about proficiency and growth, in your schools and your districts are like from 2009 to 2024.
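To make the microscope-and-telescope "stitching" concrete, here is a minimal sketch, assuming a simple linear linking of a district's state-test results onto the NAEP scale. The actual SEDA estimation is considerably more elaborate (heteroskedastic ordered probit models fit to proficiency-category counts, precision weighting, and other adjustments), and every function name and number below is hypothetical and for illustration only.

```python
# A deliberately simplified sketch of the "stitching" idea: place a district's
# state-test results onto the common NAEP scale by using its state's NAEP
# mean and standard deviation as the link. This is not the real SEDA pipeline;
# all names and figures are illustrative.

def link_district_to_naep(district_mean_state_z: float,
                          state_naep_mean: float,
                          state_naep_sd: float) -> float:
    """Map a district mean, expressed in state-test standard deviation units
    relative to its state, onto the NAEP score scale with a linear linking."""
    return state_naep_mean + district_mean_state_z * state_naep_sd

# Hypothetical example: a district 0.25 SD above its state average, in a state
# with a grade-4 math NAEP mean of 237 and SD of 30, lands at roughly
# 237 + 0.25 * 30 = 244.5 on the NAEP scale.
print(link_district_to_naep(0.25, 237.0, 30.0))
```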
Christopher Saldaña: In that work, we've talked a little bit about the pandemic. How has that work helped us think about what you talked about before, what NAEP tells us about the pandemic and its impact on students?
Andrew Ho: Yeah, so that's exactly right. So NAEP, again, the scores just came out a few weeks ago. That told us, at a high level, that, again, we're continuing to see declines. Recovery is not fully there. We're not back to our 2019 levels. But let's look closer. We're only seeing 50 states and a few large districts.
Wouldn't it be great to understand just how much variation there is? And once you look at the map, and again, go to edopportunity.org and look at the map, you're like, wait a second. Some districts have made substantial recovery, and other districts are really declining. And there's a socioeconomic gradient to it that is deeply unfortunate, right?
That is, again, increasing inequality along socioeconomic lines. So that's what the linkage of the microscopes to the telescope enables us to see. It allows us to zoom in with clear vision on just how much inequality and variation there actually is. Now, what does that enable us to do? It enables us to ask:
What are some of these districts doing that others are not? It allows us to look, for example, at the work that's come out from Tom Kane and Erin Fahle and Sean Reardon looking at spending, seeing that, yes, the federal recovery dollars that were allocated toward education did indeed make a difference.
For every $1,000 in spending per student, we have about a 0.01 standard deviation improvement in math, which is a decent amount when it comes to progress and growth, and maybe around half that in reading, which is to say, yes, money matters.
And this builds upon other research that we know from Kirabo Jackson and others, that yes, spending does increase educational opportunity when applied in thoughtful ways, as it was during the pandemic when we had a real crisis on our hands.
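As a rough illustration of the spending estimate Ho cites, here is a back-of-the-envelope sketch; the per-student dollar figure below is hypothetical and chosen only to show the arithmetic.

```python
# Back-of-the-envelope arithmetic for the estimate cited above: roughly 0.01
# standard deviations of math improvement per $1,000 of recovery spending per
# student, and about half that in reading. The spending figure is hypothetical.

MATH_SD_PER_1000 = 0.01
READING_SD_PER_1000 = 0.005

spending_per_student = 3000  # hypothetical recovery dollars per student

math_gain = spending_per_student / 1000 * MATH_SD_PER_1000
reading_gain = spending_per_student / 1000 * READING_SD_PER_1000
print(f"Estimated gains: {math_gain:.3f} SD in math, {reading_gain:.3f} SD in reading")
```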
Christopher Saldaña: So we've already talked quite a bit about a test that is administered by the federal government and data that is published by the federal government, and there have been some changes that the current administration has made to the U.S. Department of Education, and more broadly across departments and agencies within the federal government. So I want to ask you about the Department of Education for folks who aren't familiar. Maybe we can start with just a brief synopsis, if it can be brief: what does the Department of Education do and not do?
Andrew Ho: Yeah. So this is a very timely question. So timely that I'm worried that when people are actually listening to this podcast, anything I say will be extremely dated. So I should say it's February 25th now, and things are moving rapidly and chaotically in the federal government. To give you a sense, the longstanding commissioner of the National Center for Education Statistics, Peggy Carr, was just placed on leave last night, on February 24th. Again, today is February 25th, and any number of executive actions, or potentially legislative actions or judicial actions, could be happening between now and the time this podcast comes out, or over the time that anyone might be listening to it.
So with all that said, your question is a good one. What does the federal Department of Education do? Why is it useful? Why is it important? So again, it was founded in 1980 by Jimmy Carter, and it is an organization, an institution, that consolidates a large number of federal efforts related to education.
One could imagine the Department of Education disappearing and all of those efforts still remaining. So it's important to step back from this and ask the right question, which is probably: what should the federal role in education be? Should there be a federal role in education?
So when it comes to dollars, for example, the federal contribution is about 10 percent. It's primarily state and local funding. So we're just talking about that last 10 percent of funding, but that funding is very important, and it relates, obviously, to federal efforts to reduce poverty and help kids with disabilities.
So if you want to ask, are we going to continue offering Title I funding to help students who are financially, socioeconomically disadvantaged, in poverty, have better educational opportunities? Most people would say, yeah, that seems like a pretty important role of the federal government.
Should that be in the Department of Education? Is it helpful bureaucratically to have a Department of Education actually take that on, along with the money for students with disabilities, along with Pell Grants that help kids who can't afford it go to college? All of those separate efforts are probably what we should be thinking about and asking about.
NAEP obviously falls under this heading, where we have a National Center for Education Statistics that helps to implement NAEP. NCES, the National Center for Education Statistics, long predates the Department of Education. It could still exist without the Department of Education, right? So also, do we care about the Department of Education?
We probably should be asking, is it a useful organizational structure to consolidate the federal role in education, right? And to people out there who are thinking more politically: just look at what it does and ask if all of those individual efforts are worthy, right?
To fund, to improve educational opportunities for kids with disabilities, for kids who are in poverty. Most people tend to think that's the right thing to do. And then the question is just, bureaucratically, does it help to have an institutional structure that consolidates that? But it's probably more than that, isn't it?
It's probably pretty symbolic to say, no, the federal government does have this commitment to education. That's not just over here in the Center for Statistics and over here in Health and Human Services, right? Education itself is a priority for the federal government. And that's the sort of symbolic fight that I think is going on.
Obviously, we do work in educational research. You can imagine educational research being housed in the National Science Foundation or something, right? But does it help to have a dedicated Department of Education that has a dedicated set of funds to support educational research and the educational sciences? You're preaching to the choir here, or I'm preaching to the choir, directly to you and potentially to some of your listeners, to say, yes, I do believe personally it should be part of the federal government's charge and bailiwick to have these educational commitments, and the Department of Education can probably efficiently represent and manage that.
And therefore the Department of Education should exist. But there are more fundamental questions about what we believe, what you and I and others believe the federal role in education should be.
Christopher Saldaña: There has been this conversation about the reallocation of responsibilities across departments or agencies.
There are also questions around, for example, the Institute of Education Sciences and some of the changes that have been made there in terms of canceling contracts, and it's still very unclear. There's so much uncertainty around what that actually means or what's going to happen going forward.
If you were to think about, hypothetically, what it would mean if an IES kind of structure were to disappear, and if that were to be seen as something that's quote-unquote wasteful, what might be the long-term ramifications of that kind of decision?
Andrew Ho: Yeah, it's dangerous prognosticating when so much is in flux. I believe, and I have had direct personal experience: SEDA and the Educational Opportunity Project have benefited from funds from the Institute of Education Sciences. Do I, and do you, and do citizens believe that it is important to know, right, how educational opportunity has changed, again, across the 12,000 districts in this country and over the past 15 years? We'd go farther if we had better data. Maybe we should think about it in those terms: that, again, without NAEP, for example, I said we had 50 microscopes. What if there were no telescope? We'd have 50 different pictures and no way to communicate across them.
Think, too, about the fact that we were contrasting state tests and NAEP. You know what NAEP's job is, again? It's the National Assessment of Educational Progress. Its job is to measure and assess progress. You know what state tests often forget to do? They forget to measure progress.
They break their trends. They change their tests. They change their standards. About 10 states or so can't even answer the question: have we made progress or declined since the pandemic? So yes, you could say it's NCES, or ask, what does IES do, right?
We shine a light. That's what research does, right? We increase generalizable knowledge. We take the darkness and we give it light, right? And should there be an institute, should there be a department, that does that for education? Obviously I believe that there should be. And this is the disagreement that some have.
One way to say it is, this is the opportunity you have, and we have, to talk about the value of educational science, of educational research, of data, right? And I do find that across party lines, and for those with different political perspectives, if you ask, should we know whether our kids have had the same educational opportunities as I had 25 years ago and 50 years ago, NAEP has bipartisan support.
And that, to me, is a reminder that when you start talking about the issues, when you start talking like, should there be education sciences, and is there a federal role in upholding them, we tend to find more common ground than we might if we just ask, should there be a Department of Education, right?
What does it do? How does it help society and help our kids? And that's what I hope we can make a case for in the months and years to come.
Christopher Saldaña: So we're at the point of recommendations. And I want to ask you for recommendations along two lines. One is around the use of test results.
What recommendations would you have for folks at home as they think about, okay, here's a state result. Here's a NAEP result. How they use these results and incorporate them into their voting decisions, into their thoughts about policy change. And then the second line of recommendations is for policymakers.
What would you encourage policymakers to have top of mind as they think about the changes that are happening under the current administration?
Andrew Ho: Easy questions. So my general advice to members of the public who deal with test scores is, I say, beware the three fallacies of test score interpretation.
And I describe the three fallacies as: numbers are more meaningful, more precise, and more permanent than they really are. Most people believe that numbers, in particular test scores, hold incredible power. That a test score is the be-all end-all. More meaningful, right? More precise, right?
That once I give you a 700 or a 25 or whatever it is, it's branded on our arm. Precise, right? And then permanent, too. That's what I mean by branded, right? That it doesn't change, we can't change it. And this goes back to the proficiency and growth discussion we had earlier. People are like, what's the difference between proficiency and growth?
And they might have succumbed to the fallacy that once you have a score, it can't change. So they're like, what growth? All I need is proficiency, because once you have a number, it can't possibly change, can it? So again, those three fallacies: numbers are more meaningful, more precise, and more permanent than they are.
I think those encapsulate the common misinterpretations we have of test scores and schools, right? That is a bad school. That is a bad teacher. That's making, again, numbers more meaningful, more precise, and more permanent than they are and should be. So the general advice is, and maybe you could put it this way: hold numbers lightly.
So hold numbers lightly. Don't make them the be-all end-all. Supplement them with other forms of quantitative and qualitative information. Hold numbers, especially test scores, lightly. And don't succumb to the three fallacies, that numbers are more meaningful, more precise, and more permanent than they deserve to be.
And you and I, as assessment experts, know this, but the general public generally does not. And I'd say we still do sometimes, right? I'll speak for myself. I'm like, ooh, a 33. And I have to ask myself, what does that mean again? How precise is it? It's our job to describe how precise the number is, and then how permanent it is: how can we change it? How does it change?
So that's my general advice for general listeners: remember that your, our, my tendency is to look at a number and think it's more meaningful, more precise, and more permanent than it is. Hold it lightly.
Remember that there are other numbers that could exist, that there's imprecision in that number, and that the number can change, and you can change it, and we can change it as a society. As far as policymakers, again, don't ask a psychometrician. Don't ask a testing expert. We just make the numbers.
I will let other people interpret them. No, I think that, and maybe I'm putting it less on policymakers than I'm putting it on us as scholars and assessment experts: we have to learn to talk to them and speak their language. We have to recognize that forcing them to speak our language is not a long-term solution toward policy relevance.
And we need to understand the constraints that they face, the constituents that they have, and think about solutions and ways to talk about evidence that helps them navigate what is increasingly a polarized policy space. And for me, it took me forever to realize this.
The levers of power and policy were not something I was acutely aware of because it wasn't my area of study. But understanding where power comes from, where it exists, and how to change it is what I think academic researchers, unless you actually study that, are typically pretty bad at perceiving and acting towards.
I'd love to say, hey, policymakers, pay more attention to evidence; policymakers, don't use NAEP scores to bash schools. But instead of just casting those wishes into the air and knowing that they're not going to get taken up, I try to think as a scholar and as a communicator about how to engage them.
Here's one very specific example. If you look at edopportunity.org again, you'll see that we report our results in terms of grade levels. Now, you and I know that there's a long psychometric history and debate about grade-level reporting. In general, we as a field do not recommend reporting in terms of grade levels. Why? Because if my daughter who's in fifth grade gets a score back that says she's at the sixth-grade level, what do I say? She should get promoted. She should go up to sixth grade. It is a terrible idea for individual school and individual student level reporting.
But you know what? If you want to capture a sense of what the declines or gains over the past few years have been through the pandemic, and you say, oh, we lost 0.18 standard deviation units, that's not going to mean anything to anybody. But if I can have a back-of-the-envelope approximation that relates to metrics we're familiar with, which are what I call dollars, dates, and counts, right?
We think we know money, we think we know counts, we think we know time durations, right? Those metrics are familiar to folks. If we can engage them in terms of those metrics, then I think we can make progress communicating to more people than just our immediate scholarly neighbors.
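To illustrate the kind of translation Ho describes, here is a minimal sketch converting a standard-deviation effect into approximate grade-level terms. The conversion factor is a hypothetical rule of thumb, not SEDA's actual scaling, and the example numbers are illustrative only.

```python
# A rough sketch of translating a test-score effect from standard deviation
# units into "grade level" terms for a general audience. How many SDs one year
# of growth spans varies by grade and subject; the factor below is a
# hypothetical rule of thumb, not the scaling SEDA actually uses.

SD_PER_GRADE_LEVEL = 0.35  # hypothetical: one year of growth ~ 0.35 SD

def sd_to_grade_levels(effect_in_sd: float) -> float:
    """Express a test-score change in approximate grade-level equivalents."""
    return effect_in_sd / SD_PER_GRADE_LEVEL

# Example: a 0.18 SD decline reads as roughly half a grade level.
print(f"{sd_to_grade_levels(0.18):.2f} grade levels")
```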
Christopher Saldaña: Thank you, Dr. Ho, for being on this month's podcast. As always, we hope you're safe and healthy. And remember, for the latest analysis on education policy, you should subscribe to the NEPC newsletter at nepc.colorado.edu.