Answer Sheet: How Corporate Interests Are Overtaking Well-Intentioned Goals of Personalized Learning

It’s been years now that we’ve been hearing about how “personalized learning” is the new thing in education. Actually, it isn’t.

In 2013, George Wood, then the superintendent of the Federal Hocking Local School District in Ohio and chair of the board for the Coalition of Essential Schools, wrote this:

“Personalization” and “engagement” seem to be the new catchwords in education reform these days. Too bad the concepts are not credited to the person who first talked about them. It was the late Ted Sizer — in the Common Principles that he developed for the Coalition of Essential Schools — who pushed for personalized learning environments that engaged young learners. These days, he is seldom mentioned when these terms are rolled out.

There may be a good reason for that. As polite as Ted always was, I think he might object rather passionately to the way “personalization” is being tossed around today.

I was reminded of this while attending a workshop this past week on using technological tools to “personalize” student learning. The speaker felt that it might be appropriate to give examples of such personalization. He cited the way Amazon offers you other books or items you might like based on your last purchase; how Apple will customize the computer you purchase; or how the grocery store offers you coupons at checkout for your favorite items.

This isn’t personalization — it is just marketing.

All these years later, the marketing has only increased, and there are real consequences for schools and students, as Faith Boninger and Alex Molnar of the National Education Policy Center at the University of Colorado at Boulder explain in this post.

The policy center has just published an analysis of this issue by Boninger, Molnar and Christopher M. Saldaña. The post below gives the highlights.

Boninger is the co-director of the National Education Policy Center’s commercialism in education research unit. Molnar is publications director of the National Education Policy Center and co-director of the NEPC’s commercialism in education research unit.

By Faith Boninger and Alex Molnar

The vision of personalized learning is enticing. Happy, self-motivated students following their own interests at their own pace, mentored through individualized learning pathways by teachers empowered with data detailing their progress.

This idyllic vision of personalized learning is aggressively promoted by millions of philanthropic dollars, amplified by intense tech industry lobbying and marketing by third-party vendors.

As a result, personalized learning programs are proliferating in U.S. schools even as parents have begun to argue that they are being adopted without going through a proper educational vetting process. Students are resisting spending more time with screens and less time with teachers. Students and parents alike are objecting to low-quality curriculum materials and violations of privacy.

This dissatisfaction doesn’t surprise us. Our research finds that schools adopting tech-heavy personalized learning programs may indeed be threatening both the integrity of the education they provide and the privacy of their students and teachers.

We also found that the personal, child-centered vision of personalized learning is, in practice, being overtaken by corporate interests that emphasize data collection by digital platforms and software.

To understand what is happening, a good place to start is by examining the tech-friendly reality behind the child-centered rhetoric of many personalized learning programs.

This reality is exemplified by the “working definition” of personalized learning offered by the Bill & Melinda Gates Foundation (and its associated organizations) to schools implementing personalized learning programs. Despite explicitly mentioning technology only once, the Gates vision of personalized learning creates an implicit imperative to use digital technologies because its view of learning and competence requires that massive amounts of data be relentlessly collected and analyzed.

When schools adopt digital education technology products to manage all these data, they cede important school and teacher functions to third-party technology vendors.

Using these products means that, as B.F. Skinner wrote about his teaching machines in the 1950s, “one programmer [is brought] into contact with an indefinite number of students.” We would add that using these products also makes programmers and the corporations that employ them, rather than local educators, responsible for decision-making about curriculum, assessment, and student progression. These decisions thus become programmed into and come under the control of the algorithms that drive proprietary software.

The dangers of relying on algorithms to make consequential decisions about people’s lives in such domains as employment, career advancement, health, credit, criminal justice and education have been well documented by, for example, Cathy O’Neil, Frank Pasquale, and RAND Corporation researchers. These authors note that although algorithms are commonly thought of as purely mathematical and objective, in reality they are not. They reflect the myriad choices made by their developers and are thus value-laden and vulnerable to significant and difficult-to-correct error.

Hidden behind a proprietary curtain, the assumptions, perspectives, ideologies, and related social-class biases of the people who develop digital personalized learning products are concealed from review and critique.

The more sophisticated the software becomes (i.e., the more adaptive it is and/or the more it relies on machine learning), the more profound and far-reaching the implications of the concealed bias become. Unlike textbooks or school board meetings, which are subject to public review, the public has no clearly defined right to review the algorithms used in education settings. Neither students nor their parents or teachers can examine how important decisions are made about their educations and lives.

The algorithms used in education technology products, such as the popular, well-funded and heavily promoted products we have reviewed (Nearpod, Canvas, and Pearson Schoolnet), present additional threats to student and teacher privacy as they collect and store terabytes of data.

There are few legal safeguards to ensure that these data are not inappropriately used or shared, or that they are properly protected from theft. To protect against built-in biases and threats to privacy, the programming of digital platforms used by schools should be reviewed by disinterested, third-party experts with a variety of perspectives, who would be tasked with flushing out the platforms’ biases and exposing their susceptibility to misuse.

There is little, if anything, in the research literature to suggest that tech-centric personalized learning programs are worth the risk of implementing them without proper safeguards in place.

The RAND Corporation research often cited in support of personalized learning actually found only small differences between personalized learning and “traditional” settings. Teachers in settings not explicitly defined as “personalized” or “competency-based” are as likely as their counterparts in personalized-learning settings to engage in a variety of personalizing activities, such as tailoring instruction to student needs and discussing individual learning progress with their students.

Teachers in both types of settings struggle to provide students with a choice of the topics and materials they study. Personalized learning settings cannot be distinguished from other educational settings when it comes to promoting so-called “21st century skills,” such as creativity, collaboration, and communication.

Interestingly, research on blended learning (i.e., settings that use both digital and nondigital pedagogical approaches) also offers little evidence that favors the use of blended learning in K-12 environments. It does suggest, however, that when blended learning outcomes are successful, it may very well be because of teachers’ efforts and not the technology used.

The lack of research support notwithstanding, digital products that effectively privatize curriculum and instruction, learning management, and assessment in schools have the potential to make a great deal of money for the companies that create and sell those products and for their investors. For this reason, tech-centric personalized learning products will likely continue to be aggressively marketed.

Currently, educators do not have the policy tools necessary to enable them to critically evaluate tech-centric personalized learning products. Given the general lack of oversight and accountability, we recommend that policymakers declare a moratorium on implementing personalized learning programs until rigorous review, oversight, and enforcement mechanisms are established.

We further recommend that states establish an independent, disinterested government entity responsible for evaluating the pedagogical approaches, assessment, and data collection embedded in digital personalized learning products.

Among the responsibilities of this independent entity would be to require that the curriculum materials and pedagogical approaches written into the education technology software used in personalized learning programs be externally reviewed and approved by independent third-party education experts to ensure that they are appropriate for intended student populations.

Further, the independent entity would ensure the security and privacy of student and teacher data by developing, implementing, and enforcing a standard, legally binding, transparent privacy and data security agreement to which all enterprises collecting student, teacher, and other data would be subject.

Without such explicit oversight of the details of how personalized learning products are designed and used, schools are at risk of compromising both their public mission and their students.

Establishing rigorous oversight is complicated and lacks the flash of the vision of personalized learning so heavily promoted by Silicon Valley. However, if personalized learning is to have any chance at all of fulfilling its promise and not harming students, it is essential that the first order of business now be developing regulatory, oversight, and accountability mechanisms — not promoting implementation.


Valerie Strauss

Valerie Strauss is the Washington Post education writer.

Faith Boninger

Faith Boninger is NEPC's Publications Manager and Co-Director of NEPC's Commercialism in Education Research Unit.

Alex Molnar

NEPC Director Alex Molnar founded NEPC with Kevin Welner in 2010. He is a Research Professor at the University of Colorado Boulder and also co-directs the Commercialism in Education Research Unit.