
Hack Education: Behaviorism, Surveillance, and (School) Work

I was a speaker today at the #AgainstSurveillance teach-in, a fundraiser for Ian Linkletter, who is being sued by the online test-proctoring software company Proctorio.

I am very pleased but also really outraged to be here today to help raise money for Ian Linkletter's defense and, more broadly, to help raise awareness about the dangers of ed-tech surveillance. It's nice to be part of an event where everyone is on the same page — politically, pedagogically — and I needn't be the sole person saying "hey wait, folks. This ed-tech stuff is, at best, snake oil and, at worst, fascist." 

The challenge, on the other hand, is to not simply repeat the things that Sava, Maha, Benjamin, Chris, and Jesse have already said. I am lucky that these five are not just colleagues but dear friends, and the love and support they have shown me and the solidarity that all of you show today give me great hope that we can build better educational practices and that we aren't stuck with snake oil or fascism. 

I will say this, even if it's been stated and restated a dozen or more times today: test proctoring is exploitative and extractive. It is harmful to all students, but particularly to those who are already disadvantaged by our institutions. To adopt test proctoring software is to maintain a pedagogical practice based on mistrust, surveillance, and punishment. To adopt test proctoring software is to enrich an unethical industry. To adopt Proctorio in particular is to align oneself with a company that has repeatedly demonstrated that it sees students, teachers, and staff as its enemy, a company that has no respect for academic freedom, for critical inquiry, or for the safety and well-being of the community that it purports to serve.

When we talk about surveillance and ed-tech, much of the focus — and rightly so — is on how this affects students. As we gather here today to raise funds for Ian, a learning technology specialist at UBC, we have to recognize how much surveillance and ed-tech affect teachers and staff as well.

This should not come as a surprise. A fair amount of education technology comes from what we call "enterprise technology" — that is, the kinds of software adopted by large corporations and government entities. Zoom, for example, was not designed for classroom use (although Stanford University was, interestingly, its first paying customer); it was designed to facilitate remote business meetings. (And I should add that it was designed rather poorly for even that. This spring, when schools and workplaces turned to videoconferencing tools en masse, the CEO of Zoom admitted that he'd never really thought about privacy and security issues on the platform. He couldn't fathom why someone would "zoom bomb" a class.) 

Enterprise technology is utterly committed to monitoring and controlling workers, although I think many white-collar professionals imagine themselves laboring outside these sorts of constraints — much like professors seem to imagine that the plagiarism and proctoring software is only interested in their students' work, not their own. Microsoft and Google offer schools "productivity tools," for example, making it quite clear — I would hope — that whatever students or staff type or click on feeds into financial measurements of efficiency. (Quite literally so in the case of Microsoft, which now offers managers a way to measure and score workers based on how often they email, chat, or collaborate in SharePoint. In ed-tech circles, we call this "learning analytics," but it's less about learning than it is about productivity.)

The learning management system, as the name suggests, is as much a piece of enterprise software as it is educational software. It too is committed to measuring and scoring based on how often its users email, chat, or collaborate. That word "management" is a dead giveaway that this software is designed neither for students nor for workers. The LMS is an administrative tool, designed at the outset to post course materials online in such a way that the institution, not the open web, could control access. 

The LMS monitors "student performance" and productivity, to be sure. But it can readily be used to surveil instructors as well — to track which graduate students are on strike based on their log-ins, for example, or to track how many hours teachers are working during the pandemic. These are real-world examples, not hypotheticals. The co-founders of Blackboard, incidentally, are back with a new startup: an administrative layer on top of Zoom to make it work "better" for school, they say, with features like attendance-taking, test-proctoring, and eye-tracking. Congratulations to education technology for taking a surveillance tool designed for the office and making it even more exploitative when turning it on students and staff.

I want to pause here to make an important point: too often, when it comes to education technology — all technology, to be honest — we get really caught up in the material (or digital) object itself. We talk about "the tool" as though it has agency and ideas all its own, rather than recognizing that technology is part of the broader practices, systems, and ideologies at play in the world around us. Technologies have histories. The LMS has a history — one bound up in the technological constraints of student information systems in the late 1990s and the ed-tech imagination about what sort of web access students should (or shouldn't) have. That history gets carried forward today — analog beliefs and practices get "hard-coded" into software, creating new pedagogical constraints that students and teachers must work with. Technology never arrives "all of a sudden," even though stories about tech and ed-tech are prone to ignore history to highlight some so-called innovation. Proctorio did not emerge, fully formed, out of Mike Olsen's head — as the goddess of wisdom emerged out of Zeus's, but here with less wisdom and more litigiousness. Online test proctoring has a history; it has an ideology. Indeed, test-taking itself is a cultural construct, embedded in longstanding beliefs and practices about schooling — what we test, how we test, "academic integrity," cheating, and so on. And the whole idea that there is rampant cheating is a nifty narrative, one that's been peddled for ages and is truly the philosophical cornerstone for how much of ed-tech gets designed.

Surveillance didn't just appear in educational institutions with the advent of the computer. So many of the deeply held beliefs and practices of education involve watching, monitoring, controlling. Watching, monitoring, controlling children and adults, students and teachers. And I'd argue that watching, monitoring, and controlling are fundamentally behaviorist practices.

We can think about the ways in which education technology has emerged from workplace technologies — as well as the ways in which schools adopt the language and culture of "productivity" — in order to control our bodies, our minds, our time, our behaviors. But much of the labor of education is reproductive labor — teaching and learning as the reproduction of knowledge and culture, teaching and learning as acts of care. What does it mean to bring surveillance and behaviorism to bear down on reproductive labor? 

Wired ran one of those awful "future of work" stories last week. Something about how too many meetings make us unhappy and unproductive and therefore artificial intelligence will "optimize" all this by scheduling our appointments for us, by using facial recognition and body language-reading software to make sure we're paying attention and staying on track and maximizing our efficiency. The underlying message: more surveillance will make us better workers. 

It's not hard to see how this gets repackaged for schools.

But I'm interested not just in this question of productive labor but in reproductive labor as well. These are sort of loosely formed ideas, so bear with me if they're not fully baked. What does it mean to automate and algorithmically program reproductivity? What does it mean, furthermore, when the demands of the workplace and the demands of school enter the site most commonly associated with reproductive labor — that is, the home?

Of course, we have for almost one hundred years now been developing technologies to automate the home and telling stories about the necessity of doing so. That's the other influence that joins enterprise tech: consumer tech.

Let me tell you a quick story that bridges my book Teaching Machines with the next project I'm working on — this one on the history of the baby monitor (I mean, we have a thing called a "baby monitor," we have bought this product for almost a century now, and all of a sudden folks want to talk about how children are so heavily surveilled. Good lord, read some history.) Anyway, let's talk about B. F. Skinner's efforts to develop a behaviorist technology for child-rearing: the "air crib."

Let's talk about Skinner because we should recognize that so much of surveillance in ed-tech comes from his behaviorist bent — his belief that we can observe learning by monitoring behavior and we can enhance learning by engineering behavior.

Skinner fabricated a climate-controlled environment for his second child in 1944. First called the "baby tender" and then – and I kid you not – the "heir conditioner," the device was meant to replace the crib, the bassinet, and the playpen. Writing in Ladies' Home Journal one year later, Skinner said,

When we decided to have another child, my wife and I felt that it was time to apply a little labor-saving invention and design to the problems of the nursery. We began by going over the disheartening schedule of the young mother, step by step. We asked only one question: Is this practice important for the physical and psychological health of the baby? When it was not, we marked it for elimination. Then the "gadgeteering" began.

The crib Skinner "gadgeteered" for his daughter was made of metal, larger than a typical crib, and higher off the ground – labor-saving, in part, through less bending over, Skinner argued. It had three solid walls, a roof, and a safety-glass pane at the front which could be lowered to move the baby in and out. Canvas was stretched across the bottom to create a floor, and the bedding was stored on a spool outside the crib, to be rolled in to replace soiled linen. It was soundproof and "dirt proof," Skinner said, but its key feature was that the crib was temperature-controlled, so, save for the diaper, the baby was kept unclothed and unbundled. Skinner argued that clothing created unnecessary laundry and inhibited the baby's movement and thus the baby's exploration of her world.

This was a labor-saving machine — see, we even talk about reproductive labor in terms of productivity. Skinner boasted that the air crib meant it would take only "about one and one-half hours each day to feed, change, and otherwise care for the baby." Skinner insisted that his daughter, who stayed in the crib for the first two years of her life, was not "socially starved and robbed of affection and mother love." He wrote in Ladies' Home Journal that

The compartment does not ostracize the baby. The large window is no more of a social barrier than the bars of a crib. The baby follows what is going on in the room, smiles at passers-by, plays "peek-a-boo" games, and obviously delights in company. And she is handled, talked to, and played with whenever she is changed or fed, and each afternoon during a play period, which is becoming longer as she grows older.

This invention never caught on, in no small part because the title of that Ladies' Home Journal article – "Baby in a Box" – connected the crib to the "Skinner Box," the operant conditioning chamber that he had designed for his experiments on rats and pigeons, thus associating the crib with the rewards and pellets that Skinner used to modify these animals' behavior in his laboratory. Indeed, Skinner described the crib's design and the practices he and his wife developed for their infant daughter as an "experiment" – a word that he probably didn't really mean in a strict scientific sense but that possibly suggested to readers that this was a piece of lab equipment, not a piece of furniture suited for a baby or for the home. The article also opened with the phrase "in that brave new world which science is preparing for the housewife of the future," and many readers would likely have been familiar with Aldous Huxley's 1932 novel Brave New World, thus making the connection between the air crib and Huxley's dystopia in which reproduction and child-rearing were engineered and controlled by a techno-scientific authoritarian government. But most damning, perhaps, was the photo that accompanied the article: little Deborah Skinner enclosed in the crib, with her face and hands pressed up against the glass.

Skinner's call to automate child-rearing also coincided with the publication of Dr. Benjamin Spock's The Common Sense Book of Baby and Child Care, which rejected the behaviorist methods promoted by psychologists like John B. Watson — strict, data-driven timetables for feeding and toilet training and so on — and argued, contrary to this sort of techno-scientific endeavor, that mothers — and it was mothers — should be more flexible, more "loving," more "natural."

Now, there are plenty of problems with this naturalized domesticity, to be sure — a naturalized domesticity that is certainly imagined as white, affluent, and American. And some of these are problems that the teaching profession, particularly at the K-12 level, still wrestles with today — the problems of white womanhood.

The air crib, psychologists Ludy Benjamin and Elizabeth Nielsen-Gamman argue, was viewed at the time as a "technology of displacement" – "a device that interferes with the usual modes of contact for human beings, in this case, parent and child; that is, it displaces the parent." It's a similar problem, those two scholars contend, to that faced by one of Skinner's other inventions, the teaching machine – a concept he came up with in 1953 after visiting Deborah's fourth-grade classroom. Skinner's inventions both failed to achieve widespread adoption, they argue, because they were seen as subverting valuable human relationships – relationships necessary to child development.

But I'm not sure that the ideas behind either the teaching machine or the air crib were truly rejected. If nothing else, they keep getting repackaged and reintroduced — at home, at work, at school.

The question before us now, I'd argue, is whether or not we want behaviorist technologies — and again, I'd argue all behaviorist technologies are surveillance technologies — to be central to human development. Remember, B. F. Skinner didn't believe in freedom. If we do believe in freedom, then we have to reject not just the latest shiny gadgetry and anti-cheating bullshittery but also over a century of psychotechnologies and pedagogies of oppression. That's a lot of work ahead for us.

But if we just bite off one chunk, one tiny chunk, let's make sure Proctorio is wildly unsuccessful in all its legal and its business endeavors.


Audrey Watters