
Sherman Dorn: Student Privacy and Social Infrastructure

Current debates over student privacy should remind us that infrastructure, including policy and technological infrastructure, leaves legacies for decades.1 This weekend, the New York Times published a column by Sue Dynarski about one proposal to change student privacy laws, a proposal that would make it almost impossible to use administrative records for educational research. Dynarski identifies the problematic proposal as S. 1341, a bill sponsored by Louisiana Senator David Vitter. Vitter’s bill is one of several in Congress addressing student privacy issues and currently has no co-sponsors–in other words, at the moment it is highly unlikely to become law.2 The provision that worries Dynarski is the requirement for active parental consent for the transfer of any educational records to a third party, with a separate consent process for each such transfer.3

At its core, Dynarski’s argument is that educational research on the effectiveness of programs is too important to wipe away with concerns about privacy. On some issues she is wrong. Dynarski wrote that she knows of no “Target-like” breaches of educational data. She works at the University of Michigan. I am guessing that its servers are under regular attack by hackers, as are the servers of many other colleges and universities; there were several publicly known breaches last year.4 I could also quibble with her claims about the “original purpose” of student data records.5 However, both issues are tangential to the central question of whether it is possible to protect student privacy today and still allow all of the types of educational research Dynarski and I value.

One problem is that we inherited the federal legal structure protecting student privacy from an era that differed from ours in three fundamental ways: the original Family Educational Rights and Privacy Act (FERPA) language of 1974 was motivated by a different problem, was written long before easy data transfers, and appeared when most educational records were held directly by school districts, colleges, and universities. We can easily see today that technology creates new avenues for threats to data privacy. It is harder to see the other two differences.

The history of the original language makes clear that a primary concern motivating FERPA was not privacy in an Internet era: it was the routine practice, at the time, of schools hiding educational records from students and their parents. From the Congressional Record of December 13, 1974:

The purpose of the Act is two-fold–to assure parents of students, and students themselves if they are over the age of 18 or attending an institution of postsecondary education, access to their education records and to protect such individuals’ rights to privacy by limiting the transferability of their records without their consent…. Under the Family Educational Rights and Privacy Act, a parent is given the right to challenge the contents of his child’s records to insure that they are not inaccurate, misleading, or otherwise in violation of the student’s privacy or other rights. (vol. 120, p. 39862)6

FERPA is the federal law that established the fundamental right to see your own or your child’s educational records and, if they are in error, to challenge their accuracy. Congress enacted the language in an era when schools regularly classified students as mentally retarded without seeking the permission of families to assess children and without having to show parents the relevant data.7 FERPA also has prevented schools from casually sharing educational records without the consent of students or their parents. As is the case today, the early 1970s were a time when Congress scaled back some of the national-security surveillance state, and FERPA’s creation was partly in that spirit, but not entirely. FERPA did not eliminate the ability of law enforcement to request information from schools; on the other hand, it did prohibit sharing information such as discipline and transcript records with potential employers and many others without student or parent consent.

After FERPA’s passage, school systems raised questions that suggest they had taken a casual approach to sharing information about students: the Senate approved amendments to FERPA on December 13, 1974 (the date of the passage quoted above), and it is clear in context that school officials had asked for clarification because many of them had never considered protecting student privacy to be something that required a broad and carefully considered approach. They scrambled in response to the law because they panicked, and they panicked because they had never prioritized privacy.

After that initial scrambling, schools created procedures to protect student records. Those procedures are the legacy of the law, a legal and policy infrastructure. Like roads, procedures are paths that schools and their employees can follow because it is easier and safer to do so than to drive in the territory of ad hoc decisions, where all sorts of accidents happen. Smart teachers and administrators like procedures because they want to avoid legal accidents in the same way that good drivers avoid accidents on the road.8

Here is the key point: All of the violations of good sense prevented by FERPA involve records created and held by schools. That was fine in an era when records were non-digital or held only by schools and colleges. That era is over today, not only because of technology but because of the relationships that technology makes possible. Today, schools are not the only entities that create educational records: if your school or college contracts with a private vendor for some educational service, the private business is often creating educational records tied to individual students. If an individual teacher or faculty member uses a service such as Voicethread, Edmodo, or any of dozens of others, they are often creating records held on the servers of those private businesses. If an individual teacher or faculty member keeps records on their personal computer, and that computer is backed up in the cloud, some private party has that data. In some cases, but not all, school systems, colleges, and universities require their vendors to meet standards that protect students’ FERPA rights. In almost no case that I am aware of do individual teachers and faculty read the Terms of Service for the private services they take the initiative to use. The procedural infrastructure of FERPA was built for the era of paper records held by schools and colleges. It is not just technology that has ended that era. It is how technology has enabled different relationships between teachers, schools, and private businesses.

But education is not the only area in which privacy is threatened by how technology makes oversharing so easy. Ask everyone whose IRS or commercial records have been hacked. Nor is “technology for sharing” limited to computers: ask the family of Henrietta Lacks, whose cervical cancer cells became the first cell culture to be broadly used, without the consent of either Lacks or her family. So addressing student privacy is a tiny piece of the puzzle. Instead of focusing just on student privacy, we need to address privacy more broadly, with a general principle that can then be fleshed out by specifics for each area, whether it be medical records, financial records, or educational records.

One place to go might be the concept of multiple layers of rights to one’s data, where the layers can be tuned to specific uses. In 2009, MIT researcher Alex Pentland proposed what he called the New Deal on Data:

The first step toward open information markets is to give people ownership of their data. The simplest approach to defining what it means to “own your own data” is to go back to Old English Common Law for the three basic tenets of ownership, which are the rights of possession, use, and disposal:

  1. You have a right to possess your data. Companies should adopt the role of a Swiss bank account for your data. You open an account (anonymously, if possible), and you can remove your data whenever you’d like.
  2. You, the data owner, must have full control over the use of your data. If you’re not happy with the way a company uses your data, you can remove it. All of it. Everything must be opt-in, and not only clearly explained in plain language, but with regular reminders that you have the option to opt out.
  3. You have a right to dispose of or distribute your data. If you want to destroy it or remove it and redeploy it elsewhere, it is your call. Ownership seems to be the minimal guideline for the “new deal on data.”

There needs to be one more principle, however—which is to adopt policies that encourage the combination of massive amounts of anonymous data to promote the Common Good. Aggregate and anonymous location data can dramatically improve society. Patterns of how people move around can be used for early identification of infectious disease outbreaks, protection of the environment, and public safety. It can also help us measure the effectiveness of various government programs, and improve the transparency and accountability of government and nonprofit organizations.9

Pentland is vague about the specific mechanisms that would allow data sharing and use in ways consistent with the principles of ownership. What is the limit of disposal/distribution rights? I think I know where Europe is headed: people will have a right to dispose of their data, including telling Google to erase their data from its web index. That would probably eliminate the viability of research using administrative data. Yet the idea of different layers of rights is a useful one, even if this particular configuration is hard to put into action. If it can be worked out, it would establish a different kind of social infrastructure, something that might be more flexible, less tied to a specific era’s way of organizing information.
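To make the idea of layered rights a little more concrete, here is a minimal sketch, in Python, of how per-use consent layers might be represented. Everything in it is hypothetical illustration on my part: the class names, the purposes, and the opt-in defaults come from neither Pentland’s proposal nor any bill. The point is only that revoking one layer of consent need not disturb the others.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class Right(Enum):
    """The three tenets Pentland borrows from common-law ownership."""
    POSSESSION = auto()   # the student/family can obtain a copy
    USE = auto()          # a named party may use the data for a named purpose
    DISPOSAL = auto()     # the student/family can demand deletion or redeployment


@dataclass
class ConsentLayer:
    """One 'layer' of rights, tuned to a specific use of a student record.

    The purposes ('district operations', 'de-identified research',
    'vendor analytics') are hypothetical categories for illustration.
    """
    purpose: str
    holder: str                                 # who holds the records for this purpose
    retained: set = field(default_factory=set)  # which Rights the data subject retains
    opt_in: bool = False                        # Pentland's principle: everything is opt-in


@dataclass
class StudentRecordPolicy:
    """A per-student ledger of consent layers, one layer per use."""
    student_id: str
    layers: list = field(default_factory=list)

    def may_use(self, purpose: str) -> bool:
        """A purpose is permitted only if a matching layer exists and was opted into."""
        return any(l.purpose == purpose and l.opt_in for l in self.layers)

    def revoke(self, purpose: str) -> None:
        """Exercise the disposal right for one purpose without touching the others."""
        self.layers = [l for l in self.layers if l.purpose != purpose]


if __name__ == "__main__":
    policy = StudentRecordPolicy(
        student_id="hypothetical-0001",
        layers=[
            ConsentLayer("district operations", "school district",
                         {Right.POSSESSION, Right.USE, Right.DISPOSAL}, opt_in=True),
            ConsentLayer("de-identified research", "state longitudinal data system",
                         {Right.DISPOSAL}, opt_in=True),
            ConsentLayer("vendor analytics", "third-party vendor",
                         {Right.POSSESSION, Right.DISPOSAL}, opt_in=False),
        ],
    )
    print(policy.may_use("de-identified research"))  # True: opted in
    print(policy.may_use("vendor analytics"))        # False: never opted in
    policy.revoke("de-identified research")
    print(policy.may_use("de-identified research"))  # False after revocation
    print(policy.may_use("district operations"))     # still True: untouched by the revocation
```

The flexibility the previous paragraph is after shows up in the last two lines: withdrawing consent for one use leaves the other layers intact.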

I have a suggested name for the bill that could carry this forward–not David Vitter’s bill by a long shot, but something that would (among other things) balance the rights of students and families against the fact that public investment in education exists because education is a public good. The name for that bill: The Henrietta Lacks Privacy Act.

 

Notes

  1. That is also a theme of a recent Backstory Radio podcast. [↩]
  2. It is not entirely clear who worked with Vitter to draft the bill–from the few websites mentioning the bill and some of the content, possibly the Home School Legal Defense Association and conservative activist Susan Effrem. [↩]
  3. This is the proposed LIMITATIONS ON THIRD PARTY USE section. There is another section that prohibits the use of federal funds for any data matching. That would not prohibit data matching with non-federal funds, but I think it would kill large dataset projects funded by federal grants except where each program is explicitly authorized by language specific to that program. [↩]
  4. Hat tip: Barmak Nassirian. [↩]
  5. The Progressive Era reformers who first collected and published reams of educational data in city school systems thought they were promoting improvement; one could argue they were doing a better job of promoting inequality by class, gender, and race, using data for those ends. [↩]
  6. The federal Department of Education’s website also has a brief legislative history of FERPA. [↩]
  7. Public Law 94-142 established procedural safeguards for special education; it passed the year after FERPA. [↩]
  8. Yes, there are educational equivalents of aggressive drivers, and Rick Hess’s “cage-busting” book series is a response to timid driving. The point here is not that all procedures are brilliant, but that they form a very human infrastructure. [↩]
  9. Pentland, “Reality Mining of Mobile Communications: Toward a New Deal on Data,” in Soumitra Dutta and Irene Mia, eds., The Global Information Technology Report 2008-2009: Mobility in a Networked World, Geneva, 2009, pp. 79-80. [↩]

This blog post has been shared by permission from the author.
Readers wishing to comment on the content are encouraged to do so via the link to the original post.

The views expressed by the blogger are not necessarily those of NEPC.

Sherman Dorn

Sherman Dorn is the Director of the Division of Educational Leadership and Innovation at the Arizona State University Mary Lou Fulton Teachers College, and editor...