Experts recommend policies to enhance student data privacy in education technology platforms.
The surge in online learning prompted by the COVID-19 pandemic created a seller’s market for education technology companies. Long before the surge, however, a cultural shift in education had already encouraged the digitization of the industry. Now, many education leaders are scrutinizing the collection and use of student data as they seek to interpret, understand, and comply with privacy regulations designed to safeguard sensitive student information.
Under the Family Educational Rights and Privacy Act (FERPA) of 1974, educational institutions are prohibited from sharing students’ “personally identifiable information” without parental consent. FERPA applies to all education providers receiving federal funds and protects students’ education records in both paper and digital form. In 1978, federal lawmakers expanded the protections granted under FERPA by passing the Protection of Pupil Rights Amendment (PPRA), which grants parents and students the right to opt out of federally supported surveys or evaluations that concern a number of protected topics.
The regimes established by FERPA and PPRA are implemented by the U.S. Department of Education and principally address schools’ obligations. FERPA and PPRA rules do not apply to education technology companies—so-called “ed-tech” firms. The Children’s Online Privacy Protection Act (COPPA), passed in 1998, however, is enforced by the Federal Trade Commission (FTC) and prohibits operators of online services, commercial websites, and children’s apps from collecting and disclosing data about children under age 13 without parental consent.
As the use of ed-tech has continued to soar in recent years, industry leaders have looked beyond the three primary federal student privacy laws and turned to self-regulation as a means of protecting student data. In 2014, the Software & Information Industry Association and the Future of Privacy Forum developed the Student Privacy Pledge, an industry pledge whereby ed-tech companies make public statements detailing their student data privacy practices for the sake of accountability.
Although signing the Pledge is voluntary, the FTC can use companies’ public commitments to bring civil enforcement actions against any of the 400-plus signatories that fail to protect student data. Critics point out, however, that the FTC has yet to bring such an action. Some advocates accordingly call for reinforcing traditional student privacy protections to address the growing digital education landscape.
In this week’s Saturday Seminar, scholars explain gaps in the regulation of student data privacy and propose methods to better protect student information and data.
- Current debates about student privacy fail to include increasingly popular online learning platforms that offer learning experiences directly to users, suggest Elana Zeide of the University of Nebraska College of Law and Helen Nissenbaum of Cornell Tech. In an article published in Theory and Research in Education, Zeide and Nissenbaum highlight how two types of platforms—Massive Open Online Courses (MOOCs) and virtual learning environments (VLEs)—fall outside the scope of student privacy regulation because they gather personally identifiable information directly from learners without school mediation. They argue that operators of MOOCs and VLEs should go beyond compliance with commercial data use regulations to preserve student privacy norms specific to the education sector.
- In an article published in the Duke Law & Technology Review, Alexi Pfeffer-Gillett of the University of Maryland Carey School of Law argues that education software companies are not complying with the Student Privacy Pledge. After analyzing the privacy policies of eight companies that have signed the Pledge, Pfeffer-Gillett explains that seven are in violation of at least one of the Pledge’s core promises. Apple, for example, collects personally identifiable information and engages in behavioral targeting of advertisements. Pfeffer-Gillett also notes that companies that have not signed the Pledge are not necessarily less compliant with the Pledge’s standards. Instead, he suggests that “the Pledge may be more valuable as a public relations tool than as a means of actually effecting … industry improvements.”
- De-identification of student data alone cannot adequately protect student privacy, Elad Yacobson of the Weizmann Institute of Science and several coauthors argue in an article published in the Journal of Learning Analytics. Using machine-learning algorithms to analyze and cluster unlabeled datasets, Yacobson’s team was able to re-identify personal information from de-identified student interaction data. Yacobson and his coauthors could even identify when a selected group of gifted children went on a school field trip. Noting that there is no “silver bullet” for education data privacy, Yacobson and coauthors contend that privacy technology must be accompanied by clear regulation and increased awareness among educators.
- In a research paper in Research and Practice in Technology Enhanced Learning, Tore Hoel of Oslo Metropolitan University and Weiqin Chen of Augusta University examine data sharing through a pedagogical lens. Hoel and Chen suggest three principles for consideration in educational data privacy policies. First, privacy and data protection should be achieved through the negotiation of data sharing with individual students. Second, educational institutions should be transparent about their decisions to access data, and such access should pass a necessity standard. Finally, schools and universities should use data sharing negotiations as an opportunity to increase data literacy.
- In a paper in the Virginia Journal of Law and Technology, N. Cameron Russell of the Fordham Center on Law and Information Policy and several coauthors identify a legal and regulatory gap in the sale of student information: existing privacy laws do not encompass the sale of student information by data brokers. Russell’s team advocates transparency in the commercial marketplace for student data. They argue that brokers should be required to follow procedures that promote data accuracy, such as an obligation to notify downstream data users of inaccuracies. They also favor providing opt-out rights to parents and emancipated students and recommend that schools inform students and families how their survey results will be used commercially before administering surveys.
- In a chapter of the Cambridge Handbook of Consumer Privacy, Elana Zeide of the University of Nebraska College of Law argues that traditional student privacy regulations fall short in “an era of big data.” Zeide recommends best practices for education technology companies to cultivate trust among stakeholders, such as providing sufficient transparency and accountability. She also suggests upholding traditional expectations that students’ personally identifiable information will remain within schools rather than be sold to for-profit companies.
The Saturday Seminar is a weekly feature that aims to put into written form the kind of content that would be conveyed in a live seminar involving regulatory experts. Each week, The Regulatory Review publishes a brief overview of a selected regulatory topic and then distills recent research and scholarly writing on that topic.