[NOTE: I was honored to be asked by the Virginia Association of Secondary School Principals (VASSP) to be a presenter at the School Law Institute of their annual conference. The Institute was on Sunday and I am posting the text of my remarks below, along with the slides (which are mostly just images). It’s a long post, but I hope it is worth your time and consideration.]




Thank you to Randy Barrack and VASSP and the Virginia Department of Education for inviting me to be a part of the School Law Institute today. I’m in the leadoff spot here, so I see my job as doing whatever it takes to get on base so that the heavy-hitting attorneys behind me can bring us all home.

Speaking of attorneys, I need to offer a disclaimer here, one I do whenever I give a talk like this:

I, Jonathan D. Becker, J.D., Ph.D., am an attorney at law, but by education only.  That is, I have a law degree and I am admitted to the Bar of the State of New Jersey (not Virginia!).  However, I have never practiced law and I am currently officially “retired completely from the practice of law” in every jurisdiction.  Therefore, I cannot, among other things, “render advice on the law or legal assistance.” As a result, all discussions and communications related to this presentation or beyond should be considered academic in nature; I will answer questions and offer opinions (readily), but they should be considered answers and opinions based on my education and understanding of information, and not legal advice.

So, yes, I have lots of thoughts and opinions. In fact, here are some opinions, or things that I like.

I like chocolate ice cream. That’s apropos of absolutely nothing, but I just thought you would like to know that about me.

On to more relevant things…

I like technology. I have spent over 20 years researching, writing and teaching at the intersection of education and technology. And, personally, I admit I am a straight up geek. I like to play with new digital technologies in all facets of my life.

I like data and information. I am the guy who won’t buy a new pencil without getting as much information as I can about the best and most cost-effective pencil. I am a voracious reader and consumer of information.

I like safety and especially school safety. I am a parent of an 8 year-old and a 13 year-old and I think all schoolchildren should be able to attend a school that is safe and a desirable place to be each and every school day. I know you feel the same way.

I like learning. You may have noticed that I have a law degree and a doctoral degree. When I was a kid, the National Basketball Association (NBA) ran commercials where star players exhorted us to “stay in school.” I took them both literally and seriously, staying in school until I was over 30 years old.

What happens when you put things you like together and mix them up? Sometimes the result is good, sometimes it’s bad. I mean, you can probably think of four of your favorite things to eat. Some combinations of those foods might be good, others not so much.

When I combine the four things on the screen, I come up with this claim:

New digital technologies massively increase our capacity for data collection and information management in ways that create incredible affordances for school safety and for student learning.

That strikes me as a fairly unassailable claim. I’m happy to hear otherwise, but not now; maybe at the reception tonight you can tell me how I’m wrong. But, I have been involved in the field of educational technology for over two decades and I have seen some amazing possibilities. If I may borrow from Seymour Papert, one of the forefathers of educational technology, I have seen how technology-mediated learning experiences can result in hard fun for kids. Look at those kids on the screen. Isn’t that what we want for kids in our schools all the time?

So, let’s revisit my claim. Hard to argue, right? Well, I need to add a couple of caveats. I need to suggest a level of criticality by adding a “however” to the mix. New digital technologies massively increase our capacity for data collection and information management in ways that create incredible affordances for school safety and for student learning. HOWEVER, there are serious legal issues for us to consider at the intersection of technology, data, information, safety and learning. Also, new digital technologies massively increase our capacity for data collection and information management in ways that create incredible affordances for school safety and for student learning, HOWEVER, we need to approach this all with some caution; a certain level of criticality. We need a mantra; one that I try to teach my own children all the time:


Let me repeat that, because it will become a refrain for us this afternoon…


And that’s what I will focus on for the remainder of my time this afternoon. That is, my plan is to talk you through the legal considerations at the intersection of technology, data, information, safety and learning, and to help you think critically about the issues.

Are you ready for that? Are you excited?

I am proud to be doing this kind of work in the Commonwealth of Virginia. Legally, we do a pretty good job of staying ahead of the curve. In fact, we were the first state in the country to pass the Student Privacy Pledge Model Bill, with unanimous votes by both the House of Delegates and Senate. And in this last legislative session, our law with respect to student records (our FERPA law, if you will) was updated in some good (and some not so good) ways. I will speak to those laws more directly later on, but they provide a good legal framework for thinking about issues of data/information + privacy/security. Specifically, these laws speak to data collection, use and sharing. So, that’s how I will present the issues.


We collect a LOT of data in schools these days. We collect data about students and about faculty and staff. Most of those data are useful, important and harmless. One could make the argument that the student information system has been one of the most important educational technologies of the last couple of decades. Among other things, student information systems allow for a much more efficient operation of schools and allow school administrators to work smarter and not harder. And, the kinds of data we collect and store in student information systems are probably the kinds of data most people think of when we speak of data and schools.

We collect other kinds of data in schools, too. In the name of school safety, we have cameras posted throughout our schools. Those cameras collect lots and lots of data about the actions and behaviors of students and staff in our schools. Those data are stored on servers somewhere within schools and/or division offices. And, absent a silly decision to put cameras in places like bathrooms or locker rooms, these kinds of cameras are generally legally permissible. Also legally permissible in some cases is the collection of data about what is in our students’ bodies. The Supreme Court has upheld, against 4th Amendment challenges, drug testing of student-athletes as well as students who participate in extracurricular activities.

And, the 4th Amendment to the U.S. Constitution is the legal issue to keep in mind when we talk about data collection in schools, particularly collecting data using technologies like surveillance cameras and drug testing equipment. The 4th Amendment was added to the Bill of Rights of the United States Constitution to codify the right of people to be secure and protected from unreasonable searches carried out by the government or the state. Just a couple of days ago, the Supreme Court added to our 4th Amendment jurisprudence by issuing an important ruling limiting the right of government officials to search for data generated by our cell phones. I haven’t had a chance to read the opinion yet, but USC law professor Orin Kerr, an expert on these matters, wrote on Twitter that “[a] pretty good one-sentence summary of Carpenter is that you now have a right not to be monitored too much without a warrant.”

Schools, of course, are different. Since the 1985 SCOTUS decision in New Jersey v. T.L.O., we know that courts are fairly deferential to administrators and understand the importance of safe schools. Thus, generally, schools do not need a warrant to conduct a search; they only need reasonable suspicion. Furthermore, so long as school administrators can satisfy a two-part test, their searches would be considered constitutional. A search must be justified at its inception, and a search must be “reasonably related in scope to the circumstances which justified the interference in the first place.” That’s a fairly low bar for schools to clear to make a constitutionally acceptable search. And, for the most part, judges and courts of law have been pretty deferential to school administrators. The law may be on your side.

HOWEVER, this is where I think some caution and criticality are warranted.

Repeat after me:


Why do I say that? Well, consider some new technologies that are being utilized and that are collecting data in the name of school safety and student learning. In the wake of the more recent school shootings, some schools and districts are looking into facial recognition technologies as part of their security profiles. These technologies, however, have not been shown to be effective for these purposes, and they have also proved biased in some particularly discriminatory ways. I will say more about illegal discrimination later.

Make no mistake, scanning someone’s face is a search. So, using technology to collect scans of people’s faces will surely be subject to legal scrutiny, and it may even pass muster. But, in my mind at least, it raises the creep factor. If you can stand the pun, the “optics” of using facial recognition are not good right now. And, this technology is just the tip of an iceberg. Benjamin Herold of Education Week has been doing a terrific job of tracking (again, pun intended) the growing interest by school administrators in technologies that try to “measure, monitor, and mold students’ feelings, mindsets, and ways of thinking.” And it doesn’t stop there. We have technology companies now that want to help us collect brain wave data to monitor the attention of our students.

Remember, I love technology; I consider myself a technology advocate. But, call me crazy, I kinda think that if we are interested in students’ feelings (and we very much should be), we could just ask them. And, if we are concerned with students paying attention, we could just look at them. There are certainly legal implications of these technologies, but for now, I will only say that just because we can (maybe!) collect those data by technology and store them on servers doesn’t mean we should.


We don’t collect data just to store them in student information systems. We collect data in schools to be used to improve student achievement and for lots of other important outcomes including school safety. “Data-driven decision making” is a phrase that picked up steam a number of years ago, but it’s not going away any time soon. Books like this one continue to be written today.

And new digital technologies mean we have new ways and means to use data. For example, we have learning management systems. And, perhaps the hottest topic of the day is adaptive or personalized learning. Here we see a heuristic of adaptive learning where content and data run through intervention and adaptation engines, and predictive models send student data to dashboards so that, presumably, teachers or the computers themselves can “adapt” the content to meet the student where he or she is. And then there is personalized learning, adaptive learning’s close cousin. Everybody and nobody can define personalized learning these days, but nearly every definition involves a whole lot of “analytics”: algorithms that purport to describe and diagnose and prescribe and predict learning.
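To make that heuristic concrete, here is a minimal, hypothetical sketch of the adaptive-learning loop just described: student response data update a predictive model of mastery, and an “adaptation engine” selects the next piece of content. Every name and threshold below is illustrative, not any vendor’s actual system.

```python
# Hypothetical sketch of an adaptive-learning loop.
# The mastery model and thresholds are invented for illustration.

def update_mastery(mastery: float, correct: bool, rate: float = 0.2) -> float:
    """A crude predictive model: nudge the mastery estimate toward 1.0
    on a correct answer and toward 0.0 on an incorrect one."""
    target = 1.0 if correct else 0.0
    return mastery + rate * (target - mastery)

def pick_next_item(mastery: float) -> str:
    """The 'adaptation engine': match content to the current estimate."""
    if mastery < 0.4:
        return "remedial practice"
    elif mastery < 0.8:
        return "on-level practice"
    return "enrichment task"

# Simulate a short session for one student starting at 0.5.
mastery = 0.5
for correct in [True, True, False, True, True]:
    mastery = update_mastery(mastery, correct)

print(round(mastery, 3), pick_next_item(mastery))  # 0.708 on-level practice
```

The point of the sketch is that the “adaptation” is just a rule applied to a model’s estimate; whatever assumptions are baked into that model and those rules quietly shape what each student is offered.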

This seems all well and good and there can’t possibly be any legal issues here, right? Well, not so fast, my leadership friends. I’ve been working through a body of literature recently and it’s all very important, compelling, and, well, disconcerting. One of the first articles I read was a piece by ProPublica about predictive policing. Similar to the idea of personalized learning with its predictive learning analytics, police departments across the country bought into software that claimed to be able to predict future criminals. Taking a number of factors into account, this software generated a risk score for people who were in the criminal justice system. That score was used during hearings and was factored into decisions about sentencing. But, when ProPublica got the data from the system, they found that “the score proved remarkably unreliable in forecasting violent crime: Only 20 percent of the people predicted to commit violent crimes actually went on to do so. When a full range of crimes were taken into account — including misdemeanors such as driving with an expired license — the algorithm was somewhat more accurate than a coin flip.” Also, the formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants. And, white defendants were mislabeled as low risk more often than black defendants.
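The kind of audit ProPublica performed can be sketched in a few lines: given who was flagged high-risk and who actually reoffended, compare false-positive rates across groups. The records below are made up solely for illustration and do not reproduce ProPublica’s data or figures.

```python
# Hypothetical audit for disparate false-positive rates.
# The records are invented; only the method is the point.

def false_positive_rate(records, group):
    """Share of people in `group` who did NOT reoffend but were
    nonetheless flagged as high risk."""
    negatives = [r for r in records
                 if r["group"] == group and not r["reoffended"]]
    flagged = [r for r in negatives if r["flagged_high_risk"]]
    return len(flagged) / len(negatives)

records = [
    {"group": "A", "flagged_high_risk": True,  "reoffended": False},
    {"group": "A", "flagged_high_risk": True,  "reoffended": False},
    {"group": "A", "flagged_high_risk": False, "reoffended": False},
    {"group": "A", "flagged_high_risk": False, "reoffended": False},
    {"group": "B", "flagged_high_risk": True,  "reoffended": False},
    {"group": "B", "flagged_high_risk": False, "reoffended": False},
    {"group": "B", "flagged_high_risk": False, "reoffended": False},
    {"group": "B", "flagged_high_risk": False, "reoffended": False},
]

# In this invented sample, group A is falsely flagged at twice group B's rate:
print(false_positive_rate(records, "A"))  # 0.5
print(false_positive_rate(records, "B"))  # 0.25
```

An audit this simple is exactly what any school leader could ask a vendor to support: give us the labels and the outcomes, broken out by group, and let us check.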

More recently, I have been reading through these books that are all essentially about this idea of “algorithmic discrimination.” Consider just the subtitles:

  • How big data increases inequality and threatens democracy
  • How search engines reinforce racism
  • How high-tech tools profile, police, and punish the poor

No educator can read these articles and books and not grow terribly concerned about the sorts of inequalities that might be built into our algorithms; our “predictive learning analytics.” Bias and inequality perpetrated by the government is what the Equal Protection Clause of the 14th Amendment is meant to stop.

What if, like the policing software, the algorithms that drive our new personalized learning programs discriminate against students of color? That would be a serious equal protection problem. We have serious equal protection problems in how we discipline students and in how we assign students to special education and gifted education programs. The Office for Civil Rights of the U.S. Department of Education has been sniffing around school divisions in central Virginia recently for this very reason. That is, there is reason to believe that students of color are being disproportionately disciplined and disproportionately assigned to special education.

Well, those are decisions made by humans, you say, and personalized learning software is computer-based. We know that humans can be biased, but computers are neutral, you add. Well, not exactly; humans program the computers.

Ultimately, it might be a reach to argue that how we use data in schools is discriminatory and a violation of the Equal Protection Clause of the 14th Amendment. Maybe. But, read those books and then let’s talk. Let’s at least entertain the possibility and agree that as school leaders, you should keep an eye on how data are run through computers and algorithms in our schools and used to make decisions, and that you should keep an eye on who ultimately gets sorted in what ways.

And, even if it’s all perfectly legal, I have to return to our refrain. Repeat after me:


Why do I say that again?

How about “educational genomics?” Yes, you read that headline right. Scientists are now looking to collect genetic data to help us personalize learning. Creeped out yet? What could go wrong, right?

Well, it’s precisely that question of what could go wrong that has me most concerned from a legal perspective. It’s one thing if there is a data breach and a hacker gets control of, say, student test score data. But, what if they get a hold of your genetic data which was collected, presumably, in the name of improving student learning?


And that brings us to the final set of issues around data sharing. Here, I should be clear, there are two possibilities: data might be shared intentionally, and data might be shared unintentionally. In either case, we have some legal issues to consider.

Schools share data intentionally for all kinds of reasons. We share lots of data with educational technology companies. As we speak, thousands of educators are gathered in Chicago at ISTE, the largest ed. tech. conference in this country. The vendor floor there is enormous and filled with companies that collect student data, often for good and important purposes. At the simplest level, for many of us, we share email data with Google or other companies. We share data intentionally for research purposes. Or at least, as an educational researcher, I hope you do. And, when you do, I hope we all follow the proper protocols to engage in an ethical exchange of data and information. There are many, many ways we intentionally share data with third parties in education.

And, data sharing, even when intentional, raises some legal issues. Earlier, I mentioned a couple of Virginia-specific laws related to student data privacy. We have a statute specifically related to student data and school service providers. It’s a comprehensive law with lots of provisions that mostly relate to the providers. That is, it mostly tells school service providers what they cannot do with data they collect. These are just a few of the provisions, specifically the ones related to the sharing of data. For example, vendors must have a comprehensive security program so that the data don’t get shared unintentionally. Also, vendors cannot turn around and sell any data they collect to other vendors. If you are not already, I strongly encourage you to become familiar with this law.

Also, during the last legislative session, Governor Northam signed into law some changes to our pupil records law. These changes are related to FERPA and FOIA. As you know, pursuant to FERPA, federal legislation, certain kinds of student information are considered directory information, and, assuming proper protocols are followed, that information can be disclosed to third parties. However, after House Bill 1 was passed and signed, as of July 1 of this year, certain kinds of information have been effectively taken out from under the directory information umbrella. Essentially, now, student addresses, telephone numbers, and email addresses cannot be disclosed to ANYONE without prior affirmative consent.

What does that mean? Well, consider that a few weeks ago, we got an email from the Provost of VCU that stated that, among other things, we couldn’t have student email addresses on any public-facing websites. So, if we had an administrative assistant who is also a VCU student, we couldn’t put that address on the Web as contact information. Additionally, and this one is really wild, if we are sending an email to more than one student at a time, we must put the email addresses in the bcc field. We can’t disclose a student’s email address to another student without prior consent. So, there are good intentions to the changes to the law, but there are practical considerations in carrying them out that are tricky. I have heard that at least one university is considering filing a lawsuit to get the law tweaked, so stay tuned.

Those laws are relevant when discussing intentional sharing of data and information. But, now let’s get a little darker. What about the world of unintentional data sharing? That’s a scary image, isn’t it?

Well, here’s an equally scary image. This is a K-12 cyber incidents map designed and maintained by my friend and colleague Doug Levin in Northern Virginia. Every time there is a school-based cyber incident, he adds a new pin to the map with some relevant information. What’s a cyber incident? Well, here’s what he considers a cyber incident. And how many pins are on the map? As of yesterday, since January 2016 (so almost 2.5 years), there have been 345 incidents that he has been able to track. And, if you look at the map, you’ll see that we are not immune in Virginia. Anyone from Powhatan here? Remember this?

I won’t leave this slide up for too long. From what I can ascertain, the situation there was handled quite well; they followed proper protocol in a very difficult situation. The U.S. Department of Education offers a really useful model protocol and training kit for data breach situations and I strongly recommend that you take a look at that at some point.

Data breaches are only one form of unintentional data sharing. They are certainly the most well known. However, Doug Levin also recently conducted some research about other forms of unintentional data sharing by schools. Doug looked at how secure and private websites of educational organizations are, including school, district and state websites. What did he find?

  1. Most state and local education agency websites do not support secure browsing, putting both schools and website visitors at risk;
  2. Virtually every state and local education agency has partnered with online advertising companies to deploy sophisticated user tracking and surveillance on their websites, quite extensively in some cases; and
  3. Many state and the vast majority of local education agency websites do not disclose the presence and nature of this ad tracking and user surveillance, or the mechanisms for how users can opt out of these data collections. Those few that do make such disclosures often do so in misleading ways, including by making demonstrably false statements about their privacy practices.

In other words, your websites are neither secure nor private. You are likely exposing your networks to hackers and potentially affording access to student data to advertisers. As Doug puts it, his findings reveal “a widespread lack of attention to issues of online security and privacy, and should spur prompt action at the highest levels of education leadership to remediate the many (and sometimes significant) deficiencies and lack of compliance…”

What are the specific legal issues here?

Well, there is a state law about notifying individuals when there has been a data breach. That law specifies the proper notification protocols. In the case of a data breach, at the very least, make sure you follow the proper notification protocols. And, make sure you go beyond that. As I mentioned, the U.S. Department of Education offers some great tools to help you prepare for this kind of situation.

One open legal question is whether or not organizations can be held liable for negligent enablement of data breach. That is, lawyers and data privacy advocates are looking at the possibility for holding organizations liable for data breaches under negligence standards. If that were to come to pass, it could mean huge financial burdens for organizations found liable. We haven’t seen much of this in the courts yet. Courts, however, have been issuing rulings on who can sue on other grounds and have, for now, not allowed lawsuits based on prospective harm. That is, if there is a data breach, you cannot bring a lawsuit because you are worried that you are at increased risk of, say, identity fraud. Courts are looking for actual, individualized harm. But, you can be sure that this issue of negligence will be discussed and litigated extensively in the near and long-term.


So, there you have it, a quick overview and survey of legal issues related data/information and privacy/security. From the U.S. Constitution to Virginia statutory law, there are legal issues to consider around the collection, use and sharing of data.

I’d be remiss if I didn’t leave you with some recommendations before I open this up for some questions. I don’t want to get too prescriptive, though, since you are the leaders and I’m just, as was said on a great episode of The Simpsons, “the law talking guy.” But, in addition to reviewing all of the resources I will share with you at the end of this presentation, I have a handful of suggestions:

  • Cybersecurity best practice at all levels 
  • Model good privacy practices – you are the leader in your buildings; children and adults look to you as a model of good behavior. You should learn as much as you can and start with simple things like using two-factor authentication wherever you can. 
  • Training/PD
  • Planning/Auditing
  • Trust your instincts – if a technology product/service looks too good to be true, it probably is.

Oh, and of course, I have one final recommendation. Remember…