By Sara Ganim
January 13, 2021
Below is a transcript of episode 12. We encourage you to listen to the episode. It was written to be heard, not to be read.
Sara Ganim Narration: Last episode we told you how the big, scary punishment for violating the federal privacy law called FERPA has never actually been enforced. No school has ever lost federal funding for giving out student information, and the courts have said you can’t sue a school for violating FERPA either. So, the law has no teeth, yet…
Matt Reed: The fear of being in trouble under this law seems to only get more intense every year.
Sara Ganim Narration: Even though over-applying the law has life-or-death consequences.
CNN News: We begin our program with broken hearts in yet another American town, which today became the site of yet another deadly school shooting.
Sara Ganim Narration: Take the school shooting in Parkland, Florida.
ABC News: At least 17 people have been killed, students and adults.
Sara Ganim Narration: Investigations that followed the shooting found examples of times where school officials voiced concern about the shooter’s behavior
ABC News: New reports highlighting security failures in the Parkland school massacre
Sara Ganim Narration: but failed to share those concerns, out of fear they might violate FERPA.
WRLM Report: Listed here, there was pretty significant warning.
Sara Ganim Narration: and that’s 11 years after that same kind of misunderstanding of FERPA happened at Virginia Tech.
CBS News: What is turning out to be the deadliest campus shooting in US history.
CBS Evening News: The question is no longer if it will happen, as much as when it will. It’s been happening everywhere. I always felt like eventually it was going to happen here too.
WRAL: One problem, some in law enforcement say, is that schools are not giving law enforcement the information needed to guard against a mass shooting. Some of that fear comes from FERPA, federal rules that do not allow students’ education information to be shared in many situations. Orange County Superintendent Todd Wirt says the state needs to have an agreed definition.
“There needs to be a common understanding of FERPA and what can be shared and should be shared.”
“They can’t feel that fear, and I think that’s what’s happening.”
Sara Ganim Narration: The fear of FERPA. It seems to eclipse even our worst nightmare:
ABC News: *Panic* If I don’t make it, I love you mom.
Sara Ganim Narration: School shootings.
Sara Ganim Narration: From the University of Florida’s Brechner Center for Freedom of Information, I’m Sara Ganim, and this is Why Don’t We Know.
Matt Reed: After the Parkland shooting, the interest in school discipline problems, violence on campus, weapons, got very, very intense.
Sara Ganim Narration: That’s Matt Reed, the former FOIA officer for Florida’s Brevard County Schools, who talked to us in episode 5 about what it’s like to be a public records officer.
Matt Reed: We got requests for the data on those things, school-by-school, from numerous news outlets as well as some activist groups and others. Well school-by-school, those numbers may not be very high. It may only be that four weapons were confiscated and three of them were pocket knives or something at an elementary school in one of our cities.
And there has been an interpretation by people in the education world that if the number is less than 10 of something, that someone somewhere would be able to put on some kind of magical hat and infer that one of those nine or eight or four incidents must be Johnny B. Bad. But even if it’s anonymized aggregate data that we, by other law, have to report to the state every year, we routinely told people, “No,” they couldn’t have this data because many of the numbers were less than 10.
Well, when I went to the sources for data in our department and to our district’s attorney, the answer was, “That’s the rule. That’s what all school districts follow and that’s what we’re going to follow.” I did my job as a school administrator and followed their guidance because I was new to it. But at the same time, the reporter in me was asking, “Where the heck did this guidance come from?” I Googled it and asked around and sent emails. If you know what it is, I don’t. Please tell me, because I can’t find a law or any other serious piece of rulemaking behind it.
Sara Ganim Narration: The answer is, we really don’t know where this rule came from. But we do know that, over the years, the U.S. Department of Education has repeatedly told schools that they need to be careful about giving out what’s called small data sets, essentially small numbers, out of fear (there’s that word again) that a small number could lead someone to identify the student involved. Let’s back up for a second.
We spent the majority of this first season of Why Don’t We Know talking about the failures of data and the weaponization of secrecy in higher education. But this is not just a problem at the university level. There are plenty of data gaps that are unique to primary and secondary education, which, in the education world, encompasses those first 13 years of school, from kindergarten through 12th grade. And so we sent out public records requests to the 50 state departments of education on a variety of topics, so that we could have conversations about data deserts at the K-12 level this season, too.
Probably the most glaring deficiency we found was on one of the most important and difficult problems facing schools: when guns are brought onto campus.
We assumed that since school shootings are such a known public safety concern, there would be really detailed data on this. We figured schools would keep a number, report it to the state department of ed, and then, when we made our records request, we could see a spreadsheet of that data. But that’s not the reality. Why Don’t We Know reporter Brianna Edwards explains what actually happened.
Brianna Edwards Narration: So, when we first sent out our requests in the Fall of 2019, a lot of states either gave us no response or sent us one singular number — a statewide number of how many times a student brought a gun to school.
Sara Ganim Narration: But we wanted school-by-school data, or at the very least, district-level data.
Brianna Edwards Narration: Right, so we went back to those states a second time, and we asked them to provide the breakdown and that’s when we started seeing a different trend.
Sara Ganim Narration: That’s when this story changed a little from being a story about guns in schools to a story about FERPA.
Brianna Edwards Narration: In total, 26 states failed to give us data, and 14 of those said the reason why is because it would be a number so small, that it would violate FERPA.
Sara Ganim Narration: The small data set answer.
Brianna Edwards Narration: Yes, and some places were even reluctant to tell us the number of handguns versus generic weapons because being that specific could violate FERPA.
Sara Ganim Narration: Basically, what they’re saying is that they believe someone could reverse-engineer who brought the gun to school, thereby leading to a violation of privacy.
Brianna Edwards Narration: They’re afraid it would violate FERPA.
Sara Ganim Narration: If you’re struggling to figure out how a number could lead to someone’s identity, don’t worry, you’re not alone. Several people told us it’s pretty much impossible to do just from a number. So I asked Paige Kowalski, Executive Vice President of the Data Quality Campaign, to tell me.
Sara Ganim: Why do you think that small data set answer has survived so many years?
Paige Kowalski: It survives because it’s half true. FERPA does have rules around small cell sizes and what you can publish, but it really does depend on the denominator in that equation. Well, you can de-identify data when you’re talking about a small amount of data, but you will need other data to do it. So it’s not as cut and dried as “we can’t tell you because the number is small.”
Sara Ganim Narration: Here’s an example of a legitimate small data set concern: let’s say you ask a school how many red-headed students failed the end-of-year exam, and there’s only one red-headed kid in the whole school. Well, that number can’t be given out, because it would give away that one specific kid’s grade. But in that scenario, there are two variables.
When we ask for things like the number of guns brought to campus, that’s a single-variable request. The information cannot be reverse-engineered.
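The two-variable versus single-variable distinction above can be sketched in a few lines of Python. The roster, names, and attributes here are invented purely for illustration, not drawn from any real school data:

```python
# Hypothetical roster illustrating the red-hair example from the narration.
students = [
    {"name": "A", "red_hair": True,  "failed_exam": True},
    {"name": "B", "red_hair": False, "failed_exam": True},
    {"name": "C", "red_hair": False, "failed_exam": False},
]

def count(records, **attrs):
    """Count records matching every given attribute."""
    return sum(all(r[k] == v for k, v in attrs.items()) for r in records)

# Two-variable query: "red-headed students who failed." Only one
# red-headed student exists in this roster, so releasing the count
# effectively reveals that specific kid's grade.
two_var = count(students, red_hair=True, failed_exam=True)

# Single-variable query: "students who failed." The count alone
# attaches to no individual.
one_var = count(students, failed_exam=True)

print(two_var, one_var)  # 1 2
```

The risk comes from crossing a count with a second attribute that narrows the group to one person, not from the count being small by itself.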
Brianna Edwards Narration: A data journalism professor I talked to about this basically told me to think of it as a game of pin the tail on the donkey while blindfolded.
Being told that the donkey is a certain color, that isn’t going to help you because you are blindfolded. It’s a non-identifiable characteristic. If I tell you someone in your class has blond hair — that’s an identifiable characteristic. But if I tell you that someone in your class is an Elvis fan, that’s a non-identifiable characteristic. It narrows the field only if you already know that fact about that person. Otherwise, it won’t help you improve your probability beyond random chance.
Sara Ganim Narration: This is a problem with more than just guns being brought into schools. It’s been a very popular answer to requests for cases of Covid-19 in schools.
Brianna Edwards Narration: Right, and in talking to experts about this, I learned that’s also flawed. It was explained to me this way: if one person on a 12-person basketball team tests positive for Covid, your chance of randomly identifying that person is one in 12, or about eight percent. Knowing that someone is positive does not change those odds. That’s a matter not of advanced statistics, but of simple logic.
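The basketball-team logic is simple enough to check by hand, but here is a minimal sketch of the arithmetic, assuming nothing beyond the numbers quoted above:

```python
from fractions import Fraction

team_size = 12

# Chance of correctly guessing which player it is by picking at random.
p_guess = Fraction(1, team_size)  # 1/12, about 8.3 percent

# Being told "one player is positive" still leaves 12 equally likely
# candidates, so a guesser's odds are exactly the same.
p_after_disclosure = Fraction(1, team_size)

print(p_guess == p_after_disclosure)  # True
print(float(p_guess))                 # roughly 0.083
```

The disclosure only becomes identifying when combined with outside knowledge, such as already knowing which player was absent from practice.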
Sara Ganim Narration: Here’s Paige Kowalski again.
Paige Kowalski: If there are 500 students in a university and there are 10 cases, just saying the number 10, you’re not going to be able to know. That’s not enough information to figure out who they were. You would need a lot of other information to even start narrowing it down.
Sara Ganim Narration: Long before the pandemic, some places were using this flawed argument as a blanket rule for every request for data that was less than 10.
Take Vermont, for example. When you go to their website, you’ll see a disclaimer that says:
“Notes on data suppression: to ensure the confidentiality of individual students’ results, public reporting of any assessment or accountability result requires a minimum of 11 students.”
Even though this is used for a lot of categories of data, I chose to highlight it in the guns-in-schools segment because it was such a common response, and also because I couldn’t think of another time when it would be more relevant to safety.
From a parent’s perspective, someone worried about the safety of their school, there is a huge difference between one gun being brought into your kids’ school and 10 guns being brought in. Let’s add another layer of context. From Columbine to Virginia Tech to Sandy Hook to Parkland, we have struggled to answer the question of how to stop gun violence in schools. In at least two of those cases, Parkland and Virginia Tech, FERPA was misinterpreted, causing red flags about the shooters to be missed. Those were fatal misunderstandings of the law, and those shootings happened 11 years apart.
Eleven years later, the same FERPA mistakes were being repeated. And now here we are in 2021, finding that in some cases FERPA is stopping us from knowing how many guns are coming into schools. Somehow, the fear of FERPA always seems to win out.
Tony Montalto: We’ve seen difficulties with schools handling FERPA information. It seems the fallback position is, well, if we don’t say anything, we can’t get in trouble, and that seems to be the position of most schools, or school districts, when FERPA is discussed.
Sara Ganim Narration: That’s Tony Montalto. His daughter Gina is one of the 17 victims who died at Marjory Stoneman Douglas High School in Parkland nearly three years ago.
Sara Ganim: What they’re doing is saying, well, the number is so small that you would probably be able to figure out who brought a gun to school if we told you the number. And they won’t hand over anything that’s less than 10. I was very surprised by that.
Tony Montalto: Well, I don’t think we’re surprised by the information.
Ryan Petty: As Tony said, not surprised at all.
Sara Ganim Narration: And that’s Ryan Petty; he also lost his daughter, Alaina, that day.
Ryan Petty: Typically, FERPA is used to cover a number of sins by school districts across the country.
Sara Ganim Narration: Both fathers are founders of Stand with Parkland, a group of parents lobbying for safer schools.
Ryan Petty: You look at any of these tragedies and what you see, almost without exception, is that the warning signs were there. There were indications that something was going to happen, or that somebody was reaching out or crying for help, before the incident of violence took place. And because of misunderstandings, misapplications, or intentional efforts to protect the reputation of the institution, or the district, or the school, that information isn’t shared with those who could take action to prevent the tragedy from occurring. It’s a recurring theme.
Sara Ganim Narration: After Parkland, the Federal School Safety Commission put out guidance
Ryan Petty: a redeclaration, if you will, of what FERPA is supposed to cover.
Sara Ganim Narration: He calls it a redeclaration because it wasn’t new information; it was essentially the same thing that officials said after Virginia Tech.
Ryan Petty: It was never meant to include the sorta behavioral records or the disciplinary records in some cases that it’s used to cover.
Sara Ganim Narration: It’s just that they had to say it again, because confusion and ignorance persist.
Ryan Petty: Some districts are starting to wake up to that, but most still use it as a convenient excuse to not report, or inform parents about what’s going on on campus.
Sara Ganim Narration: This new guidance does not address the small data set issue that 14 states continue to use to mask the number of guns in their schools and it doesn’t address instances like this.
Last year in Alaska, school officials failed to notify parents when a loaded gun was found in a backpack, citing FERPA. Instead they assured everyone that “No one was in danger.”
In Washington State, the year before, frustrated parents lashed out at a school that refused to tell them why a student arrested for bringing a loaded gun to school and allegedly making threats to use it was allowed to return.
Sara Ganim: If a handgun was brought into my kid’s school nine times, it seems like, from a public safety perspective, that’s more important than this fear of FERPA. As a parent myself, I would want to know. Do you think that’s a valid assessment, or am I completely wrong here?
Ryan Petty: As a parent on the other side of a tragedy, where I wish that I had known something was going on at the school, I can 100% agree with your sentiment.
Sara Ganim Narration: Tony and Ryan told me this small data sets issue was news to them.
Tony Montalto: This is the first we’ve heard of it. You’re obviously doing some great work here to expose issues that haven’t been seen before.
Ryan Petty: It’s the first we’ve heard of the small set problem, but it’s not surprising because there are a number of excuses that districts use to not inform the public or state agencies about what’s going on on campus. These are public schools, I think they have a duty to report to the public and particularly the parents of those students that go to those schools.
Sara Ganim Narration: Plus, there’s this:
Ryan Petty: The point of reporting to the state agencies, the point of reporting to the federal government, is so that resources can be deployed to help that school or that district with the problem that they may be having. And if they’re underreporting, or not reporting at all while citing privacy issues, then that problem can’t be resolved.
Sara Ganim Narration: This is why accurate data matters. It’s the reason why we don’t know.
If you’ve been listening this season, you know that we have two parallel themes going in this podcast. We’re focused on the weaponization of privacy, and also on data deserts, trying to unpack why there are problems that keep recurring that we can’t seem to fix, and how a lack of information allows those stories to keep repeating themselves.
When we started exploring issues impacting K-12 schools, we knew that we wanted to dive into the data that is compiled at the federal level by the U.S. Department of Education.
Our public records requests deal with data at the state level, but at the federal level, there is a database that’s updated every two years and it looks at a variety of topics that impact K-12 schools.
Why Don’t We Know reporter Gabriella Paul took a look for us.
Gabriella Paul Narration: This database is called the Civil Rights Data Collection, or the CRDC, and basically, how it works is once every two years the department of education conducts a survey. They’ve been collecting this data since 1968. It’s supposed to act as a “master” database on a slew of topics, from enrollment numbers to instances of weapons on campus to allegations of sexual misconduct. Ideally, the data should exist in this one central place for every K-12 school in the country.
Sara Ganim Narration: That sounds like it is an amazing resource.
Gabriella Paul Narration: Well, ideally. Except that this data set is notoriously inconsistent. Back in 2019, even the department admitted so much Betsy Devos who was President Trump’s Secretary of Education announced that the department was taking “unprecedented steps” to improve the data’s quality. Fast forward one year, in October of 2020, the latest report was released and Devos claimed the data was quote “more reliable than ever.”
Sara Ganim Narration: and, was it?
Gabriella Paul Narration: No, far from it. The department does have data quality checks in place, those “unprecedented steps” DeVos was talking about back in 2019, but they’re not correcting the bad data. The department is knowingly putting out incomplete data.
Sara Ganim Narration: Which is the opposite of what they brag about in the press release.
Gabriella Paul Narration: Right, and the way I found out is that while I was inspecting the department’s report, I ran across their “data notes,” where they quietly disclosed some problems with the quality of their data.
Sara Ganim Narration: The press release says the data is better than ever and the footnotes say otherwise.
Gabriella Paul Narration: Yes. Basically, these data notes tell the real story: the department sought to survey nearly 98 thousand schools across the country, and even though 99 percent of them responded, the department then ran nearly 40 complex queries on the data to flag anomalies, things like outliers and false zeroes, where zeroes were entered instead of nulls, a null being a placeholder for data that’s not tracked.
Sara Ganim Narration: The difference between zero and null is that zero means “there were no actual instances,” for example, of a gun being brought to school. Null means “we don’t track that at all, so we don’t have a number for you.”
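A minimal sketch of the zero-versus-null distinction, and of the kind of "false zero" check the data notes describe. The school names, counts, and flagging rule here are invented for illustration; they are not the department's actual queries:

```python
# None stands in for null: "not tracked at all."
reports = {
    "School A": 0,     # zero: tracked, and no guns were found
    "School B": None,  # null: the school doesn't track this category
    "School C": 3,     # three reported instances
}

def classify(value):
    """Explain what a reported value actually means."""
    if value is None:
        return "not tracked"
    if value == 0:
        return "tracked, zero instances"
    return f"tracked, {value} instance(s)"

for school, v in reports.items():
    print(school, "->", classify(v))

# One plausible anomaly flag: a district reporting 0 for every single
# category probably entered zeroes where nulls belonged.
district = {"guns": 0, "fights": 0, "bullying": 0}
suspected_false_zero = all(v == 0 for v in district.values())
print(suspected_false_zero)  # True
```

The point of the check is that an all-zero record is statistically suspicious even though each individual zero looks valid on its own.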
Gabriella Paul Narration: Yes. So, at first, nearly 90 percent of data were flagged for possible errors.
Sara Ganim Narration: Wow – that’s a huge number.
Gabriella Paul Narration: It is, but it’s not problematic in itself. I mean, it’s a good practice to catch quality issues on the front end if you can fix them.
Sara Ganim Narration: But, did they?
Gabriella Paul Narration: Well, the short answer is, they fixed some of them. About 40 percent of schools either fixed their data or submitted justifications for any wonky numbers, and for the rest, the department basically ran out of time and published the data. To paint a picture, there are whole datasets that are mostly zeros, and obviously we know that’s not accurate information. Overall, 70 percent of the data still has persisting quality issues.
I asked Seth Galanter about this. He used to work for the department of education under the Obama administration, in the Office for Civil Rights, and it was his job to oversee the collection of this data every other year. This is what he said:
Seth Galanter: You’re always balancing that the more you clean the data, the longer it’s going to take to get the data out, so there’s always going to be some trade offs. I think that there’s tons of room for improvement. But I think that the balance that is being struck is a fair one.
Gabriella Paul Narration: Right now there’s a two-year lag between data collection and release, so the 2020 report shows data from the 2017 and 2018 school years.
Sara Ganim Narration: It sounds like it comes back to a lack of manpower to make sure that schools are submitting good data.
Gabriella Paul Narration: Absolutely. And to make things worse, the department has adopted two pretty sketchy practices of data editing called “perturbation” and “data suppression.” Perturbation is the random manipulation of data in the name of privacy. The data notes say, “in order to prevent the disclosure of identifying information, most data in the public-use file have been privacy protected by making small, random adjustments to the data.”
Sara Ganim Narration: So they’re changing the numbers to protect privacy? Even though it’s just a number, and not identifiable since it’s a single variable, they change the numbers anyway?
Gabriella Paul Narration: Right, and the other way they manipulate the data is data suppression. This is a completely new thing that they started doing during the Trump Administration. Certain numbers that the department feels need to be suppressed are replaced with a stand-in code. The code is “-11.” So as you’re looking through the data, you’ll see these -11s pop up, and it’s basically like a redaction. In fact, the department’s data notes say the original numbers are kept in a restricted-use file, for the eyes of department officials only.
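Here is a rough sketch of what perturbation and suppression amount to in practice. The shift range, the seeding, and the counts are assumptions chosen for illustration; the department's actual algorithms are not spelled out in the data notes:

```python
import random

SUPPRESSED = -11  # the reserve code the data notes describe

def perturb(count, rng, max_shift=2):
    """Perturbation: add a small random adjustment, never going below zero."""
    return max(0, count + rng.randint(-max_shift, max_shift))

def suppress(count, should_hide):
    """Data suppression: replace a flagged value with the reserve code."""
    return SUPPRESSED if should_hide else count

rng = random.Random(0)  # seeded so the sketch is reproducible
published = perturb(5, rng)               # close to 5, but possibly not 5
redacted = suppress(7, should_hide=True)  # always -11 when flagged
print(published, redacted)
```

Either way, a reader of the public file can no longer recover the true count: perturbation blurs it, and suppression removes it entirely behind the -11 code.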
Sara Ganim Narration: Practically speaking, can you give us an example of what this means?
Gabriella Paul Narration: For example, take the high school in Parkland, Florida which had that terrible school shooting in 2018. In their department of education survey, the school accurately reported ‘yes’ to both having a shooting and a homicide occur on campus between the years 2017 and 2018. But then in the category for instances where someone had possession of a gun on campus the data is suppressed. Meaning even at the high school where students died at the hands of an active shooter, the students and parents of that school aren’t permitted to know exactly how many times a gun was reported on campus during the years when the fatal shooting occurred. Douglas was one of four high schools in Florida to show “-11” for the count of instances of a firearm on campus. About 3,800 schools reported “zero” instances, 96 reported one, including my hometown high school in Tampa, FL. A total of 12 other schools reported between 2 and 4.
Sara Ganim Narration: Is it possible the -11 is a stand-in for the “small data sets” issue that Brianna was talking about earlier? That the department is worried that the number is too small, so they replace it with this code instead?
Gabriella Paul Narration: No, I don’t think that’s the case, because there are many times when small numbers, like 2 through 9, are reported in the database. When I asked the department of education, they told me that “reserve codes are not used to mask small data cells.” So we are kind of left wondering why these -11 codes show up. In terms of small data sets, though, it could be that the decision to report them is left up to the states or even the districts. We just don’t know.
Sara Ganim Narration: You also asked the department of ed for a more general interview about this database.
Gabriella Paul Narration: Yes, I asked them for an interview so we could talk about all of this inconsistency and they declined to get on the phone and talk about it. But they did write some email responses, saying “the CRDC data depends on accurate collection and reporting by participating districts, and the department cannot speculate as to reasons behind the data reported.” In other words, the department of education claims it’s just the messenger of the district’s bad data.
Sara Ganim Narration: But even that doesn’t really add up, right? Because the data at the federal level doesn’t match the state level data that we gathered ourselves through public records requests.
Gabriella Paul Narration: Yeah, so I wanted to do a comparison. We have all this data we gathered, and then we have this data from the department of education on the same topics. The numbers should match up, right? What I found was anything but. I sampled data from 16 states, from Georgia to Nebraska to Maryland. For all 16, we got figures directly from the states, straight from the horse’s mouth.
For example, in Colorado, they told us there were 44 instances when a student was found with a gun on campus in the years 2017 and 2018. I expected that the federal database would report the same number, 44, but instead, the number was 36.
Now, to be fair, there were some states, like Alaska, that reported very close figures. The state reported 10 cases and the federal data shows 11. But then there were cases like Connecticut, where the state reported less than 5, and the federal data reported almost 60 incidents or the opposite, where the state’s number was sky-high and the federal one was oddly low. Like in Georgia, where the state reported a whopping 401 firearm possessions but the federal data reported under 200. I looked at 16 sets of data we gathered from states and none of them matched the federal data.
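The comparison described above can be sketched like this. The Colorado and Alaska figures come straight from the narration; Georgia's exact federal count is an assumption, since the narration says only "under 200":

```python
# State-reported vs. federally-reported counts of gun incidents.
state_counts = {"Colorado": 44, "Alaska": 10, "Georgia": 401}
federal_counts = {"Colorado": 36, "Alaska": 11, "Georgia": 196}  # Georgia assumed

def mismatches(state, federal, tolerance=0):
    """Return states whose two figures differ by more than `tolerance`."""
    return {
        s: (state[s], federal[s])
        for s in state
        if abs(state[s] - federal[s]) > tolerance
    }

# With zero tolerance, every sampled state disagrees.
print(mismatches(state_counts, federal_counts))

# Allowing a difference of 1, Alaska's near-match drops out, but the
# large gaps in Colorado and Georgia remain.
print(mismatches(state_counts, federal_counts, tolerance=1))
```

A check like this is the simplest way to see that the discrepancies aren't rounding noise: the gaps run in both directions and span orders of magnitude.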
Sara Ganim Narration: This is for lack of a more eloquent term, a giant mess. I feel like we’ve actually come out of this knowing nothing at all.
Gabriella Paul Narration: All we know for sure is this data is really unreliable. To put it in perspective, I talked to Seth Galanter about this. He’s the former Department of Education employee who I spoke to earlier. I asked him if he personally had confidence in the data set, he said this:
Seth Galanter: There are data elements, where I would be confident looking at state and local numbers and national trends, summed up, but the individual school reports or even school district reports, I would, you know, exercise some judgment, before just accepting them.
Sara Ganim Narration: Unreliable data, inconsistencies: this is the story of Why Don’t We Know. And when it comes to guns in our kids’ schools, and why we sometimes are told we can’t know, well, the reasoning behind that is inconsistent, too. Here again is Why Don’t We Know reporter Brianna Edwards, talking about what she found in the gun data we collected from each state.
Brianna Edwards Narration: Going back to small data sets as an excuse for why we can’t know: the reality is that even though 14 states cited FERPA and small data sets and refused to give us numbers, nine states did give us data. They had no problem giving us numbers under 10. So clearly, states are interpreting the law in different ways.
Sara Ganim Narration: When I asked Ryan Petty earlier about school shootings and how making progress toward safer schools is hindered by FERPA, he told me there are three things he thinks could really help. Two of them we’ve heard quite a bit about:
Ryan Petty: Clarification in the law or in the administrative rules is important. The second thing that needs to be done is training.
Sara Ganim Narration: But the third thing he said is something that hasn’t gotten a lot of traction with lawmakers and that is holding officials personally accountable for hiding critical information.
Ryan Petty: The state needs to have some ability to sanction districts and district leaders when they don’t report accurately to state and federal agencies.
Sara Ganim Narration: I told him how our analysis of the federal data showed huge failures.
Sara Ganim: 70% of it remains incorrect. It’s just that they don’t have the manpower to force schools and districts to submit accurate information.
Sara Ganim Narration: And he told me that the grand jury in Florida investigating what happened at Parkland has recommended sanctions, or worse, for reporting bad data.
Ryan Petty: And then it goes further to say the Florida Department of Education should have at its disposal sanctioning authority sufficient to coerce school districts to comply with these state laws, which is really interesting. They go on to say that may include withholding state funds, fines, censure, referral for criminal charges, and/or the removal of recalcitrant school officials.
Sara Ganim Narration: Next time on Why Don’t We Know
Paige Kowalski: Overall there just hasn’t been a lot of motivation or political will to try to move FERPA.
Sara Ganim Narration: If there were any other law in the U.S. that was this confusing, it would have been changed by now.
Ryan Petty: Even the most dedicated public servants, or legislators, politicians, their eyes glaze over when you start talking about things like FERPA.
Sara Ganim Narration: So, why has a bad law persisted for four decades?
That’s next time. But first, next week, we’re going to drop a few ‘extra’ episodes about K-12 data deserts. We’re looking at topics you’ve heard a lot about, like virtual learning during Covid and bullying in school, and also one that isn’t so well known: violent attacks on teachers.
They will drop over the next few weeks, along with our final episode focused on why FERPA has not been fixed at the legislative level.
This episode was written and produced by me, Sara Ganim, with additional reporting by Brianna Edwards and Gabriella Paul.
Additional records requests were done by Ethan Magoc, Dana Bryan, Valentine Botero, Dana Cassidy, Molly Chepenik, Zachariah Chou, Kennedy Davis, Lia Diapolo, Sophie Feinberg, Melissa Hernandez, Brianna Moye, Nicole Needles, Yiwen Nui and Lina Ruiz
The Associate Producer is Tori Whidden.
This episode was edited by Amy Fu and James Sullivan.
Music for this episode was composed by Daniel Townsend.
Audio mixing was done by James Sullivan.
The Executive Producer is Frank LoMonte.
‘Why Don’t We Know’ is a production of the Brechner Center for Freedom of Information at the University of Florida.
A special thanks to the Hearst Family Foundation for providing the grant money that supported this reporting.
For more information, please visit our website at WWW.WHYDONTWEKNOW.ORG.
Editor’s Note: since the reporting for this episode concluded, an additional state told us that it could not provide data because it fell into the category of a FERPA small data set, bringing the total number to 15.