
California Poised to Enact Thoughtful Education Privacy Law

From Issue 1.53 - By Aaron Rieke

California legislators passed a bill last month that would limit how technology vendors serving K-12 schools use data. Although more than 30 states have introduced bills related to student privacy and security, California’s bill is unusually comprehensive and well-considered.

The law is “a response to growing parental concern that sensitive information about children — like data about learning disabilities, disciplinary problems or family trauma — might be disseminated and disclosed, potentially hampering college or career prospects,” reports the New York Times.

It covers educational “operators,” a term broadly defined to include websites, cloud computing platforms, and mobile applications used primarily for K-12 purposes. This includes companies that handle data about students’ grades, attendance, discipline, family life, biometrics, and even predicted behaviors (such as whether a student is likely to perform well).

The bill places boundaries around educational operators’ use of student data. For example, it prohibits them from using student data for targeted marketing or building “profiles” about students (except in furtherance of educational purposes). It also requires them to maintain reasonable security measures and delete students’ data if the school district requests it.

But the bill also avoids being draconian, leaving education firms room to research and innovate. It does not rush to ban particular kinds of data collection. It allows operators to use data for “legitimate research purposes … under the direction of a school, school district, or state department ….” And it provides flexibility concerning use of data that is aggregated and de-identified (for example, statistics about how classes perform year by year).

In short, the law offers a reasonable baseline, helping ensure that schools won’t have to negotiate important legal limits on a case-by-case basis. And it provides privacy protections that are much stronger than those in federal law. “The California statute is filling the void,” said Joel R. Reidenberg, a professor at Fordham Law School.

California State Senator Darrell Steinberg told the New York Times that he hoped the bill would resonate beyond school settings. “The bill sets a standard that is applicable to the larger privacy debate,” he said.

California Governor Jerry Brown has not taken a public position on the bill. If he does not act, it will become law at the end of the month.


Advancing the Conversation on Big Data and Civil Rights

From Issue 1.53 - By David Robinson

On Monday, the Federal Trade Commission (FTC) hosted a workshop exploring big data’s impact on vulnerable communities. In preparation for the event, we here at Robinson + Yu published a new report entitled Civil Rights, Big Data, and Our Algorithmic Future.

The report provides detailed examples of how and where big data becomes a civil rights issue. As Wade Henderson, President & CEO of The Leadership Conference on Civil and Human Rights, explained in his foreword to the report, when it comes to an issue such as body-worn video cameras for police, “[y]ou might call this a big data issue. Or you might say it’s about criminal justice reform. Whether we use the language of big data or civil rights, we’re looking at many of the same questions.”

A number of other groups also offered their thoughts at the workshop, including an industry-backed group called the Center for Data Innovation, which favors more corporate data collection about consumers. It introduced the concept of “data poverty” to describe the possibility that lagging technological adoption might make big data’s benefits less available to some communities, and called for stronger funding for the Census and the American Community Survey.

The dearth of information currently available to civil society and the FTC itself regarding commercial big data practices was a major theme throughout the event. Several speakers, including FTC Chairwoman Edith Ramirez, cited academic research by FTC Chief Technologist Latanya Sweeney, which documented racially discriminatory patterns in the delivery of advertisements for criminal background checks, as the best available public example of automated, objectionable racial discrimination by a big data system.

A redoubled, collaborative effort by researchers and advocates to document the impacts of big data is one natural next step in this low-information environment.


Also of Note: September 17, 2014

From Issue 1.53 - By Aaron Rieke
  • Facebook’s real name policy is “disproportionately affecting the LGBTQ community—in particular drag queens,” argues the EFF.

  • Google fielded 32,000 government data requests in the first six months of 2014, according to its most recent transparency report. This number is up 14 percent from the previous six months and 150 percent since the company first started publishing statistics in 2009.

  • The FCC may soon consider net neutrality-like rules for mobile broadband. “We are very concerned about the possibility that some customers are being singled out for disparate treatment even though they have paid for the capacity that is being throttled,” said FCC Chairman Tom Wheeler.


Deportations Under “Secure Communities” Don’t Make Us Safer, New Study Finds

From Issue 1.52 - By Virginia Eubanks

The Department of Homeland Security’s (DHS) Secure Communities program is a partnership between immigration officials and local police, under which police check the immigration status of the people they arrest against a national database, and immigrant offenders may be deported. Despite the program’s name, however, a new study by law professors Thomas J. Miles and Adam B. Cox finds that the program results in “no meaningful reduction” in crime.

Secure Communities, first piloted during the George W. Bush administration, creates new data-sharing partnerships among law enforcement agencies, the FBI, and DHS. It has long been common practice for local law enforcement to send the fingerprints of an arrestee to the FBI’s National Crime Information Center for a criminal background check. But under Secure Communities, prints are also electronically transferred to the Department of Homeland Security, where they are checked against the Automated Biometric Identification System (IDENT) to see if the detained individual is deportable for immigration violations or criminal convictions. If deportable violations are found, Immigration and Customs Enforcement (ICE) takes enforcement action. According to Miles and Cox, in the first four years of Secure Communities, ICE detained over 250,000 immigrants and deported more than 80 percent of them.

Critics of the program have expressed grave concerns about its potential for civil liberties abuses and racial profiling. Chris Rickerd of the American Civil Liberties Union, for example, argues that Secure Communities has “devastating effects on families and people with no or minor criminal convictions; [is] complicit with racial profiling; [regularly detains] U.S. citizens and other lawful residents” and “harms trust between police and witnesses or victims of crime.” And, although the program is a centerpiece of the Obama administration’s immigration policy, implementation has been bumpy. Governor Pat Quinn of Illinois pulled his state out of the program in May 2011, and Attorney General Kamala Harris declared it “optional” in California in December 2012.

Though ICE describes the program as protecting communities by removing “criminal immigrants” who pose a threat to public safety, Miles and Cox point out in their report that very little research has tested the premise that deportations of immigrant offenders reduce rates of property or violent crime. Using the program’s rolling implementation in 3,000 jurisdictions as a “natural experiment,” Miles and Cox were able to compare county-level crime data before and after the introduction of Secure Communities. They found evidence that Secure Communities may have contributed to a modest decline in property crimes such as motor vehicle theft and burglary. But their study found that Secure Communities has led to no “meaningful reductions in the FBI index crime rate … [n]or has it reduced rates of violent crime—homicide, rape, robbery, or aggravated assault.” Thus, they conclude that, “the program has not served its central objective of making communities safer.”

The study corroborates the White House’s own internal concerns about the program, reports Leslye Davis in the New York Times. After meeting with law enforcement officials critical of the program in May, Jeh Johnson, the Homeland Security secretary, suggested that he’s taking a “fresh look” at the program, which might need to be “overhauled” and “rebooted.”

Johnson has not yet signaled what these changes might look like. But local police departments are increasingly resisting the demand to detain immigrants for ICE, and several cities across the country—including Philadelphia—have passed laws that limit cooperation between law enforcement and immigration officials.


Acknowledging Tensions, Judge Denies Advocates’ Request to Compel Ramadan License Plate Data

From Issue 1.52 - By Aaron Rieke

A Los Angeles Superior Court judge recently denied advocates’ request to compel license plate surveillance data covering a week in Ramadan. The decision grapples with the need to protect against the abuse of police technologies while, at the same time, protecting law enforcement investigations and individuals’ privacy.

Automatic license plate readers (ALPRs) are high-speed cameras that photograph vehicles. These images are automatically time-stamped, tagged with a location, analyzed for a license number, and stored in a database. Policies governing use of ALPRs vary widely. Some departments retain large numbers of records indefinitely.

ALPRs have been abused in the past. For example, police officers in New York reportedly drove “unmarked vehicles equipped with license plate readers around local mosques in order to record each attendee.” And according to one New York police officer, the use of ALPRs “is only limited by the officer’s imagination.”

In an attempt to test for similar abuses, the ACLU and the EFF sought to compel Los Angeles police to share a week’s worth of ALPR data under the California Public Records Act. “There has been quite a bit of government surveillance of Muslim communities in Los Angeles, and I thought getting license plate data from the last week of Ramadan might be able to tell us if the cops were focusing their surveillance efforts on Muslim communities and businesses during that week,” Jennifer Lynch of the EFF explained to Ars Technica.

Police refused to produce the data, and the request ended up before a judge. Weighing the public’s interest in the disclosure against the disclosure’s risks, Superior Court Judge James Chalfant ruled that police were not required to produce the records.

The judge acknowledged that the “intrusive nature of ALPRs and the potential for abuse of the ALPR data creates interest in disclosure of the data to shed light on how police are actually using the technology.” He continued:

The misuse of ALPR technology can harm individual privacy and civil liberties. ALPR data can be used to record vehicles at a lawful protest or house of worship, track all movement in and out of an area, specifically target certain neighborhoods or organizations, or place political activists on hot lists so that their movements trigger alerts.

But he was also persuaded that the records would reveal officers’ patrol patterns and “would compromise it as an investigative tool by allowing criminals to find out whether police have been following him or her, or locate a third person they were trying to harm.” The judge also cited citizens’ “substantial” privacy interest as a factor weighing against disclosure.

Going further, the judge also declined to require police to release a redacted version of the data (in which license plate numbers would be obscured by random numbers). He reasoned that “a criminal could still use APRL [sic] data to follow law enforcement patrol patterns ….”

The decision is thoughtful and acknowledges the strong public interest in ensuring accountable use of police technologies. It also tees up an important and difficult question: how can advocates help guard against abuse of police technologies, especially when the underlying data implicates individual privacy and confidential law enforcement practices?

Today’s public record laws might not provide an answer.


Also of Note: September 10, 2014

From Issue 1.52 - By Aaron Rieke
  • New York police will soon begin testing body-worn cameras in five high-crime precincts. And in Washington, D.C., police could start wearing cameras as soon as October.

  • “Hundreds of police departments across the nation have [police forces] with a white percentage that is more than 30 percentage points higher than the communities they serve,” reports the New York Times.

  • New student surveys, which closely monitor a “large constellation of data” about teachers’ performance, may be gaining traction in the education reform movement.


NSA Telephone Case Inches Forward, Challenging Pre-Internet Legal Doctrine

From Issue 1.51 - By Aaron Rieke

A federal appeals court considered for the first time on Tuesday whether the NSA’s revealing “bulk telephony” database — which includes the times, durations, and phone numbers associated with billions of phone calls — is constitutional. The oral arguments, available on C-SPAN’s website, lasted almost two hours. Although the question will likely end up at the Supreme Court, the Second Circuit’s ruling will provide an important frame.

During oral arguments, one of the central legal questions was: Do Americans have an expectation of privacy in information they turn over to third parties? The answer will resonate far beyond the NSA’s telephone database.

The third-party doctrine says that we abandon our expectation of privacy, and thus many Fourth Amendment protections, when we give businesses certain information. For example, in 1979 the Supreme Court held that the government’s use of a pen register (a device that records the numbers dialed from a particular telephone line) was not a violation of a person’s legitimate expectation of privacy. It explained: “[W]e doubt that people in general entertain any actual expectation of privacy in the numbers they dial. All telephone users realize that they must ‘convey’ phone numbers to the telephone company ….”

But since the advent of the internet, the reach of this old reasoning has grown immensely. And on Tuesday, the judges seemed to appreciate this fact. Judge Gerard Lynch expressed concern that the same logic applies to payment data. “The same third-party argument that you’re making [...] and the same relevance argument that you’re making under the statute apply to banking records and credit card records, don’t they?” he asked. And Judge Robert Sack called into question the third-party doctrine’s modern relevance. “The question is whether the technology hasn’t changed so much that the analysis that it’s just a pen register doesn’t work any more,” he noted.

At the Supreme Court, at least one Justice appreciates the third-party doctrine’s scope. In 2012, in her concurring opinion in United States v. Jones, Justice Sotomayor wrote that the doctrine is “ill suited to the digital age, in which people . . . disclose the phone numbers that they dial or text to their cellular providers; the URLs that they visit and the e-mail addresses with which they correspond to their Internet service providers; and the books, groceries, and medications they purchase to online retailers.”

The Second Circuit panel will release its decision in the coming months.


A Brief Experiment With Powerful Surveillance Software in Boston

From Issue 1.51 - By Aaron Rieke

Attendees of a Boston music festival were unwittingly subjected to a troubling surveillance experiment last year, reports DigBoston. In consultation with IBM, the city tested technology that “analyzes every passerby for height, clothing, and skin color.” Thousands of images were captured and analyzed. (The image below shows some of the same software being used in another city.)

IBM's software being used in another city.

The pilot program was designed to test a suite of “situational awareness software.” Boston official Kate Norton explains: “The purpose of the pilot was to evaluate software that could make it easier for the City to host large, public events, looking at challenges such as permitting, basic services, crowd and traffic management, public safety, and citizen engagement through social media and other channels.”

The Boston Police Department wrote that it was “not part of this initiative.” But DigBoston reports that internal photos show police observing the software during the festival.

For now, Boston is not continuing to use the technology. According to officials, the city recognized that the software presents many challenges including “legal and privacy concerns.” And Boston currently “lack[s] a policy guiding use of this software.”


As Ferguson Police Don Cameras, Civil Libertarians Stress Limits

From Issue 1.51 - By Aaron Rieke

Ferguson police officers began wearing cameras last weekend amid continuing protests over the shooting of Michael Brown. Two companies donated about 50 cameras to the city. Ferguson Police Chief Tom Jackson said that his officers were receptive. “They are really enjoying them,” he said. “They are trying to get used to using them.”

City leaders had committed to deploying the cameras. Elsewhere, body-worn cameras have shown promise, reducing both police use of force and citizen complaints. And in the wake of Brown’s death, demand for them is growing: a petition on the White House website calling for police body cameras has gathered more than 150,000 signatures.

Amid this attention, the ACLU published a blog post arguing that body-worn cameras should not spread beyond law enforcement. Police officers’ “extreme powers and history of abuse” justify careful deployment of cameras, but we should be cognizant of their privacy risks, writes ACLU Senior Policy Analyst Jay Stanley. For example:

Police officers enter people’s homes and encounter bystanders, suspects, and victims in a wide variety of sometimes stressful and extreme situations. . . . Perhaps most troubling is that some recordings will be made inside people’s homes, whenever police enter—including in instances of consensual entry… and such things as domestic violence calls.

However, body-worn cameras are likely to move beyond officers’ vests soon. For example, officials in Miami Beach, Florida, recently announced plans to deploy body-worn cameras to the city’s meter maids and fire inspectors.

“[W]e … have to draw some lines about where it is and is not appropriate to use this technology,” concludes Stanley. But in Ferguson, there are few arguing against it.


Also of Note: September 3, 2014

From Issue 1.51 - By Aaron Rieke
  • “Despite our country’s growing diversity, our public schools provide little contact between white students and students of color,” writes Reed Jordan on the Urban Institute’s MetroTrends blog. His visualizations of data from the U.S. Department of Education show stark divides in schools’ racial composition:

Share of white kids attending majority-white schools

  • A shift in U.S. telecommunications policy — from requiring universal service to allowing companies to decide where to install high-speed internet — is raising concern about whether “residents of poor or underserved neighborhoods will be left behind,” reports the Wall Street Journal.