Knowing the Score: New Report Offers Tour of Financial Data, Underwriting, and Marketing

From Issue 1.59 - By Aaron Rieke

We here at Robinson + Yu have just released a new report, Knowing the Score, which provides a guided tour of the complex and changing world of credit scoring. It’s designed to be the “missing manual” for policy professionals seeking to better understand technology’s impact on financial underwriting and marketing.

The word “scoring” is used a lot these days. For example, a widely quoted New York Times story described a new crop of “consumer evaluation or buying-power scores . . . [which are] highly valuable to companies that want—or in some cases, don’t want—to have you as their customer.” A recent report from Privacy International inventoried a variety of “consumer scores”—such as measurements of online social media influence. And industry regulators have acknowledged a “big fuzzy space” between how different kinds of financial assessments are viewed by the law.

We were left with many questions: What are the legal and practical differences between a “credit score” and a “marketing score”? Are credit scoring companies that rely on social networking data reliable? Should new forms of payment information (such as cable and utility bills) be sent to credit bureaus? Can new scoring methods bolster financial inclusion?

Our report addresses all of these questions, providing historical and legal context along the way.

Knowing the Score, Figure 1

The key takeaways are:

  • Financial advocates should seriously consider advancing the inclusion of “mainstream alternative data” (such as regular bill payments) into credit files. This new data, which often goes unreported today, could allow credit scores to be calculated for more people, enhancing access to the mainstream financial system. However, the impact of this new payment information on credit scores is hard to analyze without access to proprietary credit bureau data. Thus, we encourage further collaboration and transparency between advocates and industry. We also emphasize that utility payment data carries special risks: it must be reported carefully so as not to interfere with state consumer protection laws.

  • The predictiveness and fairness of new credit scores that rely on social network data and other nontraditional data sources (including, for example, how quickly a user scrolls through a terms of service document) are not yet proven. We predict that to the extent these new methods are actually adopted, they may struggle to comply with fair lending laws.

  • Today’s most widely used credit scoring methods (such as the approach used by FICO) are fair in the sense that they accurately reflect individuals’ credit risk across racial groups. Many studies have documented huge differences in average credit scores between racial groups. But the best available evidence, a 2007 study conducted by the Federal Reserve, indicates that mainstream scoring models themselves are not biased — that is to say, they accurately predict individual credit risk for individuals of all races. This means that racial differences in average credit scores are a map of real, underlying inequalities, rather than a quirk of the scoring system. It also confirms that credit scores can be a powerful yardstick by which to measure the fairness of particular financial products and practices.

  • Marketing scores, built by credit bureaus from aggregated credit report data, can be used to target advertisements and change the appearance of websites as individuals navigate the web. These marketing scores, computed on a household or block level, segment individuals based on their financial health. They can come within a hair’s breadth of identifying a person, which would subject them to the Fair Credit Reporting Act, but they appear to be operating just outside the scope of that law. Unfortunately, technological constraints make it difficult to understand through outside observation what effect these scores are having. We urge regulators to play a fact-finding role to learn more about how this data is used.

Also of Note: October 29, 2014

From Issue 1.59 - By Aaron Rieke
  • “[T]here has never been more confusion about what the term [anonymity] means,” reports the Wall Street Journal, explaining how new mobile apps often collect more data than meets the eye.

  • California Highway Patrol officers have been sharing explicit photos of female suspects for years as part of a ‘game,’ says an officer.

  • New America’s Open Technology Institute just released Data and Discrimination: Collected Essays, which claims that “digitally automated systems may be adding to [discrimination] in new ways.”

  • Powerful government authorities granted by the Patriot Act — known as “sneak and peek” warrants — are often used for purposes other than terrorism, explains the EFF.

FBI Director Argues Need to Pierce Encryption

From Issue 1.58 - By Aaron Rieke

Law enforcement could be left “in the dark” as more people turn to encrypted, digital communications, warned the Director of the FBI, James B. Comey, in a speech last week. Catching kidnappers and child pornographers becomes harder, he argued, when law enforcement lacks the technical ability to conduct investigations despite legal authority to do so.

The FBI is asking the public to rely on policy, rather than technology, to ensure lawful surveillance. Google and Apple recently moved in the opposite direction, announcing that their mobile operating systems would encrypt users’ data by default (such that only a user’s passcode would allow access to data on a device). But Comey argued companies shouldn’t “throw away the key.” Instead, he suggests, they should preserve the ability to respond to law enforcement requests.
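The design Comey objects to can be sketched in a few lines: when the encryption key is derived from the user’s passcode, no vendor holds a copy to hand over. This is a simplified illustration using Python’s standard library, not Apple’s or Google’s actual scheme:

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    """Stretch a passcode into a 256-bit key with PBKDF2.
    Only someone who knows the passcode can re-derive the key."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

salt = os.urandom(16)  # stored on the device in the clear
key = derive_key("1234", salt)

# The same passcode always re-derives the same key...
assert derive_key("1234", salt) == key
# ...while a different guess yields a different key, so there is
# no master copy for the vendor to surrender to law enforcement.
assert derive_key("0000", salt) != key
```

Because the key exists only when the passcode is entered, complying with a data request would require the vendor to weaken this design — which is precisely the policy change Comey is asking for.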

This debate highlights a tension between investigating crimes and ensuring security and privacy. If a company can access a user’s data, it will inevitably be more vulnerable to hackers and spy agencies (both foreign and domestic). Thus, in the wake of Snowden’s revelations, companies are racing to provide better security, often in the form of encryption that they can’t feasibly reverse. “Just as people won’t put their money in a bank they won’t trust, people won’t use an Internet they won’t trust,” said Brad Smith, Microsoft’s general counsel.

The ACLU criticized Comey’s speech, claiming that he is “wrong in asserting that law enforcement cannot do its job while respecting Americans’ privacy rights.”

Comey is right in at least one respect: investigating crimes might be more difficult in a world where most consumer devices and communications are encrypted by default (a trend we are beginning to see). Unfortunately, most solutions seem to come at a steep cost to security.

“My goal today isn’t to tell people what to do,” Comey said. “My goal is to urge our fellow citizens to participate in a conversation as a country about where we are, and where we want to be, with respect to the authority of law enforcement.”

Personality Tests’ Fairness, Effectiveness Questioned

From Issue 1.58 - By Aaron Rieke

Employers are increasing their use of personality tests to hire for customer-service jobs, reports the Wall Street Journal. The tests — which might prompt applicants to say if they “experience mood changes” or “are always cheerful” — are said to help reduce attrition and predict performance. But some are questioning the tests’ fairness and effectiveness.

The Equal Employment Opportunity Commission (EEOC) is investigating whether personality tests might discriminate against people with mental illness, such as depression, even if they have the right skills for the job. “Employers are watching the investigation closely,” reports the Journal.

Employers have already scaled back testing variables in other contexts. For example, Xerox stopped considering applicants’ commuting time in its employment test because it worried that the data might put applicants from minority neighborhoods at a disadvantage. “There’s some knowledge that you gain that you should stay away from when making a hiring decision,” said Teri Morse, Xerox’s vice president of recruitment.

The overall effectiveness of personality tests is unclear. Some companies are pleased with the results. For example, Xerox claims that it is “shocked all the time” by tests’ accuracy. But others are more skeptical. According to Fred Morgeson, a management professor and organizational psychologist, the link between job performance and personality is “much lower than the field has led us to believe.”

Test vendors keep their formulas a secret. And at least one is actively resisting the EEOC’s efforts to acquire its validity tests.

ALPR Data Request Sent to California Court of Appeal

From Issue 1.58 - By Aaron Rieke

The Electronic Frontier Foundation (EFF) recently petitioned a California Court of Appeal to rule that the public has a right to know how Los Angeles police are using automatic license plate readers (ALPRs).

The petition — essentially an appeal of a judge’s refusal to compel disclosure of ALPR data collected during a week in Ramadan — argues that knowing how police technologies are used is a vital prerequisite to an informed, public debate.

Los Angeles police are among the biggest gatherers of ALPR data. ALPRs are high-speed cameras that photograph vehicles. These images are automatically time-stamped, tagged with a location, analyzed for a license number, and stored in a database. Over time, ALPRs can create a detailed history of a person’s travels and habits.
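The pipeline described above is, at its core, an append-only log of time-stamped, geotagged plate reads. A minimal sketch (with hypothetical data and field names) shows how readily such a log becomes a travel history:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE reads (plate TEXT, ts TEXT, lat REAL, lon REAL)")

# Each camera pass appends one time-stamped, geotagged record.
reads = [
    ("7ABC123", "2014-10-01T08:02", 34.05, -118.24),
    ("7ABC123", "2014-10-01T18:31", 34.09, -118.36),
    ("5XYZ789", "2014-10-01T09:15", 34.02, -118.28),
]
db.executemany("INSERT INTO reads VALUES (?, ?, ?, ?)", reads)

# Querying a single plate reconstructs that vehicle's movements over time.
history = db.execute(
    "SELECT ts, lat, lon FROM reads WHERE plate = ? ORDER BY ts",
    ("7ABC123",),
).fetchall()
print(history)  # two sightings, in chronological order
```

No individual read is especially revealing; it is the accumulation and queryability of the database that produces the detailed personal histories the EFF is concerned about.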

ALPRs can be used to scan and record vehicles at a lawful protest or house of worship; track all movement in and out of an area; gather information about certain neighborhoods or organizations; or place political activists on hot lists so that their movements trigger alerts.

The EFF highlighted the fact that public debates about ALPRs have been triggered in other cities, including Minneapolis and Boston, following news reports.

Without public access to information about how ALPR technology is being used—including the raw ALPR data from a limited time period—the very people whose whereabouts are being recorded cannot know if their rights are being infringed nor challenge policies that inadequately protect their privacy.

The petition also addresses some important legal issues, including whether indiscriminate surveillance data rightly counts as “records of investigation” that might be withheld under California’s Public Records Act, and whether disclosing anonymized ALPR records presents any meaningful risks to individuals or police activities.

Also of Note: October 22, 2014

From Issue 1.58 - By Aaron Rieke
  • 40 percent of adult Internet users have experienced harassment online, according to a new study by the Pew Research Center. Young women report experiencing “severe types of harassment” at “disproportionately high levels.”

  • Some Virginia police departments are “secretly and automatically sharing criminal suspects’ telephone metadata and compiling it into a large database,” reports Ars Technica.

Foundations Donate Police Tech at Expense of Transparency

From Issue 1.57 - By Aaron Rieke

Private foundations provide police departments with technologies that receive less scrutiny than those purchased with public money, reported ProPublica this week. Large charities have donated controversial surveillance equipment, including “hundreds of thousands of dollars’ worth of license plate readers” and upgrades to “stingray” devices that vacuum up signals from mobile phones.

These donations may be yet another challenge to the accountable deployment of police technologies. The gifts are often hard to track, and they allow police to accumulate tools that run far ahead of existing policies. (We’ve written extensively on both ALPRs and stingrays.)

Police foundations have been around for decades. The first was established in New York in 1971. Today, similar organizations have cropped up in “dozens of jurisdictions, from Atlanta, Georgia, to Oakland, California.” Some foundation activities are uncontroversial, like providing medical kits to treat gunshot wounds. But recently, the largest foundations have begun supporting “technology initiatives” that include surveillance tools. In Atlanta, for example, a police foundation “bankrolled the surveillance cameras that now blanket the city, as well as the center where police officers monitor live video feeds.”

The foundations’ gifts can be difficult to vet and monitor. “At least with public contracts and spending, there’s a facade of transparency and accountability,” said Ana Muniz, who has studied the LAPD’s gang policing efforts. “With private partnerships, with private technology, there’s nothing.” There is also concern about conflicts of interest: it’s not uncommon for companies to donate to the foundations that purchase their products, or to be contractors for associated police departments.

Online Payday Loans Often Worse than Those Offered by Storefronts, Says New Report

From Issue 1.57 - By Aaron Rieke

Online payday loans are often worse than those provided by storefront lenders, explains a new study by Pew, one of the first formal analyses of internet payday lending.

Today, about one-third of all payday loans originate on the web. Online lenders’ revenue has tripled since 2006, even though they incur high loss rates.

Online loans differ from their offline counterparts in several important ways. They often rely on direct, electronic access to borrowers’ bank accounts (whereas offline payday loans tend to rely on postdated checks). They may offer nontraditional repayment structures that are difficult for borrowers to understand. And they tend to come with higher fees — almost double those of offline loans, by some analysts’ estimates.

From a borrower’s perspective, online loans can have notable detrimental effects:

  • One-third of online loans use payment structures that do not reduce their principal. This encourages long-term indebtedness.
  • Online borrowers reported abuse at rates significantly higher than offline borrowers. Online lenders often threatened “contacting family, friends, or employers, and arrest by the police.”
  • More than a third of online borrowers reported that their personal or financial data was sold without their consent. These disclosures are likely tied to the practices of “lead generators” — websites that collect applicants’ personal information to sell to lenders. Lead generators sell these lists widely.
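The arithmetic behind the first bullet is worth spelling out. With hypothetical numbers (a $300 loan at a $25-per-$100 fee, renewed every two weeks), a fee-only structure means each payment covers only the finance charge, so the principal never shrinks:

```python
def fee_only_cost(principal: float, fee_per_100: float, renewals: int) -> float:
    """Total paid when each payment covers only the finance fee,
    leaving the full principal due at the end."""
    fee = principal * fee_per_100 / 100
    return renewals * fee + principal

# Renewing a $300 loan five times at $25 per $100 borrowed:
total = fee_only_cost(300, 25, 5)
print(total)  # 675.0 — $375 in fees, and the $300 principal still owed
```

Each renewal costs $75 without retiring a cent of principal, which is why these structures tend to trap borrowers in long-term debt.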

Pew’s report was not universal in its criticism, and noted that a handful of online lenders “are the subject of few complaints” and have “called for raising industry standards.” But on the whole, it’s clear that online lending still has a long way to go.

Also of Note: October 15, 2014

From Issue 1.57 - By Aaron Rieke
  • The government removed seven Americans from the no-fly list, complying with a federal judge’s previous ruling that processes to challenge placement on the list were “wholly ineffective.”

  • The Justice Department is reviewing an incident in which the DEA created a counterfeit Facebook profile of a woman without her consent.

  • In Los Angeles, a coalition of public and private groups has earmarked more than $200 million to improve computerized systems that connect the homeless population with services. “If you can imagine being at an airport, and one of your flights get canceled, and then you’re just running from gate to gate to gate,” said Chris Ko of United Way, “that’s the experience homeless people are going through.”

  • Some advocates claim little has changed regarding surveillance of Muslims in New York City. “If you weren’t following New York City politics and didn’t know we had an election last year, you wouldn’t see a difference in the response around surveillance, from the last administration to this administration,” said Linda Sarsour, executive director of the Arab-American Association of New York.

Third Circuit Gives Police Leeway on Surveillance Tech

From Issue 1.56 - By Aaron Rieke

Last week, the Third Circuit Court of Appeals ruled that police may be allowed to use evidence collected using technologies that are not subject to settled law.

The case, U.S. v. Katzin, focused on the “exclusionary rule,” which seeks to deter law enforcement officers from violating the Fourth Amendment by making evidence derived from an illegal search or seizure unavailable at trial. The rule tries to balance protection of civil liberties with the social cost of suppressing truthful evidence. Thus, when police act in good faith, the rule does not apply, even if they make a mistake.

Police are already treading on new ground with facial recognition, mesh networks, and drones. Use of these technologies, and many others, will outrun courts and state legislatures. Thus, the boundaries of the exclusionary rule — what qualifies as a “good guess” regarding the legality of cutting-edge surveillance techniques — will become increasingly important.

In the case, law enforcement tracked electrician Harry Katzin’s van with a GPS device and without a warrant. But the tracking took place in 2010, before the Supreme Court ruled that such activities violated the Fourth Amendment. Prosecutors sought to use the GPS evidence at trial, arguing that the good faith exception applied. The ACLU summarized:

[T]he Department of Justice is basically saying, “Ok, we won’t appeal the substance of your ruling — this warrantless GPS tracking violated the Fourth Amendment. But how could the police officers have known this at the time?”

The Third Circuit ruled that the police acted reasonably, even though they did not rely on binding judicial precedent. But a number of judges dissented, arguing that police should have stronger incentives to act conservatively.

The essence of the majority’s holding is that any time a course of conduct by the police, particularly regarding technological advancements, has not been tested or breaks new ground, law enforcement will be entitled to the good faith exception.

Although the judges disagreed about how much leeway to offer, even the more permissive judges urged caution. “[A]fter Jones, law enforcement should carefully consider that a warrant may be required when engaging in such installation and surveillance.”