Digital Justice

Big data holds promise and problems for the practice of law

By Christianna McCausland

Say you conduct a search for T-shirts on Google. For weeks after, you will likely see ads for T-shirts in your social media feeds. We all know, and to some extent accept, that our digital data is gathered, scraped, sold and exchanged. Being pestered and stalked by a computerized Big Brother is part of life in the 21st century. But for many people, digital profiling carries consequences far more serious than unwanted ads.

“People have a sense that this is creepy or annoying, but the consequences for the upper and middle classes aren’t as dire as they are for those experiencing poverty,” says Michele E. Gilman, Venable Professor of Law and director of the Saul Ewing Civil Advocacy Clinic at the School of Law. “These same algorithms serve as gatekeepers to housing, credit, employment, education and healthcare, and they shape our workplaces.”

Tenant screening is one area where algorithm use is insidious. A tenant could be consistently turned down for housing because a screening algorithm contains inaccurate information. More alarming to Gilman is that the algorithms behind these decisions often operate without transparency. The tenant may not know a screening was conducted, let alone that an algorithm is the reason they were turned down; even if they did know, they would have little to no recourse. Often the tenant’s attorney isn’t aware of the algorithms’ existence either, stymying their ability to effectively advocate for the client.

Gilman recently completed a fellowship at the research organization Data & Society, and she published Poverty Lawgorithms, a guide for poverty lawyers on this topic. She notes that individuals cannot fully protect their data privacy, even if they opt out of things like social media accounts. Data collection is everywhere — in your credit score, in details you drop into a health app that falls outside HIPAA protections, and in state and federal records.

“We don’t have a comprehensive data privacy law like our counterparts in the European Union. It’s fractured,” she states. “We rely on notice and consent and privacy policies that you must accept to utilize a service. But notice and consent is a myth because no one reads those notices, understands them, or can negotiate their terms.”

Algorithms are problematic. They automate tasks for greater efficiency, but they are coded by humans and thus prone to bias. They can scrape data from unreliable sources or misinterpret it. Gilman notes that someone who lost their job due to the pandemic, for example, then fell behind in rent and had an eviction claim filed against them could have that eviction follow them for years, even though the claim was covered by the federal eviction moratorium’s pandemic protections.

Yet big data is not going away. If anything, society’s reliance on automated decision-making is likely to increase. Algorithms make decision-making quicker, easier and cheaper.  

“We need to raise awareness,” says Gilman. “As lawyers, we need to learn to spot obstacles impacting our clients and, once you know this is happening, create a legal strategy to ameliorate the harm being done.”


Lawyers as informed tech consumers

The law school tackled this issue head-on with the creation of Coding for Lawyers, a course on how to write computer programs, and of its Legal Data & Design Clinic (LDDC). While it may seem incongruous to have a coding course in law school, Colin Starger, professor of law, creator of the course and director of the LDDC, says it’s essential for contemporary lawyers to be informed consumers of tech.

“The idea is to give lawyers and law students a deeper understanding of what can and can’t be automated in law,” says Starger, who this year was promoted to associate dean for academic affairs. “It’s important for lawyers to have digital literacy because technology has changed, and will continue to change, how law is practiced.”

Matthew Stubenberg is associate director of legal technology at the A2J Lab at Harvard Law School and a Baltimore Law adjunct professor who teaches the Coding for Lawyers course. For his part, Stubenberg advocates for open data if for no other reason than that “the nefarious actors have the data — the good guys should have access too.” He says that while the law school course is not meant to create software developers, it’s important for lawyers to understand how technology operates.

“If you work for a car company, you don’t need to know how to build a car, but you should understand the basics,” he states. “Law school is the time to delve into the technology divide, not when you’re an associate at a conference table with your new tech client feeling totally overwhelmed.”

Understanding what can be automated in law is also important for students as they look at career development. Many legal tasks are headed toward automation, which could make certain lawyerly roles obsolete. Conversely, law tech is booming, creating opportunities for lawyers looking for nontraditional career paths. Lawyers who are informed about how technology operates and how it can be shaped can play a critical role in legal technology’s future. 

“People think computers are magic, that if an algorithm says something is right, it is,” says Starger. “That’s not true. If you understand how [algorithms] work, you’ll understand their limitations, what they’re good at, what they’re bad at and the dangers of asking them to do a task they’re not good at. You won’t get fooled by people making tall claims, and you can leverage technology for good.”

This is the double-edged sword of algorithms. Despite their flaws, they also hold the power to fuel tremendous positive change. Starger points out that because computers can quickly and efficiently analyze vast data sets, they are an effective way to spot trends, like the underpinnings of structural racism.

“Using big data and computers, you can analyze a social or legal system so you’re not just dealing in anecdotes,” says Starger. “Evidence-driven reform is largely facilitated by quality data from reliable sources.”

Starger saw this play out in his own work helping to end mass incarceration, which includes helping lead the law school’s Pretrial Justice Clinic. He explains that two myths emerged about how to solve the problem: provide relief to nonviolent drug offenders and get rid of private prisons.

“But if you look at the data, a very small percentage of the people in jail are nonviolent drug offenders. The largest single share of people in state prisons are there for violent offenses. Similarly, private prisons, while there’s much to dislike about them, account for less than two to maybe five percent of the prison population,” says Starger.

“That’s important to know because if we’re going to confront this crisis we need to confront our understanding of violence, the sources of it and whether our punitive response makes sense. If you didn’t proceed on data, you would only be addressing the low-hanging fruit, treating nonviolent drug offenders as a major driver of mass incarceration.”

Using algorithms to correct injustice

Starger explains how data has also scored victories in his law clinic work. The Pretrial Justice Clinic, as part of a coalition, was able to change rules to end unaffordable money bail. In a recent legislative session, the Legal Data & Design Clinic submitted testimony in favor of ending the practice of suspending drivers’ licenses as a punishment for unpaid traffic violations. In both cases, data showing these rules’ disproportionate impact on low-income people of color played a key role.

A striking example of data’s ability to move the dial on social justice issues can be found in Stubenberg’s Maryland Expungement website. He built the site in 2015, assuming it would be useful for self-represented users looking to clear their records, but attorneys constitute the bulk of the site’s traffic.  

“The process of expungement is fairly straightforward, but you need to understand eligibility, and there are a lot of forms — it’s a very time-consuming process,” says Stubenberg. “The way the site works is, you type in a case number and a scraper on the back end pulls the information back to the site, runs it through an algorithm to determine if it’s eligible, and then populates all the forms.”
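
In rough terms, the pipeline Stubenberg describes has three steps: fetch the record, test eligibility, populate the forms. The Python sketch below is a hypothetical, heavily simplified illustration of that flow; the record fields, the eligibility rule and the form mapping are invented stand-ins, not the site’s actual scraper logic or Maryland’s actual expungement rules.

```python
# A simplified, hypothetical sketch of the fetch -> evaluate -> populate
# pipeline described above. Field names and eligibility rules are
# illustrative stand-ins, not the Maryland Expungement site's real logic.
from dataclasses import dataclass


@dataclass
class CaseRecord:
    case_number: str
    disposition: str            # e.g. "Nolle Prosequi", "Guilty", "Acquitted"
    offense: str
    years_since_disposition: int


def fetch_case(case_number: str) -> CaseRecord:
    """Stand-in for the back-end scraper; the real site pulls this
    from the court system's public case search."""
    # Hard-coded sample data so the sketch runs without network access.
    return CaseRecord(case_number, "Nolle Prosequi", "Trespassing", 4)


# Non-conviction outcomes are the clearest expungement candidates
# (a toy subset; real Maryland eligibility rules are far more detailed).
NON_CONVICTION_DISPOSITIONS = {"Nolle Prosequi", "Acquitted", "Dismissed"}


def is_expungement_eligible(record: CaseRecord) -> bool:
    """Toy rule: non-conviction outcomes qualify after a waiting period."""
    return (record.disposition in NON_CONVICTION_DISPOSITIONS
            and record.years_since_disposition >= 3)


def populate_petition(record: CaseRecord) -> dict:
    """Map the scraped record onto the fields a petition form needs."""
    return {
        "case_number": record.case_number,
        "disposition": record.disposition,
        "offense": record.offense,
    }


if __name__ == "__main__":
    record = fetch_case("ABC123")
    if is_expungement_eligible(record):
        print("Eligible; form fields:", populate_petition(record))
    else:
        print("Not eligible under these simplified rules.")
```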

The site took a tedious process and greatly expedited it. To date, 118,091 expungements have been executed via the site.

Stubenberg concedes that algorithms have the potential to make problems worse, but he also believes they can make problems go away. Critics often cite algorithms as biased, for example, yet Stubenberg believes they have the power to remove bias. Take judges at sentencing.

“There’s already a bias in [judges] — or it may not even be theirs because they’re dealing with the information fed to them by the prosecutor, police and defense attorneys,” he says. 

Not only is that bias hard to measure, he explains, it is time-consuming, expensive and often ineffective to correct. 

“But if you have an algorithm and you measure that bias, you can tweak that algorithm until it’s more fair,” he continues. “That creates a more equal playing field than if you’re dealing with one judge or police officer who was just having a bad day.”
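
As a toy illustration of what measuring and tweaking an algorithm’s bias can mean in practice, the Python sketch below compares a hypothetical risk model’s favorable-decision rates across two groups (a demographic-parity gap, one common fairness metric) and sweeps a decision threshold to find the setting with the smallest gap. The scores, groups and metric are all invented; real pretrial and sentencing tools involve much harder trade-offs between competing fairness definitions.

```python
# Toy audit: measure a decision rule's group disparity, then adjust it.
# The scores, groups and fairness metric here are hypothetical.

def positive_rate(scores, threshold):
    """Fraction of cases decided favorably at this threshold."""
    return sum(s >= threshold for s in scores) / len(scores)


def parity_gap(group_a, group_b, threshold):
    """Demographic-parity difference: the gap in favorable-decision rates."""
    return abs(positive_rate(group_a, threshold)
               - positive_rate(group_b, threshold))


# Made-up risk scores for two groups (higher score = more favorable).
group_a = [0.81, 0.74, 0.62, 0.55, 0.90, 0.47]
group_b = [0.68, 0.52, 0.43, 0.77, 0.39, 0.58]

# Sweep candidate thresholds and keep the one with the smallest gap.
# (A real audit would track accuracy and other metrics, not just the gap.)
gap, threshold = min((parity_gap(group_a, group_b, t / 100), t / 100)
                     for t in range(30, 91, 5))
print(f"Smallest parity gap {gap:.2f} at threshold {threshold:.2f}")
```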

Systemic reform is a place where Gilman feels data can play a positive role. She was encouraged that, in the last Maryland legislative session, lawmakers moved to automatically expunge non-conviction criminal records. Those arrests, even without a conviction, can haunt individuals as they try to secure housing and employment. She would like to see eviction expungement expanded as well, so tenant screening companies can’t scrape that data. This sort of large-scale law reform is more effective than tackling cases one at a time.

“To really solve systemic harm, you need a systemic solution,” says Gilman. “There’s so much we could do around housing records, coerced debt … there are so many areas where we could nip this at the source, rather than trying to fix it after the individual has been harmed.” 

Systemic reform will also need to be holistic, as numerous algorithms might impact a lawyer’s client. A housing lawyer handling a landlord-tenant case might have a client who is also being harmed by an employment algorithm or an inaccurate credit score. Big data is taking law outside its traditional silos.

Law students today are digital natives. While they don’t need to be software developers to succeed in a technology-driven world, their awareness and intuitive connection with technology can benefit the future of digital justice.

“A lot of lawyers feel they lack the technical expertise to challenge a computer,” says Gilman. “We want people to have a healthy skepticism of computer-generated outcomes and give them the confidence to do this work.”

Christianna McCausland is a writer based in Baltimore.
