Julia Dressel ’17 and Research That Went Viral

A Dartmouth study questions software accuracy in predicting criminal behavior.

Julia Dressel ’17 is deeply interested in technology and its impact on society. (Photo courtesy of Julia Dressel)

Can a software program predict crime? That’s a question Julia Dressel ’17 and Hany Farid, the Albert Bradley 1915 Third Century Professor of Computer Science, set out to answer.

When their findings, with Dressel as first author, were published earlier this year in the journal Science Advances, media outlets everywhere took notice, and the news went viral.

The media coverage, which took Dressel by surprise, resulted in hundreds of print, online, and broadcast news stories around the globe.

“When I started this research almost two years ago, I never expected to one day do an interview about my work with The New York Times,” says Dressel.

Dressel was also invited by Harvard’s Berkman Klein Center for Internet & Society to speak about her research. She addressed the group on Tuesday, March 6, in a presentation at the Harvard Law School campus.

A double major in computer science and women’s, gender, and sexuality studies, Dressel is deeply interested in technology and its impact on society, particularly in the criminal justice system.

Working with Farid, she developed a senior thesis focused on how technology and racial bias affect the prediction of recidivism. She showed that people with no criminal justice experience, responding to an online survey, performed as well as the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) software system, used by courts to predict criminal behavior. Given only seven facts about each defendant, the survey respondents predicted recidivism correctly in 67 percent of the cases presented, statistically indistinguishable from the 65.2 percent accuracy of COMPAS.
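To make the comparison concrete, here is a minimal sketch, with entirely made-up labels and outcomes rather than the study’s data, of the bookkeeping behind an accuracy figure: each prediction, whether from a survey respondent or from a COMPAS risk score converted to a high/low call, is checked against the observed outcome, and accuracy is simply the fraction of matches.

# A minimal sketch of comparing prediction accuracy. All values below are
# hypothetical and chosen only to illustrate the calculation; they are not
# the study's 67 percent and 65.2 percent results.

def accuracy(predictions, outcomes):
    """Fraction of cases where the predicted label matches the observed outcome."""
    correct = sum(p == o for p, o in zip(predictions, outcomes))
    return correct / len(outcomes)

# 1 = predicted (or observed) to reoffend within two years, 0 = not.
observed_outcomes  = [1, 0, 0, 1, 1, 0, 1, 0]
survey_predictions = [1, 0, 1, 1, 0, 0, 1, 0]   # hypothetical crowd answers
compas_predictions = [1, 1, 0, 1, 1, 0, 0, 0]   # hypothetical "high risk" flags

print(f"survey accuracy: {accuracy(survey_predictions, observed_outcomes):.1%}")
print(f"COMPAS accuracy: {accuracy(compas_predictions, observed_outcomes):.1%}")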

“It is troubling that untrained internet workers can perform as well as a computer program used to make life-altering decisions about criminal defendants,” says Farid.

Dressel worked with Farid for a year and a half. “He was very involved in my research, providing great feedback every step of the way,” she says. “He always had great ideas for the research, and he had the experience and expertise to know that the research had the potential to get published.”

As media calls came in, Farid referred reporters to Dressel. “This helped the reporters take me seriously, and I never felt like they questioned my credibility because this was only undergraduate research,” says Dressel, “but I am not surprised it resonated so strongly with the public. When people hear there is a technology that isn’t trustworthy, they pay attention.”

“We have, over the past few decades, unleashed technology on the world without fully comprehending its negative effects,” says Farid. “Julia brought a combination of strong mathematical and computing skills with a deep understanding and caring of the underlying sociological issues together to carry out her research. I am sure that her work will have an impact on the conversation regarding the wisdom and fairness of relying on computer algorithms to make decisions in the criminal justice system and other arenas.”

“Our research suggests that age and total number of previous convictions are the two most predictive criteria of recidivism used by COMPAS,” Dressel says. “The software is only moderately accurate, so it makes lots of mistakes.

“On a national scale, black people are more likely to have prior crimes on their record than white people, thus black defendants in general are more likely to be classified as medium or high risk. Mathematically, this means that the false positive rate is higher for black defendants than white defendants, leading to racial bias in recidivism prediction,” she says. “This is the perfect example of how technology can reinforce existing systematic inequalities.”
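Her point about base rates can be shown with a short, stylized simulation; the group sizes, the flagging rule, and the reoffense probabilities below are hypothetical and deliberately simplified, not COMPAS’s actual scoring or the study’s data. When a risk flag leans heavily on prior convictions and one group carries more priors on record, a larger share of that group’s non-reoffenders ends up wrongly flagged as high risk, which is the higher false positive rate Dressel describes.

# A stylized illustration (hypothetical numbers only) of how different base
# rates of prior convictions lead to different false positive rates, even
# when both groups are scored by the same rule.

import random

random.seed(0)

def simulate_group(n, prior_rate):
    """Hypothetical group: anyone with priors is flagged high risk; reoffense
    is more likely with priors but far from certain, mimicking a noisy predictor."""
    people = []
    for _ in range(n):
        has_priors = random.random() < prior_rate
        flagged = has_priors
        reoffended = random.random() < (0.6 if has_priors else 0.3)
        people.append((flagged, reoffended))
    return people

def false_positive_rate(people):
    """Share of people who did NOT reoffend but were flagged high risk anyway."""
    flags_for_non_reoffenders = [flagged for flagged, reoffended in people if not reoffended]
    return sum(flags_for_non_reoffenders) / len(flags_for_non_reoffenders)

group_a = simulate_group(10_000, prior_rate=0.5)  # more prior convictions on record
group_b = simulate_group(10_000, prior_rate=0.3)  # fewer prior convictions on record

print(f"false positive rate, group A: {false_positive_rate(group_a):.1%}")  # higher
print(f"false positive rate, group B: {false_positive_rate(group_b):.1%}")  # lower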

Dressel says that her research might not have happened without her background in women’s, gender, and sexuality studies. “Taking those classes helped me understand how to think about problems that affect individuals and how to think critically about the ways that any sort of bias manifests itself,” she says. “This particular thesis happened because of ways that I have been taught at Dartmouth to think about problems that are affecting our society.”

Since graduating last year, Dressel has been working as a software engineer at Apple in California. “I am currently enjoying how much I am learning at Apple, but I am definitely interested in pursuing a career in criminal justice reform. My future may include graduate studies in software engineering or criminal justice or both,” she says.

Joseph Blumberg