Big Data Has Potential to Both Hurt and Help Disadvantaged Communities

In the future, all aspects of daily urban life might be tracked and translated into data points. Local governments and companies are already testing potential uses for this kind of information. According to some watchdogs, without a holistic look at how data collection and algorithms affect different communities, the practice could reinforce already rigid structural barriers to economic and physical mobility.

The Chicago Police Department’s experiments with predictive policing have raised concerns about unfair profiling in black communities. Boston has tested situational awareness software, which uses mass surveillance and face-recognition technology as a safety measure for large-scale assemblies, with search queries that can filter by skin tone on a scale of 1 to 100. Data might also enable a sort of 21st-century redlining, with banks and healthcare companies using informational leverage to deny service to people living in low-income communities.

“A big part of what’s happening across society today is that major institutions are increasingly using computers to make the decisions that shape people’s lives. These processes are often opaque,” says David Robinson of Robinson + Yu, a firm that provides technical expertise to help social justice advocates engage with big data issues and that recently released a report, “Civil Rights, Big Data and Our Algorithmic Future,” pointing to the possible upsides and pitfalls of this information-based future. “People need to feel a sense of empowerment around these algorithmic processes. I think there’s a real cultural tendency to defer to decisions that come from a computer and to feel like if an algorithm has rendered some decision then it must be fair or we can’t understand it or it shouldn’t be scrutinized.”

Intentional discriminatory practices are one thing, but uncovering unintentional discrimination lies in murky and uncharted legal territory, according to Solon Barocas of Princeton’s Center for Information Technology Policy. A recent report that he co-authored studied the disparate impact of big data on vulnerable communities. “We need to be extremely sensitive to the very subtle way that things can produce a disparate impact,” says Barocas. “And having that sensitivity means knowing about the data that you’re working with.”


Barocas’ report cites Boston’s Street Bump as an example. When smartphone users drive over Boston potholes, the widely acclaimed app reports their locations to the city. While the app is inventive, differences in smartphone ownership across Boston’s populations might cause it to unintentionally underserve the infrastructural needs of poorer communities.
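
The mechanism is easy to see in a toy simulation. The Python sketch below uses invented ownership rates and traffic figures, not real Boston data or Street Bump’s actual reporting logic: it assumes two areas with identical road conditions but different smartphone penetration, then counts the reports each would generate.

```python
import random

random.seed(0)

# Hypothetical smartphone-ownership rates by area; the numbers are
# illustrative assumptions, not real data.
OWNERSHIP_RATE = {"wealthier_area": 0.85, "poorer_area": 0.45}
TRUE_POTHOLES = 100      # assume identical road conditions in both areas
TRIPS_PER_POTHOLE = 20   # drivers passing over each pothole

def simulated_reports(area):
    """Count reports, assuming each passing driver files one only if
    they carry a smartphone running the app."""
    rate = OWNERSHIP_RATE[area]
    return sum(
        1
        for _ in range(TRUE_POTHOLES * TRIPS_PER_POTHOLE)
        if random.random() < rate
    )

for area in OWNERSHIP_RATE:
    print(area, simulated_reports(area))

# Despite identical road conditions, the poorer area generates roughly
# half as many reports, so a city ranking repairs by report volume
# would systematically underserve it.
```

The bias here has nothing to do with the road data itself; it comes entirely from who is able to contribute the data in the first place.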

“Historically disadvantaged communities tend to be simultaneously over-surveilled — if you are a part of the welfare system, you have a lot of information being collected by the state at all times — and severely underrepresented, because you might not be an attractive consumer,” says Barocas. Credit scores are a popular example of how people outside the formal economy have a hard time registering enough information to qualify for loans, but new alternative metrics come with their own dangers.

The questions that data miners ask and the way that the results are categorized are extremely important. Barocas brings up an anecdote about Evolv, a San Francisco startup that develops hiring models. In searching for predictors of employee retention, the company found that employees who live farther from call centers were more likely to quit. But because the results could also have an unintentional link to race, Evolv declined to use that information, as a precaution against violating equal opportunity laws.
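
One common yardstick for the legal risk Evolv was avoiding is the “four-fifths rule” from U.S. equal employment opportunity guidance: a selection rate for one group below 80 percent of another group’s rate is treated as evidence of possible disparate impact. The sketch below applies that rule to invented hiring numbers; nothing here comes from Evolv’s actual models.

```python
def selection_rate(selected, applicants):
    """Fraction of applicants in a group who were selected."""
    return selected / applicants

def four_fifths_check(rate_a, rate_b):
    """Return the impact ratio and whether it falls below the EEOC's
    four-fifths (80%) guideline for flagging possible disparate impact."""
    ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
    return ratio, ratio < 0.8

# Invented numbers: hiring outcomes if "distance from the call center"
# were used as a screening feature and happened to correlate with race.
rate_near = selection_rate(60, 100)  # applicants living near the center
rate_far = selection_rate(35, 100)   # applicants living farther away

ratio, flagged = four_fifths_check(rate_near, rate_far)
print(f"impact ratio = {ratio:.2f}, possible disparate impact: {flagged}")
# impact ratio = 0.58, possible disparate impact: True
```

The point of the anecdote is that running a check like this requires knowing which features correlate with protected attributes, which is exactly the sensitivity to the underlying data that Barocas describes.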

“You can use data mining to do something completely different,” Barocas points out. “You could ask ‘If I adjust workplace policies or workplace conditions, might I be able to recruit or retain different people?’” Rather than blindly using data that might unintentionally discriminate, employers can intentionally reverse prior hiring practices that have adversely affected job candidates based on their race, ethnicity, gender and income.


“I think that the lesson of history is that when powerful institutions are designing processes, or when markets are creating new practices in terms of decision-making,” comments Robinson, “we need to check and make sure that they are done in a way that is consistent with human rights and that we shouldn’t be just assuming that they are.”

The future doesn’t have to feel like Minority Report. Armed with expertise on technical issues, civil rights groups and social justice organizations can play an advisory role to companies and governmental institutions wielding these large data sets. Barocas imagines a future where big data is one of the best tools for exposing persistent discrimination.

“What might be interesting is that this really technical thing might be a way of showcasing how it’s actually impossible to avoid having a frank discussion about the acceptability of inequality in society,” he says. “A lot of these techniques will expose the extent of inequality and actually exacerbate it. In a way, it’s an opportunity to have a conversation that many civil rights organizations have always wanted to have, which is that this is not just a matter of conscious prejudice, but actually structural inequality and structural prejudice.”

This article originally appeared in Next City.


