Frank Pasquale unravels the new machine age of algorithms and bots
Evan Selinger   Feb 1, 2015   csmonitor.com  

In his book “The Black Box Society,” Pasquale exposes secret algorithms behind the scenes of corporate America.


Slate recently said Frank Pasquale's new book, "The Black Box Society: The Secret Algorithms That Control Money and Information," attempts to "come to grips with the dangers of 'runaway data' and 'black box algorithms' more comprehensively than any other book to date."

I recently spoke with Pasquale about his new book and about how algorithms play a major role in our everyday lives — from what we see and don't see on the Web, to how companies and banks classify consumers, to influencing the risky deals made by investors.
 
Edited excerpts follow.

Selinger: What's a black box society?

Pasquale: The term ‘black box’ can refer to a recording device, like the data-monitoring systems in planes, trains, and cars. Or it can mean a system whose workings are mysterious. We can observe its inputs and outputs, but can’t tell how one becomes the other. Every day, we confront these two meanings. We’re tracked ever more closely by firms and the government. We often don’t have a clear idea of just how far this information can travel, how it’s used, or its consequences.

Selinger: Why are secret algorithms so important to the story you’re telling?

Pasquale: Sometimes there are runaway algorithms, which, by themselves, take on very important decisions. They may become even more important in the future. For example, autonomous weapon systems could accidentally trigger skirmishes or even wars, based on misinterpreted signals. Presently, algorithms themselves cause problems or snafus that are not nearly as serious but still foreshadow much more troubling developments. Think of the uncontrolled algorithmic trading that led to a major stock market disruption in 2010, or nearly destroyed the firm Knight Capital. Similar technology is now used by small businesses, with occasionally devastating results, as the program "The Spark" at the CBC recently pointed out. Credit scores can also have a direct, negative impact on individuals, without them knowing the basis for sharp changes in their scores.

But one thing I emphasize in the book is that it’s not only – and often not primarily – the algorithms, or even the programmers of algorithms, who are to blame. The algos also serve as a way of hiding or rationalizing what top management is doing. That’s what worries me most – when “data-driven” algorithms that are supposedly objective and serving customers and users, are in fact biased and working only to boost the fortunes of an elite. 

Selinger: Are you talking about people diffusing power and masking shady agendas through algorithms and hoping they won’t get caught? Or are you suggesting that delegating decisions to algorithms is a strategy that actually immunizes folks from blame?

Pasquale: I think both things are happening, actually. There are people at the top of organizations who want to take risks without taking responsibility. CEOs, managers, and others can give winks and nudges that suggest the results they want risk analysts and data scientists to create. Algorithmic methods of scoring and predictive analytics are flexible, and can accommodate many goals. Let’s talk about an example that’s currently being litigated. It concerns a ratings agency that exaggerated the creditworthiness of mortgage-backed securities.

One key part of the case comes down to whether 600,000 mortgages should have been added to a sample of 150,000 that clearly served the interests of the firm's main clients, but was increasingly less representative of the housing market. It turns out that once the sample increases, the loans at the heart of the housing crisis start to look more risky. And so here's the problem. By avoiding the data, you can give AAA ratings to many more clients who pay top dollar for the ratings than you'd be able to if you used more accurate information.
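The sampling issue Pasquale describes can be sketched in a few lines. The numbers below are invented for illustration (the case itself only specifies the sample sizes, 150,000 and 600,000 loans); the point is simply that an estimate computed on a narrow, unrepresentative sample can look far safer than one computed on the full data.

```python
# Hypothetical illustration of the sampling problem: the estimated
# default rate depends heavily on which loans are in the sample.
# The default rates (2% and 8%) are invented; only the sample sizes
# (150,000 and 600,000) come from the case as described.

def default_rate(loans):
    """Fraction of loans in the sample that defaulted (1 = default)."""
    return sum(loans) / len(loans)

# Narrow sample: 150,000 loans with a hypothetical 2% default rate.
narrow_sample = [1] * 3_000 + [0] * 147_000

# The 600,000 excluded loans: riskier, with a hypothetical 8% default rate.
excluded_loans = [1] * 48_000 + [0] * 552_000

print(f"Narrow sample default rate: {default_rate(narrow_sample):.3f}")
print(f"Full sample default rate:   {default_rate(narrow_sample + excluded_loans):.3f}")
```

With these made-up figures, the narrow sample suggests a 2% default rate while the full data shows 6.8% — the same mechanism by which excluding loans lets riskier securities earn higher ratings.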


Evan Selinger is Associate Professor of Philosophy and MAGIC Center Head of Research Communications, Community & Ethics, both at Rochester Institute of Technology. Evan publishes extensively in the areas of philosophy of technology, privacy, and ethics/policy of science and technology. To enhance public debate about ethics, Evan regularly supplements his peer-reviewed scholarship with outreach articles in places like The Atlantic, Wired, Slate, Forbes, The Wall Street Journal, and The Nation.



