
Designing Less Biased Products & Algorithms

Round Editorial

Building responsible and equitable algorithms is one of the toughest challenges technologists face today. Expert Cathy O'Neil explains why and what to do about it.

Ryan Fuller, Co-Founder and CEO at Round, sat down with Cathy O’Neil, a Harvard and MIT-trained mathematician, algorithmic auditor, and author of the New York Times Bestseller Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.

Fuller and O’Neil chatted about the power and inherent dangers of predictive algorithms; how to design algorithms that are both efficient and fair using an ethical matrix created by her algorithm-auditing organization, ORCAA; what responsibility tech companies have when things go wrong; and more. Here are five key takeaways from their hour-long conversation:

1. The most problematic predictive algorithms touch everyday aspects of our lives

The basic mechanics of an algorithm are fairly simple (and sound innocuous). A data scientist writes a set of rules that tell a computer how to interpret a set of data. Then the data scientist feeds data into the algorithm, and suddenly you have what seems like an accurate predictor of future success. But as we’ve seen again and again, predictive algorithms that touch very basic functions, such as who gets a mortgage or who gets a raise, have evolved into “weapons of math destruction.”
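To make those mechanics concrete, here is a minimal sketch in Python of what such a pipeline can look like. The mortgage-approval scenario, the column names, and the data are invented for illustration; they are not from the conversation, and real systems are far more complex.

```python
# A minimal sketch of the mechanics described above, using hypothetical
# mortgage-approval data. Columns and labels are invented for illustration.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Historical decisions become the training labels. If those past decisions
# were biased, the model learns to reproduce that bias.
history = pd.DataFrame({
    "income":        [40_000, 85_000, 52_000, 120_000, 38_000, 95_000],
    "credit_score":  [620, 710, 680, 760, 600, 730],
    "zip_code_risk": [0.8, 0.2, 0.5, 0.1, 0.9, 0.3],  # proxy features can encode bias
    "approved":      [0, 1, 0, 1, 0, 1],              # past human decisions
})

model = LogisticRegression()
model.fit(history[["income", "credit_score", "zip_code_risk"]], history["approved"])

# The model now looks like an objective predictor of "who should get a mortgage,"
# but it is really predicting what past decision-makers would have done.
applicant = pd.DataFrame({"income": [45_000], "credit_score": [640], "zip_code_risk": [0.7]})
print(model.predict_proba(applicant))
```

The point of the sketch is that the “prediction” is a reproduction of past human decisions, which is exactly where historical bias enters.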

Reflecting on the five years since Weapons of Math Destruction was released, Cathy believes we’ve made progress: many consumers no longer believe that algorithms are scientific instruments to be trusted blindly. There’s more work to do, though, since many algorithms (mainly B2B ones) remain unchecked and unvetted.

2. Humans are imperfect, but they can create fairer and more equitable algorithms

Humans and algorithms have one thing in common: they’re imperfect. The difference is that even when a human has decent internal values, an algorithm isn’t designed to operate on those beliefs. Out of the box, it operates on historical data, which tends to be biased. So what can you do to create fairer and more equitable algorithms? Cathy dives into her ethical matrix framework:

The goal of an ethical matrix is to identify all of an algorithm’s stakeholders, both internal and external (that is, everyone who will be impacted by the algorithm one way or another), and to have a meaningful conversation about how algorithms aren’t inherently equitable and, more importantly, what it would mean to each stakeholder for the algorithm to succeed or fail. By building and implementing tests, you can see the extent to which a problem is actually happening, which gives you an opportunity to mitigate it. Learn more at orcaarisk.com.
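To illustrate what “building and implementing tests” can look like in practice, here is a heavily simplified sketch: it computes an algorithm’s outcome rates per stakeholder group and flags large gaps. The groups, the data, and the 0.8 threshold (the common “four-fifths” rule of thumb) are assumptions made for illustration; this is not ORCAA’s actual matrix or methodology.

```python
# A simplified sketch of one kind of bias test: compare an algorithm's
# approval rates across stakeholder groups. Groups, data, and the 0.8
# threshold are illustrative assumptions.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: list of (group, approved) pairs -> approval rate per group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_flags(rates, threshold=0.8):
    """Flag any group whose rate falls below threshold * the best group's rate."""
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}

# Hypothetical audit data: (stakeholder group, algorithm's decision)
decisions = (
    [("group_a", True)] * 80 + [("group_a", False)] * 20
    + [("group_b", True)] * 55 + [("group_b", False)] * 45
)

rates = selection_rates(decisions)
print(rates)                          # {'group_a': 0.8, 'group_b': 0.55}
print(disparate_impact_flags(rates))  # {'group_a': False, 'group_b': True}
```

A test like this doesn’t fix anything by itself, but it turns a vague worry about fairness into a measurable gap that stakeholders can discuss and mitigate.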

3. There’s an opportunity to create a professional society for data scientists

What can data scientists do when they see something, or are asked to do something, they find immoral? Right now, not a whole lot beyond relying on whistleblower protections. While Cathy believes that a version of the Hippocratic Oath for data scientists would be too vague, she does think there’s an opportunity to form a professional society, like lawyers have with the American Bar Association or electrical and electronics engineers have with the IEEE.

Sure, professional societies are far from a panacea in any industry, but data scientists should not have to choose between their morals and their careers. As Cathy says, though, this is a complicated two-sided conversation. “Nobody backs us up when we say we shouldn’t do something,” she adds. “But we also never get in trouble for doing something.”

4. Tech companies have a responsibility to anticipate what could go wrong

It’s no secret that success at Facebook is defined by profit: the longer you stay on the platform and the more things you click on, the more money Facebook makes. Hence, Facebook’s algorithm emphasizes and prioritizes content that divides us, anchors us, and makes us fight, and minimizes the kind of content that makes us realize we’re wasting our time on the platform. But is it possible for Facebook to optimize for something else, perhaps for truth or decency?

“Every time you hear someone like Mark Zuckerberg or one of those guys say that artificial intelligence is going to fix the problems, you should rest assured that it’s not.”

5. Algorithms are replacing too many important conversations that we should be having with each other

Let’s say that you’re a manager at a big tech company and you’re in charge of performance reviews. To determine who’s “productive,” you use a pre-existing HR algorithm that defines a productive employee as someone who gets high marks during their year-end performance review cycle. On paper, that might make sense—until you consider the biases that exist in the performance review process.

If we accept an algorithm as fact, we end up skipping some really important questions about the biases that may exist across an entire organization. Algorithms can make important business decisions easier (and faster), but when we trust them blindly, we ultimately end up ignoring human problems.
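As a toy illustration of how review-process bias propagates, the sketch below simulates two groups with identical true productivity but systematically shifted review scores. Any algorithm that defines “productive” by review score will then rank one group lower. All names and numbers are invented assumptions, not data from the conversation.

```python
# Toy simulation: two groups with identical true output, but one group's
# review scores are shifted down by an assumed reviewer bias. A model that
# labels "productive" by review score inherits that shift.
import random

random.seed(0)

def simulate_employee(group):
    true_output = random.gauss(100, 10)       # same distribution for both groups
    penalty = 5 if group == "group_b" else 0  # assumed reviewer bias
    review_score = true_output - penalty + random.gauss(0, 3)
    return group, true_output, review_score

employees = [simulate_employee(g) for g in ("group_a", "group_b") for _ in range(1000)]

def mean(values):
    return sum(values) / len(values)

for g in ("group_a", "group_b"):
    outputs = [o for grp, o, _ in employees if grp == g]
    reviews = [r for grp, _, r in employees if grp == g]
    print(g, "true output:", round(mean(outputs), 1), "review score:", round(mean(reviews), 1))

# group_b's review scores trail group_a's even though true output is the same,
# so ranking "productivity" by review score reproduces the reviewers' bias.
```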
