How to Vet Technology for Diversity

Angela wanted to use the new virtual try-on service so she could see how different lip shades looked on her skin. But when she opened it, a message appeared: “no face detected.” Her friend, who is white, was able to see the different products virtually applied to her skin. But Angela, a Black woman, could not. This is why companies need to vet technology for diversity.

When facial recognition systems were first being created, the grad students building them used pictures of their classmates, who were predominantly white and Asian men. Unsurprisingly, the technology works especially well for those groups. New technology tends to be tested on whoever is most readily available to its creators, leaving room for shortcomings with other populations.

Why do these problems happen?

In Canada, about 3.5% of the population is Black, and within that group, most people have lighter skin tones. That means an algorithm can be 96.5% accurate in Canada and still not work at all for people with especially dark skin. If a technology company simply takes a random sample of a population, its technology will be biased toward the demographics of that population.
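
To make the arithmetic concrete, here is a minimal sketch in Python. The group shares and accuracy figures are invented for illustration (treating the 3.5% figure above as the affected group), but they show how a high overall score can coexist with total failure for a small group:

    # Illustrative numbers only: the population shares and per-group accuracy are assumptions.
    population_share = {"dark-skinned users": 0.035, "everyone else": 0.965}
    per_group_accuracy = {"dark-skinned users": 0.00, "everyone else": 1.00}

    # Overall accuracy is just a population-weighted average of per-group accuracy.
    overall = sum(population_share[g] * per_group_accuracy[g] for g in population_share)

    print(f"Overall accuracy: {overall:.1%}")   # 96.5%
    for group, accuracy in per_group_accuracy.items():
        print(f"  {group}: {accuracy:.0%}")     # 0% vs. 100%

A vendor quoting a single accuracy number is, in effect, quoting this weighted average.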

This becomes an open-ended technical question. What proportion of the dataset should be of Black people? Do you create separate databases for each race and measure performance for each? How do you collect data in a way that includes everyone? Exactly how well does a product need to address these questions before it is ready to bring to market? These questions have no definitive answer or protocol. In a business ecosystem that embraces the expression “move fast and break things,” some companies may not be considering inclusive algorithms at all.
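
There is at least one concrete practice a buyer can ask about: measuring performance separately for each group rather than quoting a single aggregate number. Here is a minimal sketch of that kind of disaggregated evaluation, assuming you have test results tagged with a (hypothetical) group label:

    from collections import defaultdict

    def accuracy_by_group(results):
        """results is a list of (group, correct) pairs, e.g. ("Group B", False).
        Returns overall accuracy plus a per-group breakdown, so a failure confined
        to a small group cannot hide inside the aggregate number."""
        totals, correct_counts = defaultdict(int), defaultdict(int)
        for group, correct in results:
            totals[group] += 1
            correct_counts[group] += int(correct)
        per_group = {g: correct_counts[g] / totals[g] for g in totals}
        overall = sum(correct_counts.values()) / sum(totals.values())
        return overall, per_group

    # Hypothetical test run: the system works for one group and fails for another.
    results = [("Group A", True)] * 193 + [("Group B", False)] * 7
    overall, per_group = accuracy_by_group(results)
    print(f"Overall: {overall:.1%}")            # 96.5%
    for group, accuracy in sorted(per_group.items()):
        print(f"  {group}: {accuracy:.1%}")     # Group A: 100.0%, Group B: 0.0%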

On the one hand, inclusivity has costs. It’s expensive to compile vast datasets that are inclusive of all people, and it’s easier and cheaper to test products solely with people in your network. Doing so can reduce time to market, technical complexity, and ultimately the cost of building a product.

On balance, though, a lack of inclusivity can ultimately be costlier. A company that adopts uninclusive technologies risks branding itself as being more interested in padding its wallet than in making the world a better place. Phrased another way: inclusivity is an essential part of goodwill, and goodwill is an essential part of marketability.

Steps to Vet Technology for Diversity

The technology partners you choose to work with reflect back on your brand. This is especially true for any customer-facing systems. To vet your technology partners, you should:

Check Their Ad Copy

Is diversity something this company appears to care about? On the company’s LinkedIn page, website, or advertisements, do they show images of diverse populations? Do they have any blog posts discussing algorithmic inclusivity or diversity?

Ask About Their Processes

Any company should know how their algorithms perform for people of all races and be able to clearly explain how they are addressing algorithmic bias. Do they have a process they can describe in simple terms? They should be able to tell you where problems may arise and how they are solving them.

Test It

Get a diverse group of peers to try the system and see whether the experience differs in any way. What would ideal behavior look like? If anything doesn’t work, provide feedback to the company so they have the opportunity to improve it.
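
If it helps to keep that test organized, a spreadsheet or a few lines of code is enough. Here is a minimal sketch with hypothetical testers and a single pass/fail field, comparing success rates across groups so any gap becomes concrete feedback for the vendor:

    # Hypothetical log from a diverse group of peers trying a virtual try-on system.
    test_log = [
        {"tester": "A", "skin_tone": "dark",   "face_detected": False},
        {"tester": "B", "skin_tone": "dark",   "face_detected": False},
        {"tester": "C", "skin_tone": "medium", "face_detected": True},
        {"tester": "D", "skin_tone": "light",  "face_detected": True},
        {"tester": "E", "skin_tone": "light",  "face_detected": True},
    ]

    # Success rate per group; a gap like the one below is exactly what to report back.
    for group in sorted({entry["skin_tone"] for entry in test_log}):
        rows = [e for e in test_log if e["skin_tone"] == group]
        rate = sum(e["face_detected"] for e in rows) / len(rows)
        print(f"{group}: {rate:.0%} of {len(rows)} testers detected")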

Be Candid

It can be hard to make sure a product works for everyone. There are 8 billion people in the world, and unless you test with every one of them, you cannot be certain. Your voice and your expectations can help ensure that technology partners take the issue of algorithmic inclusivity seriously.
