A while back we looked into why a Google image search for "unprofessional hair" gave distinctly different results from a search for "professional hair". While others who examined the same exercise put the blame solely on Google's search algorithm for being "racist", a deeper look into the search results revealed something else.

In this specific case, the image search results that Google's algorithm returned had more to do with many of the articles' authors mentioning "unprofessional hair" when talking about black women's hair. Added to that, as you'll read in the article, most of those articles were written by black women. It's important to understand that in this case, keywords adjacent to an image or in its caption, not necessarily the "content" of the image itself, determine which images Google returns as part of search results.

Google image search for "unprofessional hair"

With Artificial Intelligence (along with Machine Learning) being one of the trending topics in 2018, it becomes important to understand how AI makes decisions. At a very basic level (I am simplifying), training data is used to train the "machine" (i.e. software) to learn and make decisions. As with most things, the quality of the training data, or input data, determines the quality of the outputs, the decisions the machine makes. The old saying applies here: Garbage In, Garbage Out (GIGO).
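To make the GIGO point concrete, here is a minimal, hypothetical sketch (not Google's actual system) of a toy keyword-matching "classifier". The captions and labels are invented for illustration; the point is that the model can only echo the associations present in its training data.

```python
from collections import Counter

# Hypothetical "training data": image captions paired with a label.
# The labels reflect how authors described the images, not what
# the images actually contain.
training_data = [
    ("afro natural hair", "unprofessional"),
    ("braids natural hair", "unprofessional"),
    ("straight sleek hair", "professional"),
    ("straight tidy hair", "professional"),
]

def predict(query):
    """Label a query by majority vote over training examples that
    share at least one word with it (a crude keyword match)."""
    words = set(query.split())
    votes = [label for caption, label in training_data
             if words & set(caption.split())]
    return Counter(votes).most_common(1)[0][0] if votes else None

# The "machine" simply reproduces the biased captions it was fed:
print(predict("natural"))  # unprofessional
print(predict("sleek"))    # professional
```

Nothing in the code is "racist"; the skewed output comes entirely from the skewed input data, which is exactly what GIGO describes.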

Training data becomes even more important when we start talking about high-risk machine activity such as self-driving cars. As Daniel Mwesigwa previously highlighted on iAfrikan, imagine a driverless car that was "trained" on roads in San Francisco trying to navigate Kampala peak-hour traffic on a road full of potholes and matatu drivers who don't obey any traffic rules. It thus becomes important that technology companies consider diversity in their AI divisions: not just racial diversity, but diversity in experience, geography and location knowledge, language, and so on.

In some cases, as we move into an age where software makes safety-related (e.g. policing) decisions, a bad AI-driven decision could be a matter of life and death.

This article first appeared on 14 May 2018 in the iAfrikan Weekly Digest Newsletter, a Pan Afrikan weekly digest of the most important stories of the week which includes insights and analysis on the most topical story of the week.