
metadata.title = "Algorithmic Bias"
metadata.category = "misc"
metadata.date = "2020-06-05 09:55:42 -0400"
metadata.shortDesc = ""

I am subscribed to Marques Brownlee on YouTube. I watch almost every one of his videos. YouTube is smart. It knows this, and it recommends almost all of his videos to me. But not this one. No matter how many times I refresh the page. No matter how far down the page I scroll. Despite the fact that the video has gotten 2.3 million views in 16 hours, performing better than a number of his recent videos. Despite the fact that it's recommending me videos from people I am not subscribed to, videos that are years old, videos that I have watched before, videos that are about politics, videos that are about the ongoing Black Lives Matter protests in the wake of George Floyd's murder.

This is what algorithmic bias looks like. Algorithms are not neutral.[^1]

*Screenshot: the thumbnail of the MKBHD video titled "Reflecting on the Color of My Skin".*

  1. "Algorithm" is a word here used not in the purely computer science sense, but to mean a element of software which operates in a black box, often with a machine learning component, with little or no human supervision, input, or control. ↩︎