```
metadata.title = "Algorithmic Bias"
metadata.tags = ["misc", "social media"]
metadata.date = "2020-06-05 09:55:42 -0400"
metadata.shortDesc = ""
```

I am subscribed to Marques Brownlee on YouTube. I watch almost every one of his videos. YouTube is smart. It knows this; it recommends me almost all of his videos. But not this one. No matter how many times I refresh the page. No matter how far down the page I scroll. Despite the fact that the video has gotten 2.3 million views in 16 hours, performing better than a number of his recent videos. Despite the fact that it's recommending me videos that are from people I am not subscribed to, videos that are years old, videos that I have watched before, videos that are about politics, videos that are about the ongoing Black Lives Matter protests in the wake of George Floyd's murder.

This is what algorithmic bias looks like. **Algorithms are not neutral.**[^1]

<figure>
<img src="<%= metadata.permalink %>/youtube_thumb.png" alt="YouTube thumbnail of an MKBHD video">
<figcaption>A screenshot of the thumbnail for a YouTube video from MKBHD titled "<a href="https://www.youtube.com/watch?v=o-_WXXVye3Y" data-no-link-decoration>Reflecting on the Color of My Skin</a>".</figcaption>
</figure>

[^1]: "Algorithm" is used here not in the purely computer science sense, but to mean an element of software which operates in a black box, often with a machine learning component, with little or no human supervision, input, or control.