My daughter and I had the talk -- about algorithms
Talk to your kids about algorithms, before it's too late.
INT. HOME DEPOT — MORNING
Narrow, harshly lit, poorly stocked “Planters” aisle in a city-sized Home Depot.
A father — tanned, stocky, menacing, early forties — scans the limited selection of sure-to-break-within-a-year garden hose nozzles. His six-year-old daughter — nimble, precocious, ferocious — keeps her eyes peeled for yellow clearance tags, per protocol.
After a few moments in the cramped space, the six-year-old, curious — but mostly in the mood to hear her own voice — asks a question.
Dad, did you know there are eight planets in our solar system?
I did know that. In fact, there used to be nine planets in our solar system, but then the geniuses at NASA canceled Pluto. It’s a dwarf planet now.
What does dwarf mean?
It basically means small, but let’s check the official definition. One sec. Okay, according to Merriam-Webster, the word dwarf, in the context of astronomy, means “a celestial object of comparatively small mass or size.” So, a few years back NASA decided Pluto was too small to be a “real” planet and demoted it.
It is sad. When I was little, Pluto was super easy to remember because it was the smallest planet and also the one farthest away from the sun.
Pluto is also the name of Mickey Mouse’s dog! Like, you know, Pluto the dog — oh he’s so cute!
How’d you learn about the planets anyway?
We talked about them at school.
Nice. During science?
Nope. One day in class somebody asked Miss W. how many planets there were and then Miss W. asked Google and then Google told us the answer. Eight planets.
I see. And how do you think Google knows how many planets reside in our solar system?
Because Google knows everything!
Okay, listen up, this is important. Google, and tech companies like it, including Facebook, Instagram, Twitter, and TikTok, all rely on computer programs called algorithms. Ever heard that word?
Al-go-rhythm. It’s basically a fancy term that means solving a problem by following a very specific set of directions. Google, for instance, is called a search engine and it uses search algorithms to — ostensibly — help people find information on the internet. Its actual business is selling junk ads online.
Ugh — I hate junk ads!
Me too. Unfortunately, the entire internet ecosystem is subsidized by junk ads. People really don’t like paying for content…
Anyway, with Google, you ask a question — How many planets are in the solar system? — and Google follows a very specific set of rules and instructions — which were designed by computer programmers — to come up with an “answer.”
But! — and this is essential — while searching for the answer, Google doesn’t “think.” Google isn’t a person. Google doesn’t have a brain. Or feelings. Or a moral compass. Google doesn’t understand context. Or history. Or present-day social mores or geopolitics.
Google is just a bunch of computer programs, which are designed to solve very specific problems by following very specific directions. Understand?
Oh, I get it! Google is sort of like R2!
Google’s probably more like C-3PO, but, sure, think of it as a very, very unsophisticated droid from Star Wars.
Now, with all that in mind, what do you think happens when you ask Google a question that doesn’t come with a simple, straightforward answer?
Like, what is the meaning of life? Or, what’s the best movie of all time? Or, which global power was most responsible for starting World War I?
Or how about this? What if you ask Google a question that does have a simple, straightforward answer, but which people disagree about because that’s just how people are? Like, for example, do vaccines cause autism? Or, do corporate tax cuts lead to increased business investment? What do you think happens then?
Maybe Google gets confused?
It sure does! And when Google gets confused, society collapses. Hashtag twenty-twenty-two.
See, Google doesn’t “understand” how to answer complex, subjective, and/or philosophical questions, so in those cases its algorithms default to showing people things they’re already familiar with, or they already like, such as their favorite news sites, blogs, videos, or social media accounts.
That creates what’s called an “echo chamber,” which in turn perpetuates what’s called “confirmation bias.” That’s when people double down on their pre-existing beliefs and become less amenable to new information or ideas. Since Google is by far the dominant search engine, anyone who uses the internet is exposed to these types of algorithm-driven biases.
And if all that wasn’t bad enough, these types of perverse effects go supernova on so-called social networks like Facebook and Twitter. You’ve seen Mommy and me use Twitter before, right?
Is that the place where you turn your stories into junk ads but then nobody clicks on them?
That’s the one.
Now consider this. A few weeks after the horrific massacre at Robb Elementary School in Uvalde, Texas, the hashtag SHALLNOTBEINFRINGED trended on Twitter. It was bad faith, repulsive, disgusting, disheartening, and so much worse. But Twitter’s content-ranking algorithm thought all of it was great and went into overdrive promoting stuff related to it.
See, Twitter’s algorithm didn’t make a moral judgment about whether amplifying those tweets or that hashtag was right or wrong. It just calculated that a large number of people — and bots — “engaged” with that type of content. The algorithm then further promoted that nonsense, which meant more people saw it, then they subsequently liked and retweeted it — et cetera, et cetera — which eventually created a deleterious feedback loop of sheer lunacy and utter insanity.
This exact type of process is how we got a bunch of misinformed dipshits to storm the Capitol Building last January, why millions of people think the Covid-19 vaccine has a 5G microchip in it, why Q-Anon exists, and how corporations and universities entrench racial discrimination into their hiring and admissions processes. It’s also why Ron DeSantis will become president in 2024. The formal scientific term is garbage in, garbage out.
Oh, so you mean, like, if you were to put garbage toppings on your pizza, then you’d be eating garbage pizza?
Exactly! The insidiousness of these garbage algorithms can be subtle too. For example, in Apple News, I got caught up reading way too many articles in The Atlantic about the culture war, and about which of the dumb, orange president’s former cronies would be the latest to turn on him. Because I read so many stupid articles like that, Apple News’ suggestion algorithm said: Well, since you liked those so much, you’ll surely love these!
And thus I’d fallen into the trap. While Russia was preparing to shell Kiev, Apple News was showing me Reddit fan theories about what to expect from the third season of The Mandalorian. Somehow, while reading more than two hours of “news” per day, I had no idea World War III was imminent. Apple’s suggestion algorithms literally made me a less informed global citizen. Not great, huh?
Not great. At. ALL. But Dad, can’t you battle back against the algorithms?
It’s not easy. The most important step is remembering algorithms are all around us and influencing nearly everything we see or do online. Once I realized my news diet had been corrupted, I resubscribed to The Economist. I now force myself to read [almost] every article each week so I have a more informed, holistic view of the world.
Beyond that, we just have to stay vigilant, disciplined, and mentally frosty at all times.
What does vi-juh—
It means we should try to be more like Master Yoda.
Oh, right! I love Master Yoda! I wish more people were like Master Yoda.
Indeed. Alright, let’s finish up and head home. The last episode of Obi-Wan is supposed to be epic. I read about it on Twitter.
Something absurd: I mostly listen to hip hop music (circa 1991-2002), but during the sweaty summer months my inner Florida Man resurfaces and demands classic rock. To get my fix I visited the Classic Rock Essentials playlist in Apple Music. Hendrix. Heart. Floyd. Rush. Perfect.
Springsteen. Petty. Sure, fine — whatever.
Mötley Crüe? Incorrect.
Nirvana? No — make it stop. Soundgarden?! Stone Temple Pilots?!?!?!?!
Look, I’ll willingly traverse my inexorable path to irrelevance and death. But I won’t accept “The Chain” and “Master of Puppets” in the same playlist. Neither should you.
Never leave the boat: During last week’s annual pilgrimage to Joplin, Missouri to visit in-laws, we rented inflatable rafts and, despite the ungodly temperatures, ventured off on a “river float.”
We suffered the following casualties — adult foot (wounded), child’s finger (wounded), straw hat (KIA), tinted safety glasses (MIA), Apple Watch (MIA, presumed KIA), marriage (wounded) — plus, according to the Missouri Department of Public Health, there’s a non-zero chance my brain is infected with a flesh-eating amoeba.
For memories that will last a lifetime, it was all worth it.
Webpage updates: While these posts magically appear in your inbox every week, Field Research also exists as its very own standalone website. You can visit and find the entire archive here.
Two quick updates. First, this series, lovingly called “The talks,” now exists as its own section (see image below)!
I love this device and will likely write one whenever I’m done explaining something wildly age-inappropriate to one or both of my children. We’ve already discussed nuclear weapons, gentrification, and mass shootings. We’re currently talking about inflation and representative democracy. Stay tuned.
Second, Advanced Sabermetrics and The Invisible Hand have revealed my “Failed novel scenes” aren’t particularly interesting to the vast majority of you lovely subscribers (it’s a failed novel for a reason). In the future, instead of spamming your inbox, I’ll publish those pieces directly to my webpage under the shiny new “Fiction” tab. I’ll drop a link whenever a new piece becomes available.
Next Friday: Working up something very on brand.
Thanks for being here. I’m having a blast writing and hope you’re having a blast reading.
Have a great weekend,