2801 - Ho Ho Hoh Shit

S02E08 - Xmas Story
"If he catches you after dark, he'll chop off your head and stuff your neck full of toys from his sack of horrors!" - Amy Wong

At the beginning of the 29th Century, the Friendly Robot Company will create a robotic Santa Claus, designed to judge us all as naughty or nice and distribute presents accordingly. If you're anything like me, one question immediately comes to mind when presented with this concept: "fucking why?" I'm gonna go ahead and warn anyone under the age of 10 to stop reading right now, because this discussion is gonna get a little too real for you. Obviously creating a robotic Santa is pointless because Santa already exists and is awesome, but for the sake of argument, let's talk further once the children go to bed.

They gone? OK, so Santa doesn't exist and Christmas is mostly a manufactured concept of capitalism to sell leftover fall merchandise, tacky decorations, and (until 2200 anyway) pine trees. We're all on board with this knowledge, right? If not, I apologize to any adults who still thought joy was real. So if the concept of a milk-and-cookie-fueled old man jumping down your chimney to deliver unsolicited gifts is just a fairy tale, what in the hell is the point of designing a robot to actually do just that? Where is he getting the presents? Did the Friendly Robot Company also build him a Death Fortress on Neptune complete with slave labor, or did he do that himself? Let's just be blunt here: what was the Company's business angle? Surely the cost of manufacturing and delivering a planet's worth of gifts far outstrips any profit they would gain from...I don't know, merchandising? Clearly my dullard 21st Century mind is too primitive to understand the machinations of the mighty corporate monstrosities of the future.

Anyway, a programming error causes Robo-Santa to judge everyone as naughty, and he goes on a horrifying murder spree every 25th of December. As an aside, I truly enjoy the picture Mr. Groening paints of society and the way it has just sort of accepted these deadly rampages after 200 years of enduring them. Citizens of New New York sing gay carols of surviving the onslaught, decorate with armor plating, and huddle together in heartwarming familial fear. For every grim warning Mr. Groening gives us, it's nice to know that he believes in our ability to withstand whatever batshit crazy punishments our future will throw at us. This episode truly speaks to humanity's ability to bond through hard times, and the sentiment is only slightly undercut when one wonders why no one ever bothered to sic the Earth's military on this single pun-spewing murderbot.

Accuracy: TBD

As with many of Futurama's prophecies, we're probably right not to take every aspect of Santa Claus the serial killer too literally. Conceptually, Santa represents one of the series' best examples of artificial intelligence gone wrong. Even today, the top minds in the technology space are either barely able to contain their excitement at the prospect of intelligent machines or ominously warning us of the day we'll all be slaves to a race of metal overlords of our own creation.

Santa's cheeky "programming error" could simply be a shorthand for what might happen if we entrust too much power to computers. Software development can be tricky at the best of times, and it's worth considering that AI is only as good as the mistake-prone humans programming it. Bugs in code are bad enough when it just ruins the formatting on your blog; imagine a bug in your understanding of basic morality.

By abstracting the concept of a rogue AI onto a fantastical character like Santa Claus, Mr. Groening is able to issue us a coded warning in a way that doesn't conflict with our current understanding of AI applications. Our doom may not literally arrive in the form of a TOW missile, but that doesn't mean we shouldn't be wary when peeking into the sack of potential horrors that is general artificial intelligence. Our aim is true; the dream is to delegate more and more tasks to technology, as we've always done. However, once we pass the point at which our technology can improve and learn without us, we risk losing control of the consequences. People are hard enough to control; now imagine one with far greater thinking power and a "programming error" or two.