On Christmas Day, Netflix is releasing the Black Mirror Christmas special, “White Christmas.” The episode, which first aired last year in the U.K., tells three chilling tales within a single framing story, similar to A Christmas Carol’s formula but with a far bleaker outcome. Like all Black Mirror stories, “White Christmas” is a cautionary tale of familiar technology tested to its limits by some of humanity’s darkest and most complicated behaviors.

“White Christmas” weaves together the implications of several different technologies, all of which are used to manipulate others and alienate ourselves. One scary notion is a real-life blocking feature that works much like “blocking” or unfriending on social media. Social media blocking can be a painfully aggressive personal move that can even lead to retaliation, but it’s hard to imagine how unnerving it would be to be physically blocked by another human.

In the “White Christmas” reality, people can even obtain legal orders to block a person from their lives permanently, and those blocks can extend to the person’s offspring. The legal system also uses blocks to obscure the world from people marked as sex offenders, which not only leaves them alone in the world but stains their blurred forms a scarlet color. In this reality, you can never be unmarked.

Another scary implication of the “White Christmas” reality is the ability to copy a person’s consciousness into data and manipulate that data for information. It seems like a cold, utilitarian move, but it’s made worse by the fact that the copied “you” believes it’s a real person, just like you. Before Jon Hamm’s character finds himself in the snowy cottage with what appears to be his prison mate, he worked a day job “breaking” the copies people made of themselves so they would be happy serving as slaves to their originals. The copies wake up feeling like they are that person, then get tortured with time-warp perception and Jon Hamm’s psychological jiu-jitsu until they’re grateful to be making toast and setting the thermostat for their “real” selves. It’s a cruel scenario, and it raises some very real questions about A.I. ethics.

The implications of infinite time here are even more terrifying than permanent blocking. We can’t endure much tedium. Being made to sit in a white room for six months with nothing to do but churn inside our own heads seems beyond torture and horror. The fact that this is happening to a digital consciousness makes it worse, because there is no corporeal relief for her. She doesn’t have the escape of death, or even sleep.

If a program, or bit of data, feels emotion and pain the same way we do, do we owe it the same degree of dignity and compassion we owe other humans? Of course, considering this, we must be honest about the level of dignity and compassion we actually give other humans in real-life meatspace. Matt Trent (Hamm)’s hobby explores this question a bit. To get his kicks, Trent helps other men pick up women using technology that sees through their eyes and lets him speak into their heads. Trent, the ultimate wingman pick-up artist, orchestrates the man’s every word and action to perfectly play the target woman while a bevy of other dudes watch the spectacle like a sports match. Trent’s last coaching session, however, goes horribly wrong when the woman, who hears voices that make her want to kill herself, catches her new friend talking to Trent. Thinking they’re on the same page about ending it all, she poisons him and herself while Trent and his friends watch.

Of course, this whole conversation has just been Trent doing what he does best: using his power of persuasion to get the information he needs. Part of his tactic is to reveal a bit about himself to get the other person to open up, but this time he has said too much. He has admitted to his unseemly side-hobby while being watched by the police, and he is immediately punished with exile in plain sight. Since blocking silences a person but doesn’t prevent him from doing physical harm, it doesn’t seem like a very safe practice for anyone. Someone blocked in this manner could do a lot of damage; everything has been taken from him. On the other hand, since he’s marked red and has no face, he’s in more danger than the people he encounters. He could be an easy target: a marked person with no identity beyond the stain of his crime. At the end, it looks like the street seller with the snow globe may, in fact, have dark intentions for this lonely stranger.

The snow globe also ties into the fate of the other prisoner’s copied consciousness. While the real man is confined in a prison cell, his copy is stuck in a snow globe-like time prison where it is perpetually Christmas Day, the day of his crime, for at least several million years. He’s trapped in the house where he murdered his deceased ex-lover’s father, a crime of passion committed with a snow globe when he finally realized the child she had wasn’t his.

His actions led the scared little girl to wander out into the snow, where she perished, and his million years of hell include a full view of her small body lying in the snow while the radio plays “I Wish It Could Be Christmas Every Day” on loop. Since we live for 100 years at most, the idea of being conscious for millions is inconceivable. The threat of an infinite hell is such a persuasive manipulation tool because, although we can’t really fathom what it must be like, it is already the worst thing we can conceive of. In this case, the hell isn’t even inflicted on a living person, but on a bit of data trapped in an awful remembrance. Again, questions arise about the capacity of our cruelty toward a consciousness we consider other than, or lesser than, our own. This is where our true capacity for evil springs from: when we see other people as less than fully human, we feel that how we treat them doesn’t matter at all.

The punishments and justice in Black Mirror are always swift, harsh, and merciless. While The Twilight Zone played with supernatural elements and often ludicrous sci-fi scenarios, Black Mirror trades in the horror of very familiar technology, often taken only a few steps beyond where it already is. The trouble is never the technology, of course; it’s how we use it. The devices don’t have the capacity to be monsters. We do.