This essay explains the origins of this site, featuring a last-ditch GOTV effort, an overzealous Facebook algorithm, and the dark future that the social network may steer us toward.
We’re all in a hurry, so here’s the TL;DR.

My aim with this essay is threefold:

  • To under­mine Facebook’s ideal­is­tic appeals to free speech, which it routinely deploys to absolve itself of respon­si­bil­ity for just about every­thing.
  • To high­light the funda­men­tal and recur­ring weak­nesses in Mark Zuckerberg’s logic and lead­er­ship, and why they should scare us.
  • To warn of the murky outlook for Facebook’s core prod­uct — and why that should scare us even more.

The point is not that my dinky lunge into polit­i­cal activism would’ve changed the trajec­tory of the elec­tion (it prob­a­bly would’ve flopped on its own accord). It’s that I tried to engage in our nation’s polit­i­cal discourse during a pivotal moment in its history, and Mark Zucker­berg stole my mojo.

So now I am trolling him, as any patriot would.

All of this is true.

Part I: The Bedwet­ter

When I was little I pissed the bed.

This was normal until it wasn’t, so eventually they strapped a little blue box next to my head with jellyfish electrodes strung to my pajama crotch. The blue box wailed whenever I fell into a deep enough slumber to heed nature’s call. It explains a lot.

Many years later I would learn the extent of this trauma — thanks to its repeated and sadomasochistic triggering by the political podcast Keeping It 1600. The show featured several former Obama speechwriters who, amid biting commentary and self-indulgent fart sniffing, regularly chastised listeners horrified by the looming threat of a Trump presidency — dubbing us “bedwetters”.

“They’re probably right,” I thought, remembering damper times. But then the Comey letter hit and the blue box started wailing, until it drove me crazy enough to act.

The idea was simple: Win, then party. Remind young people that when Trump goes down, a lot of them will too. The cathar­sis of his loss will propel them to new highs and mellow vibes. You get it.

Five days and consid­er­able cannabis later millennial.website1 was born: a blatant grab for a few youth­ful heart­strings, in the hopes that one might have a large enough Insta­gram follow­ing to propel it to viral great­ness and save Amer­ica and maybe I’d wind up karaok­ing Kanye with Taylor Swift.

Narra­tor: He did not.

The hard stop for launch was the morn­ing before the elec­tion, when the winds told me there’d be a wave to catch. The site was done. The copy, what­ever. Friends who get big bucks as social media consul­tants gave it the thumbs up — then again, they are also nice. One last joint, a prayer to the gods —

And then:

This is bad. Worse than you think, because of the struc­tural sociopa­thy baked into Face­book Support — more on that later. All attempts to share the site on Face­book and Insta­gram are being met with alarm­ing secu­rity errors; it’s DOA. Evasive maneu­vers ensue.

A direct message to a VP at Insta­gram gets an imme­di­ate response (“Hey I’m in London at an event”). Fair enough, seeing as we have not spoken in five years. Other backchan­nels prove simi­larly rusty. There is promise, but there is also bureau­cracy.

Hours pass. Morn­ing fades to after­noon, the work­day ends on the east coast. The wave I antic­i­pated is really a tsunami; I prob­a­bly would’ve been crushed anyway. But it is going, going, gone.

But fuck that — that damned blue box is roar­ing — so I revert to a previ­ous version of myself and write a threat­en­ing email to Face­book PR.

Two hours later the site is no longer considered a security threat; a D.C.-based Facebook PR rep checks in to confirm as much.

Alas: it is already 10 PM on the east coast. In the mean­time, attempts to share the site on other plat­forms have gone nowhere, and I need a change of pants.

Part II: Free­dom of Screech

It is here that I’ll make a conces­sion few would: the fact that millennial.website was blocked on Novem­ber 7, 2016 is obnox­ious, but under­stand­able.

The expla­na­tion is simple. The ‘.website’ on which you currently stand is a rela­tive newcomer to the domain markets, making its debut in 2014. Spam­mers have abused these domains — an op-ed last summer bemoans .xyz, .website, and the infa­mous .museum as hotbeds for nefar­i­ous actors — and Occam’s Razor suggests that Facebook’s vaunted algo­rithms learned to treat them with a heavy hand.

(Of course, Facebook doesn’t tell you anything when it blocks you, so this is necessarily conjecture.)

It’s impor­tant to recog­nize this point, because it is the sort of logic that Face­book employ­ees will imme­di­ately find refuge in. After all, Face­book is deal­ing with immense volumes of content, much of it shared by sophis­ti­cated bad actors; the amount of vitriol that Face­book success­fully blocks is surely stag­ger­ing. There will inevitably be false posi­tives, and alas, this was one of them.

So: point conceded. Algo­rithms are going to screw up — they are coded by humans, after all. But given that these auto­mated errors are a given, so too should be the mech­a­nisms to remedy them. You know, like being able to tell some­one when the robots go off the rails.

Now we should fast forward to the present, and remem­ber how deeply Mark Zucker­berg and Sheryl Sand­berg believe in free speech. It is the bedrock from which Face­book justi­fies its lais­sez faire approach to the content it distrib­utes — and the reason why it mini­mizes its own culpa­bil­ity in the epidemic of misin­for­ma­tion. “When you cut off speech for one person, you cut off speech for all people,” says Sand­berg. “Free­dom means you don’t have to ask for permis­sion first,” says Zucker­berg.

Which is a good thing, because as of Elec­tion Day 2016 there was no way to ask for permis­sion. Here’s the inscrip­tion on the brick wall I ran into:

Trans­la­tion: go zuck your­self.

This sort of chilled, cloy­ing copy is a Sili­con Valley special­ity. It is also in obvi­ous and complete contrast to the enlight­ened narra­tive Zucker­berg and Sand­berg have adopted as their shield in the ongo­ing discus­sion about Facebook’s role in polic­ing content. If either of them believed in free­dom of expres­sion to the extent they claim, they’d have safe­guards to ensure people aren’t arbi­trar­ily silenced — which certainly entails “review­ing indi­vid­ual reports”.

Doing so would doubt­less entail consid­er­able resources (like hiring people to review reports in a timely manner), but Face­book is mint­ing billions in profit every quar­ter. They have the means, but their convic­tions are second-hand. Better luck next time.

(There are myriad other examples of Facebook’s hypocrisy when it comes to censorship policies. Many of these are shaded by laws and norms, but this one is binary. Either you give people a way to appeal what your mindless, buggy software does, or you don’t.)

It is also imper­a­tive to recog­nize that for the vast major­ity of people who are unjustly smoth­ered, there are no backchan­nels.

Part III: The Gears of Denial

As Mark Zucker­berg stood before Congress —

Oh, right. He’s a coward.

As Facebook’s sleek lawyer stood before Congress this week to provide focus-grouped answers to questions about Russia, he pointed to the many ways the company has already rectified the situation. Political ad transparency is going way up, staffers are being hired by the thousands, and Facebook will make whatever changes are necessary to ensure this never happens again.

That’s because Facebook doesn’t really care about political ads. They’re nice to have, sure — politics validate the company’s global influence and are predictable boons to engagement. But Donald Trump’s and Hillary Clinton’s campaigns spent a combined $81 million on the platform; last quarter alone, Facebook pulled in $10.3 billion in revenue. This is all a drop in the bucket.

Face­book just wants this night­mare to be over so we go back to riding the stock price and Zuckerberg’s dopey visions. Just tell it what to do and move on.

We must not let this happen. The polit­i­cal ads are not the prob­lem — they are merely symp­toms of it. Thank­fully, the media is increas­ingly focus­ing on the truth: that the root of Facebook’s dark­ness lies far deeper. That the real threat lies in ‘organic’ content; the sort that Face­book rewards and syndi­cates to the masses for free. That the game Face­book calls ‘social’ is in fact an exer­cise in silo­ing and ampli­fy­ing our lesser angels, to the detri­ment of soci­ety and our own psyches.

Face­book does all this to keep us hooked. And when we’re hooked, it can show us ads. Billions and billions of dollars worth.

If Congress wants to make a dent, it needs to scru­ti­nize Facebook’s busi­ness model, and the way that busi­ness model guides Mark Zuckerberg’s deci­sions. It needs to do this because Face­book has proven itself either unable or unwill­ing to recog­nize how its actions are warped by its profit motive — and because this tension is only going to get worse.

A telling case study can be found in how Facebook handled the Trending Topics debacle.2

In early 2016 a thinly-sourced arti­cle in Gizmodo reported that Trend­ing Topics were being selected and edited by liberal-lean­ing humans — not the algo­rithms that Face­book had previ­ously claimed to employ. Face­book freaked, begged conser­v­a­tives for forgive­ness, fired the humans, switched to its vaunted algo­rithms, and those algo­rithms promptly distrib­uted obvi­ously-fake stories to millions of people. Oops.

Tough calls are tough calls, and sometimes there are no good solutions. This was not one of those times. Facebook’s choice between “humans” and “algorithms” — as characterized by the press — was a false one. Trending Topics only debuted in 2014. They are hardly core to Facebook’s mission, and the company could’ve easily swapped the product out for whatever widget occupied the same space not so long ago. It’s not as if headline news is tough to come by.

But Face­book had other concerns.

In April 2016 The Infor­ma­tion reported that “orig­i­nal broad­cast shar­ing” on News Feed had declined by double-digits over the preced­ing year. This anodyne term belies its profound impor­tance: it is Facebook’s bread-and-butter, compris­ing the baby photos, humble­brags, and emotional missives that define the company. It is our friends that keep us stuck to Zuck; when they leave, so do we — which puts Facebook’s immensely lucra­tive adver­tis­ing busi­ness at risk.

Put another way: Facebook’s decline in orig­i­nal shar­ing is akin to an all-star athlete suffer­ing chest pains.

As such, these vital signs — collectively referred to as “engagement” — shape the lens through which Mark Zuckerberg surveys his business. Any change that hurts engagement is unlikely to be seen as a priority; any new product is evaluated first and foremost by its social fecundity. This is not a recipe for objective decision making, but it is at least a predictable one.

Trend­ing Topics were intro­duced because they are foun­tains of engage­ment: a person­al­ized tabloid that guar­an­tees you always have some­thing new to get excited or furi­ous about (easily shared, of course) when­ever you visit Face­book. And so it was no surprise that Face­book wasted little time in imple­ment­ing a new, algo­rith­mi­cally-edited version of the feature in lieu of humans — only to imme­di­ately screw the pooch.

Two days after dismiss­ing the editors, a fake news story about Megyn Kelly being fired by Fox News made the Trend­ing list. Next, a 9/11 conspir­acy theory trended. At least five fake stories were promoted by Facebook’s Trend­ing algo­rithm during a recent three-week period analyzed by the Wash­ing­ton Post.

“After that, the 2008 conspiracy post [“Hacked Obama Email Reveals He & Bush Rigged 2008 Election”] trended.” — Buzzfeed

Soon there­after, Face­book VP Fidji Simo would explain that they switched to algo­rithms to facil­i­tate scal­ing the prod­uct glob­ally, while conced­ing the feature “isn’t as good as we want it to be right now.” In the same inter­view, Simo says that Face­book had seen success letting people flag stories on News Feed as ‘mislead­ing’ — but that this func­tion­al­ity was not yet included as part of the new Trend­ing Topics. Remem­ber: this was in the months and weeks imme­di­ately preced­ing the elec­tion, when the stakes couldn’t have been higher.

The ques­tion is: why?

Why not shelve the feature until it’s baked? Why the rush to deploy it glob­ally, where — as Buzzfeed put it — “fail­ures will poten­tially occur at a scale unheard of in the history of human commu­ni­ca­tion”? Why are they in such a hurry to do things they have to imme­di­ately apol­o­gize for?

The answer is almost too mundane to warrant mention: it’s just busi­ness. If it juices engage­ment, it stays. Facebook’s PR team conjures a narra­tive about keep­ing people informed or connected or what­ever, and we are left to deal with the collat­eral damage.

But Mark Zucker­berg doesn’t see it that way. To him, the tension between Facebook’s ideals and busi­ness objec­tives is itself a myth. From a recent sit-down with Bloomberg:

Through­out the inter­view, he seems irri­tated that his actions could be viewed as anything other than expan­sive benev­o­lence.

“We’re in a pretty unique position, and we want to do the most good we can,” he says of Facebook. “There’s this myth in the world that business interests are not aligned with people’s interests. And I think more of the time than people want to admit, that’s not true. I think that they are pretty aligned.3”

We should take him at his word — and run for the hills.

Part IV: Virtu­ally Zucked

Face­book is sort of screwed. I’m not talk­ing about Russia, or Trend­ing Topics, or even Mark Zuckerberg’s naive delu­sions. I mean: if you resur­rected Steve Jobs and set him in the pilot seat, there’s a decent chance he’d throw his hands up and say, “burn it down.”

Because it doesn’t have anywhere to go. Zuckerberg’s grand new vision is to turn Face­book into a garden of vibrant online commu­ni­ties — and there’s some­thing to that. Our soci­ety is indeed want­ing for better social struc­tures and more excuses to come together. The prob­lem is that neither Zucker­berg nor the company he leads has a knack for the kind of funda­men­tal rein­ven­tion this entails, to say noth­ing of its abysmal ethi­cal track record.

Many of Facebook’s biggest prod­ucts have been total flops (Paper, Home, and the laugh­ably over­hyped Messen­ger Bots come to mind). And the Groups initia­tive sounds more desper­ate than prescient: the team was “mostly ignored” until Zuckerberg’s epiphany came in the after­math of the elec­tion. Already, employ­ees are concerned:

“He was promoting Facebook Groups, a product that millions of people on Facebook used to talk about shared interests, debate, discuss and maybe debate some more.

This type of activ­ity, he believed, was one of the keys to his sprawl­ing company’s future. The goal of Face­book, he told his audi­ence, which included many Groups lead­ers, was to “give people the power to build commu­nity and bring the world closer together.”

Inside Mr. Zuckerberg’s company, however, there was already grow­ing concern among employ­ees that some of that content was having the oppo­site effect.” — NYT

In the meantime Facebook is still utterly reliant on the algorithmic slot machine that is News Feed — which is what got us here. Facebook’s business model revolves around displaying content so titillating that it is addictive in every sense of the word, and this translates to affirming our pre-existing beliefs (which foments polarization) and sending us careening on emotional rollercoasters, plus some baby photos. What’s worse: the algorithms that steer these dynamics are entirely opaque, with zero accountability.4

And then there is Virtual Real­ity.

VR remains so underwhelming that it is hard to view with concern — aside from its tendency to induce nausea — and plenty of tech luminaries are quick to dismiss the dystopian futures portended by decades of science fiction. But VR is Mark Zuckerberg’s next big bet, which is reason enough to worry. Early next year the company will begin selling a standalone headset for $200: a fraction of what the competition charges, and a price designed to bring VR to a scale that Wall Street cares about.

Any tech­nol­ogy has its down­sides, and VR is espe­cially precar­i­ous. It will likely be the most addic­tive tech­nol­ogy ever built, with immense poten­tial for abuse. And once again, Zucker­berg is utterly unpre­pared. During his keynote presen­ta­tion at this year’s Game Devel­op­ers Confer­ence, MMO pioneer Raph Koster shared an alarm­ing exchange he had with Facebook’s CEO.

Koster: “What do you think are the social and ethical implications of social virtual reality and connected augmented reality?”

Zucker­berg: “What ethi­cal impli­ca­tions?”

You don’t have to envi­sion a world like The Matrix or Ready Player One to see how this can spiral into dark­ness5. Consider a more mundane scenario, a few years out, as it becomes clear that Face­book has lost its momen­tum and is strug­gling to repeat its success in the devel­op­ing world.

What happens when Face­book is grow­ing desper­ate to offset a declin­ing News Feed, and comes to see the hundred million people jacked into its head­sets as a resource to be tapped? Is there reason to believe it will show restraint in the abun­dance of adver­tis­ing it immerses us in? Will it facil­i­tate the exploita­tion of the most heav­ily addicted — as it did in the glory days of FarmVille — by induc­ing them to spend endless hours wander­ing its virtual hall­ways? Are we sure we want to give such a high-fidelity link to our neocor­texes to a man who only needed two dimen­sions to under­mine democ­racy?

And finally: what checks are there against Mark Zuckerberg’s infa­mous “ruth­less survival instinct”?

That is the scari­est thing about all this. We have yet to see Mark Zucker­berg when his company is truly in trou­ble. Face­book has never had a viable competi­tor; only maybe-some­days, and those days are over. This week’s Congres­sional testi­mony repre­sents its gravest threat ever, and Zucker­berg did not even bother to show up.

Face­book has hurt us simply by winning. What happens when it starts to lose?

Part V: Epilogue

An essay like this one is about as effec­tive as chimps fling­ing poo at their zoo handlers — and simi­larly reward­ing. But do not mistake it for progress. Real change means forc­ing Face­book to evolve beyond PR-driven band-aids. It means giving researchers, lawmak­ers, and citi­zens a chance — to under­stand how their minds are being shaped by these algo­rithms, and to have a say in those dynam­ics. It means a funda­men­tal change in Facebook’s inter­nal culture, which is hard to imag­ine under its current lead­er­ship.

But that stuff’s hard. So why not buy a vintage soft T-Shirt?



Oh, and that GOTV thing? Maybe it worked a little bit.

Turns out Face­book ads work pretty well when they don’t perceive you as a secu­rity threat (there’s a reason Face­book rakes in so much money). In fact, they worked well enough to wake the trolls — who tried reset­ting my Face­book pass­word twenty times in one hour on elec­tion day. Funny what gets by those secu­rity algo­rithms.

All told the ads reached around 12,350 people in Penn­syl­va­nia, Florida, North Carolina, and Nevada.


Further Read­ing

Time Well Spent — This non-profit move­ment, led by former Google ethi­cist Tris­tan Harris, is push­ing the tech­nol­ogy indus­try toward prac­tices and busi­ness models that are not in constant battle with the well­be­ing of soci­ety and our mental health. Even­tu­ally Apple will real­ize that it is uniquely posi­tioned to capi­tal­ize on this (it sells hard­ware, not our minds), and other firms will be forced to compete on their prod­ucts’ human­ity, not just megapix­els. 6 The key is to force these firms to actu­ally change, not just come up with aspi­ra­tional TV commer­cials.

Profes­sor Zeynep Tufekci — A soci­ol­o­gist who is beyond tech-savvy; she can explain these algo­rithms and their pitfalls better than the people build­ing them. Watch this TED talk, follow her on Twit­ter, and read this exchange she had with a Face­book exec­u­tive on the dynam­ics of News Feed.

The Atten­tion Merchants / Tim Wu — Wu is a law profes­sor at Colum­bia; his book charts the rise of modern adver­tis­ing (which is not as old as you think it is, and has unnerv­ing roots in war propa­ganda) and the dynam­ics of mass media up to the present day. Don’t miss his acad­e­mic paper on how a new approach to anti-trust could be applied to compa­nies like Face­book. Also see Wu’s advice to Face­book this week in the New York Times, where he encour­ages Face­book to become a Public Bene­fit Corpo­ra­tion.

You Are the Prod­uct — John Lanches­ter for The London Review of Books. A sweep­ing take­down of Face­book, more sharply writ­ten than any tech reporter could ever muster (because they have to stay on speak­ing terms with Face­book PR).


Jason Kincaid covered Face­book as a senior writer at TechCrunch from 2008-12. He coau­thored a book about the firm, wrote the first arti­cle detail­ing the algo­rithms behind Face­book News Feed, and was once punked by Facebook’s engi­neers. He played himself on HBO’s Sili­con Valley and is currently dining on quail.

  1. This link goes to usa.techf.wpengine.com, which is a clone of this site as it stood on Elec­tion Day 2016.
  2. The feature is neither innovative nor impressive: just a handful of newsy links that people are inclined to click on, no matter how dull their friends are. It resides in some of the most valuable real estate on the internet — every Facebook homepage — so everyone sees it.
  3. I am reminded of a passage from Kurt Vonnegut’s Mother Night (condensed for clarity). “I have never seen a more sublime demonstration of the totalitarian mind, a mind which might be likened unto a system of gears whose teeth have been filed off at random. The dismaying thing about the classic totalitarian mind is that any given gear, though mutilated, will have at its circumference unbroken sequences of teeth that are immaculately maintained, that are exquisitely machined. The missing teeth, of course, are simple, obvious truths, truths available and comprehensible even to ten-year-olds, in most cases. The willful filing off of gear teeth, the willful doing without certain obvious pieces of information —
    That is the clos­est I can come to explain­ing the legions, the nations of lunatics I’ve seen in my time.”

  4. Even this corrosive recipe is losing its potency, as evidenced by the aforementioned decline in original sharing. Facebook is now testing a bifurcated News Feed that places renewed emphasis on the content shared by our friends, rather than the media organizations we’ve subscribed to. That sounds promising at first blush, but it may compound Facebook’s silo effect, further stratifying society into groups of people who agree with each other. The fact that Facebook is testing this system at all — after spending years courting those very media organizations, who are now terrified — is indicative of the gravity of the situation. Again: it is our friends that keep us glued to Facebook. Those chest pains must be flaring.
  5. But maybe you should: the legendary programmer John Carmack, who is CTO of Facebook’s VR company Oculus, is on record saying, “We probably are heading more on the path of The Matrix than we are to Mars.”
  6. Predic­tion: Andy Rubin’s company Essen­tial will try to do this for Android.
