Is Facebook a structural threat to free society? I make the argument.
Facebook is the sixth-largest company in the world by market cap. It is approaching two billion users across its platforms, and user growth remains steady. It collects an unprecedented amount of data on those billions of users.
It is possible, if not probable, that Mark Zuckerberg’s company will become the largest in the world. Facebook’s share structure reserves exclusive control of voting power for Zuck, so he will maintain control of the behemoth. It is also not out of the question that as Facebook grows, Zuckerberg will become the world’s wealthiest individual.
As Facebook grows, so will its ownership of the social graph and our digital selves. Systemic risk is highest in centralized systems. Extrapolating trends, I consider it possible, if not probable, that Facebook will become a systemic risk center for free society. The argument goes:
- Facebook engages in comprehensive and growing data collection on its billions of users
- This data allows for exponentially greater manipulation of human beings and their realities than ever seen before
- Facebook is building towards human simulation and ownership
- Extrapolating trends, Facebook will create unprecedented centralization of power and influence in the hands of an individual
- Without a change of course, we are enabling a structural threat to free society, and potentially worse
Facebook already knows more about its users than they know about themselves. Facebook’s ability to collect data is rapidly growing, while its willingness to do so remains absolute. Facebook’s enormous, comprehensive, and individual-level data is its current product, and will be the foundation for its future ambitions.
Facebook receives highly valuable personal data voluntarily submitted at signup. Users willingly divulge:
- Real Names
- Employment (and income)
- Location and residence
- Where they travel to
- Social links (family, partners, friends, acquaintances)
- Hobbies & activities
- Preferences (“likes”) for brands, products, political parties, foods, entertainment, celebrities, etc.
This forms the core of the Facebook profile.
Facebook’s ubiquity enables it to collect an unrivalled amount of behavioral data to augment the personal profile. It tracks, or theoretically can track:
- Every site you visit:
The Facebook “like” button is on practically every page. Even this one. If you’re logged in to Facebook, which most users are, Facebook can collect this data.
- Every purchase you make:
Facebook can work with third-party data providers to know your purchase history. Matching datasets is easy with real names. Facebook also tracks sales made through its own platform and purchases almost made but not completed. You might also have noticed a “share your purchase with your friends” widget becoming more common on order confirmation pages – this can also be tracked.
- Locations you visit:
In addition to self-reported residence and travel locations, Facebook augments and verifies location data using location tracking in its various mobile apps.
Facebook does not need you to interact with it to collect data on you. And when you do interact with it, it can infer even more from what you provide than you think possible.
- “Shadow” profiles:
Facebook builds shadow profiles for users not in the system, storing their names, addresses, contact information, and more. This is provided by existing users through the “find my friends” system, and can be augmented through web crawling.
- Your image:
Facebook scans the billions of image uploads it receives to create facial and bodily profiles for all users. This is how Facebook suggests friends to tag in uploaded images. But it is also used to augment shadow profiles – so Facebook knows the personal info and biometric templates for users not even signed up. Facebook can also scan photos of public places – taken by tourists, for example – and identify its users and “shadow” users in the background. Photos contain geolocation information, so those background individuals’ location is tracked without their consent. Neural networks (think Google’s powerful image recognition software) can also infer the location of a photograph even if there is no location metadata.
- Your emotions:
Natural language processing allows Facebook to understand your emotions. Facebook understands the emotions expressed in what you type as statuses, and in messages via Messenger [edit: WhatsApp is encrypted]. They even give you the option of telling them your emotion directly as you post a status, refining their algorithm. They also know the sentiment of content you engage with, using the same algorithms. So they know if you engage with more positive or negative content, and also how each of those content types makes you feel based on your response. The “like” button was redesigned as “reactions”. By giving you more opportunity to express your emotions, Facebook has also created more opportunity to track them. The same neural networks that identify your face in every photo can also determine your emotional state as the picture was taken.
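To make the sentiment-tracking idea concrete, here is a toy lexicon-based scorer. This is a deliberately crude sketch: Facebook’s actual NLP models are far more sophisticated and undisclosed, and the word lists and scoring rule below are invented purely for illustration.

```python
# Toy lexicon-based sentiment scoring -- a crude stand-in for the far more
# powerful models a platform could run over statuses and messages.
# These word lists are invented for illustration.
POSITIVE = {"love", "great", "happy", "excited", "awesome"}
NEGATIVE = {"hate", "sad", "angry", "terrible", "awful"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: -1 fully negative, +1 fully positive."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(1 for w in words if w in POSITIVE)
    neg = sum(1 for w in words if w in NEGATIVE)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment_score("I love this, it's awesome!"))   # 1.0
print(sentiment_score("I hate Mondays, so terrible"))  # -1.0
```

Run the same function over everything a user types and engages with, and you have a longitudinal emotional record — which is the capability described above, just at a vastly higher level of sophistication.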
I don’t want to hammer the data point too much, but it is important to show just how much data Facebook has. If it interests you, privacy advocates have written thousands of words on the subject.
The important takeaway is this: Facebook has access to so much data about individuals that it’s beyond human comprehension. As AI technology improves, the ability to extract this information will only increase.
Manipulation Of People And Their Realities
With access to such a large dataset, Facebook has a unique opportunity to manipulate its users and their realities. This enables both a deeper understanding of its users (more data to feed the machine), and the power that comes with influence.
Remember that Facebook feeds are not chronological. Facebook decides what posts you see, and who sees your posts.
Research Into The Human Psyche
I’d like you to seriously consider the idea that Facebook has a greater ability to understand the human psyche than every psychologist, philosopher, cognitive scientist, and behavioral economist in human history combined. If this shocks you, allow me to explain.
Academic and practical work on the human psyche tends to have sample sizes measured in the hundreds. I’ve read my fair share of studies, and I don’t think I’ve seen an n larger than a thousand. Ten thousand at a stretch. On top of that, you have the usual sample selection problems. Study subjects tend to be desperate – for class credit (university students) or money (the rest of them). Most people aren’t study subjects and never will be.
Most people are Facebook users, though. At least the wealthy ones, on a global scale. And an n of 2 billion and growing is literally orders of magnitude larger than any psychological study, ever.
Facebook knows this, which is why it has Facebook Research. This division is the most powerful psychological experimentation center in human history. It has 2 billion study subjects and more data to understand those subjects than most researchers can dream of.
Manipulation Of People For Profit
Facebook needs users to engage with the News Feed to sell ads. With Facebook’s current motivations, there is a hundred-billion-dollar-plus incentive to keep users engaging with the feed. Facebook manipulates its users’ psyches to keep them engaged, driving profits. The mental state it cultivates is what researchers of gambling design call the “machine zone”:
What is the machine zone? It’s a rhythm. It’s a response to a fine-tuned feedback loop. It’s a powerful space-time distortion. You hit a button. Something happens. You hit it again. Something similar, but not exactly the same happens. Maybe you win, maybe you don’t. Repeat. Repeat. Repeat. Repeat. Repeat. It’s the pleasure of the repeat, the security of the loop.
That glazed-over look a grandma has at a Vegas slot machine is the same look Facebook chases in its users scrolling the feed. And it’s created through manipulation.
Take a hypothetical Facebook user, Jimmy. Jimmy tends to scroll Facebook on his phone for 5 minutes, then close the app. Facebook knows that Jimmy likes posts from his friend Steve, because he tends to leave positive comments on them. Next time Jimmy reaches the 5-minute mark, Facebook shows him a post from Steve. Jimmy leaves a comment, scrolls for a bit longer, then quits the app at the 8-minute mark. Facebook also knows that Jimmy is crushing on Jenny – he tends to linger whenever she shows up on his Instagram feed, even if he doesn’t like or comment. Next time Jimmy’s at the 8-minute mark, Facebook shows him a post from Jenny. This pattern continues, and Jimmy’s Facebook use grows from 5 minutes a day to 15.
Facebook Research has learned, hypothetically, that you can only keep people in the Machine Zone with positive content for so long. So the algorithm starts to use negative content too, now and then. Jimmy hates Chad, and Facebook knows this because Jimmy says so in his Messenger conversations. So nowadays, when Jimmy reaches the 15-minute mark, Facebook will show him a picture of Chad doing something fun. The rage drives Jimmy to keep scrolling until he gets that dopamine hit from a Steve or Jenny post.
Now Facebook is growing Jimmy’s time spent every day, and their ad dollars grow with it. All through manipulation of Jimmy’s psyche.
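The Jimmy hypothetical can be sketched as a simple scheduling rule: as the user approaches his usual quit point, surface the post with the strongest predicted emotional pull. Everything here — the `Post` type, the `predicted_pull` scores, the one-minute window — is invented for illustration; real feed ranking is vastly more complex and not public.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    predicted_pull: float  # how strongly this user reacts to this author

def next_post(queue, minutes_in_session, typical_quit_minute):
    """Near the user's predicted quit point, surface the highest-pull post;
    otherwise serve the queue in its normal order."""
    if minutes_in_session >= typical_quit_minute - 1:
        return max(queue, key=lambda p: p.predicted_pull)
    return queue[0]

queue = [Post("Randomguy", 0.1), Post("Steve", 0.8), Post("Jenny", 0.9)]
print(next_post(queue, 5, 5).author)  # near quit point -> "Jenny"
print(next_post(queue, 1, 5).author)  # early in session -> "Randomguy"
```

The design insight is that the intervention only needs to fire at the moment of highest churn risk — the rest of the session can run on autopilot.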
Manipulation Of People For Research
Facebook’s manipulation isn’t done only to increase time spent and profit. Facebook Research is also able to manipulate people to understand the human psyche.
We looked at how Facebook tracks emotion. A controversial paper by Facebook researchers, published in 2014, manipulated the emotional valence of users’ News Feeds to see how it would affect the emotional states of those users. The researchers found:
When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.
Again, remember that Facebook chooses what to show you. They know the emotional content of posts on your feed, and how your emotions react to them. They can, and have, manipulated users accordingly.
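The core intervention in that experiment — withholding a fraction of posts of one emotional valence from a user’s feed — takes only a few lines to express. The valence labels, omission rate, and seed below are illustrative, not the study’s actual parameters.

```python
import random

def filter_feed(posts, suppress="positive", omission_rate=0.5, seed=0):
    """Drop roughly `omission_rate` of posts with the targeted valence;
    posts of the other valence always get through."""
    rng = random.Random(seed)
    kept = []
    for post in posts:
        if post["valence"] == suppress and rng.random() < omission_rate:
            continue  # withheld from this user's feed
        kept.append(post)
    return kept

feed = [{"id": i, "valence": "positive" if i % 2 else "negative"}
        for i in range(10)]
filtered = filter_feed(feed)
print(len([p for p in filtered if p["valence"] == "positive"]))
```

Measure the emotional tone of what each filtered user posts afterwards, compare against a control group, and you have the contagion experiment.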
How else could Facebook manipulate you for research? Let’s take our hypothetical Facebook users from before: Jimmy, his friend Steve, and his crush Jenny. Is there anything Facebook could show Jimmy and Steve that could cause a rift between them? Perhaps they could stop showing Jimmy and Steve each others’ posts. Or only show negative ones. Is there anything Facebook could do to bring Jimmy and Jenny into a relationship? Perhaps they could only show positive images of Jimmy on Jenny’s feed, to give him social proof and the appearance of status.
Even if Facebook doesn’t directly intervene to cause such outcomes, they have all the data required to find patterns among other friendships that do break down, and other crushes that do end in a relationship. You’ll find connections very quickly with 2 billion people to monitor.
Manipulation Of People’s Actions
Facebook has the power not only to manipulate your emotions, but how you act. I’m not claiming they use it (yet), but the power exists.
We are social animals and love approval from others. Every like, comment, and reaction causes a little dopamine rush in our heads. Social media engagements form a feedback mechanism in our neurological pathways: we’re more likely to continue posting things that get us engagements, and less likely to post those things that don’t.
Facebook chooses who sees your posts. If you post something that they don’t like, they can hide it from other users’ feeds. If you post something they do like, they’ll spread it across many feeds, and you’ll get more likes and engagements than before. By choosing who sees what you post, Facebook chooses what you post over the long run.
Ever notice how you’ll get a notification when an Instagram friend posts a photo for the first time in a while? That drives users to the photo, driving more likes and engagements, encouraging your friend to become a more regular user. Of course, their subsequent posts won’t reach that engagement total, since the notification won’t be there.
If Facebook, say, wanted you to drink more alcohol, they could signal-boost photos of you at a bar, putting them on many people’s feeds and driving more likes. Your brain would subconsciously drive you to go to bars more, over the long run.
If Facebook, say, wanted you not to express your support for controversial political parties, they could stifle your posts in support of their ideas, limiting the engagement numbers. Meanwhile, they’d signal-boost your posts that are sympathetic to conformist political views, and you’d gravitate to that type of thinking over time. Sound crazy? They’re already monitoring your political preferences, with a high degree of accuracy. They’ve already run research studies to manipulate how people feel – why not how they think?
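Both hypotheticals rest on the same feedback loop: your propensity to post about a topic drifts toward whatever the platform chooses to boost. A toy model makes the dynamic visible — the linear update rule and all the numbers here are invented for illustration, not a claim about any real ranking system.

```python
def update_propensity(propensity, engagement, baseline=10, rate=0.05):
    """Nudge posting propensity up when engagement beats the user's
    baseline, down when it falls short (clamped to [0, 1])."""
    delta = rate * (engagement - baseline) / baseline
    return max(0.0, min(1.0, propensity + delta))

p = 0.5  # initial propensity to post about topic X
for _ in range(20):  # platform signal-boosts topic X: ~30 likes per post
    p = update_propensity(p, engagement=30)
print(p)  # propensity saturates at 1.0

q = 0.5  # initial propensity to post about topic Y
for _ in range(20):  # platform suppresses topic Y: ~2 likes per post
    q = update_propensity(q, engagement=2)
print(q)  # propensity decays to 0.0
```

The user never sees the ranking decision, only the like counts — which is exactly why the shaping is invisible from the inside.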
By manipulating how you feel, what you think is socially acceptable, and how you act, Facebook has the ability to manipulate your reality. I don’t know if they’re doing this, or whether they plan to. But the power is there, and they’ve shown they’re willing to use it in certain circumstances.
The Research Feeds The Profile
The combination of vast personal data and experimental capability gives Facebook the ability to make a near-complete model of your psyche. Facebook knows more about you than you know about yourself, and they can test how you react to things. After testing on you, they can combine the results with those from others in your demographic profile, to see how whole populations react to things.
It is within Facebook’s power to know the following:
- Are married people in Iowa more lustful than those in other states?
- Are you more likely to buy beer after being exposed to patriotic imagery?
- Are Pakistani-Americans more interested in rap music than Pakistanis in Pakistan?
- Are Japanese males more attracted to faces, or to asses?
- What News Feed content, and content timing, is most likely to get you to visit the Post Office?
- Are conservatives (or liberals) more extreme in their private opinions than their public ones?
- How can we make people more extreme politically? How can we make people more centrist politically?
All of this data and research can, and does, feed back into Facebook’s understanding of you and your demographic. This information can be used to build a psychological profile of you, and to predict how you will react to various stimuli. Those stimuli can then be introduced at will, pushing you toward chosen emotions, actions, and outcomes.
Human Simulation & Virtual Reality
Facebook has all the tools necessary to create better human simulations than anything before seen. Aral Balkan goes as far as to argue this is their current business model:
Facebook isn’t a social network, it is a scanner that digitises human beings. It is, for all intents and purposes, the camera that captures your soul. Facebook’s business is to simulate you and to own and control your simulation, thereby owning and controlling you. [Emphasis in original]
Moving From Manipulation To Simulation
Facebook’s user manipulation, detailed above, is both ethically questionable and terrible PR. What if you could get the same results without actually testing on users?
Enter Facebook’s AI Research Division (FAIR). Their mission statement claims “objectives of understanding intelligence and building intelligent machines”, by “deriving knowledge from data”.
Here are some choice quotations from this insightful and/or terrifying article on FAIR:
It’s recreating a virtual memory of reality, and clustering it in the context of other places and events. It can even “virtually represent a person,” based on their previous likes, interests, and digital experiences. This is somewhat experimental, but has great implications for Facebook’s News Feed…
“If we have an idea that actually works, within a month it can be in front of 1.5 billion people,” LeCun said. [Yann LeCun is an AI pioneer, and Director of FAIR.]
Facebook is openly and proudly building the capability to simulate the human psyche. When such capacity is sufficiently advanced, there will be no need to test on actual users. Those users can be used only to verify results from the simulation.
Oculus & VR
In 2014, Facebook acquired Oculus, the leading consumer VR brand, for $2 billion. In his note about the acquisition, Zuckerberg claimed:
The incredible thing about the technology is that you feel like you’re actually present in another place with other people… One day, we believe this kind of immersive, augmented reality will become a part of daily life for billions of people.
Zuckerberg wants billions of people using his VR tech daily, engaging in a simulated reality with convincing representations of other people.
It’s not hard to see the link with the AI-fueled human simulation efforts. Simulated beings created from vast data collection will plug right in to Oculus’ virtual realities.
Virtual reality will create even better opportunities for Facebook to stimulate, manipulate, and track its users. Some possibilities:
- Eye and pupil tracking:
Eye and pupil tracking is a necessary component of a superior VR experience, and will solve many of the problems users currently encounter. Facebook and Oculus are working to include eye tracking. Now consider: your eye and pupil movements reveal how you’re thinking, and can be manipulated to influence how you’re thinking (source). Eye and pupil movement reveal how certain you are in your decisions, whether or not you’re lying, whether you’re thinking of a large or a small number, and your desires. There is evidence to show that manipulating what your eyes focus on influences your moral judgment. Facebook has proven able and willing to manipulate what users see and capture their reactions. Oculus with pupil tracking will allow it to manipulate what users experience and guess at their thoughts.
- Biometric tracking:
There are already VR games that are measuring and using biometric data including “EEG (brainwaves), GSR (stress levels), heart rate, and breathing”. While there’s no evidence that Oculus is currently developing attachments to track these variables, it’s not out of the question. And once you’ve got people wearing a dorky headset and holding a controller in each hand, what’s a wrist monitor? Biometric tracking could even be included in future versions of the controllers or headset. Biometric tracking would allow Facebook to collect stimulus-response data for your bodily functions, completing the picture. Their data and eye tracking would simulate and track your mind, while biometrics would simulate and track your physical response.
In summary: Facebook has the tools and incentives to not only simulate your psyche, but to manipulate and simulate your entire being as their platform merges with Oculus VR.
Centralization Of Power
We covered how Zuckerberg has exclusive voting control over Facebook, due to its share structure. In effect, he can choose what Facebook does, with no checks or balances.
What are Zuckerberg’s motivations? Can we trust him with the power he will wield? Could we trust anyone in his position?
In February, Mark Zuckerberg published a manifesto entitled “Building Global Community”. I’m not going to critique Zuckerberg’s politics here, but I will note that he seemingly takes his own political preferences as obvious goals for the entire planet.
Whether or not Zuckerberg is planning a run for US President, it is clear enough that he has political ambitions in some form. He has palled around with Angela Merkel at the UN. He is touring all 50 US states in 2017, to understand why there is “a greater sense of division than I have felt in my lifetime”.
These are not the actions of a man content to sit on the sidelines, amassing his billions. Zuckerberg is making a concerted effort to move himself into the national political spotlight.
Imagine a politician with:
- The tools to “focus group” messaging and policy – not on small groups of 100 Pennsylvanians or Iowans, but on 2 billion global users
- The tools to directly influence the information available to voters
- The tools to measure and manipulate the political preferences of the electorate
Now imagine that same person is also the richest man in the world. Whether it’s money, persuasion, “fake news”, or anything else driving politics – Zuckerberg will have access to it all. Not a rosy picture, is it?
For your 2020 vision of 1984’s boot stamping on a human face forever, take a look at the hundreds of dupes wearing Oculus headsets in this post’s header image.
Zuckerberg himself doesn’t have to run for office to wield this power. His dorky Harvard aura might not translate to votes. But the availability of this power to influence politics creates the power to influence policy anyway.
If you’re a Trump hater, this might boil your blood: Facebook’s own case study boasts that by using “the best content to influence voters”, they increased voter intent by ~10–20 points among Senator Pat Toomey’s target audience in the 2016 election. Toomey is a Republican who shared the ballot with Trump. Trump won Pennsylvania by just 44,000 votes.
Zuckerberg, The Man
Are we willing to trust one man with:
- The largest share of wealth on the planet?
- The biggest trove of private data ever assembled?
- The greatest control over information flow ever seen?
- The biggest psychological research facility in history?
- The most significant influence machine ever?
- All five?
Zuckerberg is human. As the saying goes, “power tends to corrupt, and absolute power corrupts absolutely”. Don’t forget this is the man who gave us this gem:
Zuck: Yeah so if you ever need info about anyone at Harvard
Zuck: Just ask
Zuck: I have over 4,000 emails, pictures, addresses, SNS
[Redacted Friend’s Name]: What? How’d you manage that one?
Zuck: People just submitted it.
Zuck: I don’t know why.
Zuck: They “trust me”
Zuck: Dumb fucks
A Structural Threat To Free Society
Let’s step back for a second and take Zuckerberg out of the equation. As we’ve seen, Facebook is the biggest personal data collector in history. It is openly working on simulating human beings for research. It has all the tools needed to manipulate people’s realities, emotions, thoughts, and political preferences. And it continues to build these capabilities, especially with virtual reality.
This alone is a risk center for our society. This centralization of private data, power, and influence is dangerous.
Now bring Zuckerberg in. He has political ambitions, and a questionable track record. Is it wise to leave exclusive control of this power in one man? Personally, I don’t trust Zuck. But that’s my opinion and intuition, so…
Let’s assume Zuckerberg is a saint, and won’t touch this power. Or will only use it for good. History tells us that men either seize the power available to them, or have it seized from them by someone who will. WikiLeaks tells us that governments are already more than happy to insert themselves into these structures to further their own aims.
Facebook will be the most powerful tool for political power and manipulation in history. Someone, somehow, will take control of it. We are sleepwalking into allowing a gaping weakness to develop in our social and political structure.
I hope I’m wrong, and welcome all challenges to this argument. Post a comment below and I’ll be happy to engage.
Twitter’s management is too incompetent to take over the world. Follow me there, where I also post daily.