Alexis de Tocqueville observed in 1835 that in the transition from aristocracy to democracy, the tight hierarchical organisation of society dissolved into a broad horizontal mass of equal individuals. The challenge for a democratic society, characterised by social equality, was collective organisation without a built-in leader. Tocqueville’s solution was the newspaper, which could “put the same thought at the same time before a thousand readers” and hold the like-minded together in an association in which equal “men combine for great ends.” Twenty-first-century readers of Tocqueville might view the rise of social-media platforms such as Twitter and Facebook as a technological development akin to newspapers and associations: a means of generating public communication and mass mobilisation on the basis of individual preferences. In this piece, I examine the social community on Facebook and its capacity to constitute a deliberative democracy: a political order in which citizens hold informed opinions and enjoy political equality in enacting common decisions. To this end, I employ James Fishkin’s ‘trilemma’ of democratic political values, namely deliberation, political equality, and participation; I contend that Facebook may appear to be an ideal democratic order but cannot be effectively democratic, and I close by considering takeaways for the offline world.
Facebook is a popular social-media network constituting an online community of 1.366 billion people, or 18.5% of the global population in 2015. The network enables users to connect to identified ‘Friends’ and public figures by observing, engaging, and interacting in shared online spaces. Facebook allows for individual expression with which other participants can interact by means of social buttons, such as ‘Reactions’ (including ‘Like’) and ‘Share’, which are plugins that “allow users to share, recommend, like or bookmark content, posts and pages.”
Deliberation is defined as a process of meaningfully “[weighing] the merits of competing arguments.” Facebook promotes access to information and freedom of expression, both of which can facilitate deliberation; but access to poorly verified information may be inconsequential at best and deleterious to the public good at worst. Moreover, Facebook’s algorithms privilege posts and information that the user prefers, constituting a structural tendency towards enclave-formation that may empower minority voices but can degenerate into ‘echo chambers’.
Facebook enables free-wheeling, open communication and mutual sharing of information. ‘Public’ policies by Facebook are announced openly on its Newsroom. Private users can post a ‘Status’, write a note, and upload photographs. Individual liberty is maximised as the user retains control over who has access to their informational property. We might suppose that the fact of privacy means that information is not perfectly accessible, that some information will remain hidden from particular users. However, this criticism applies to any other polity as well, and deliberation is concerned with information affecting a ‘public interest’ rather than with private affairs. Furthermore, Facebook promotes freedom of expression through its minimalist rules on interference and its promotion of private posts into public viewership. The ease of communication allows a diversity of viewpoints to proliferate, and the sharing of information promotes user awareness and understanding of any number of issues.
One concern arising from free access is the lack of verification, and consequently a similar freedom for misinformation to propagate. Individuals are free to make any assertion online, and may be incentivised to “[wilfully blur the] lines between fact and fiction” to attract attention from other users, akin to tabloid journalism; such misinformation is deleterious to deliberation. ‘Clickbait’, or “content [designed to] attract attention and encourage visitors to click on a link,” has proliferated, with articles from websites such as Upworthy and Buzzfeed often ‘Shared’ across Facebook. The effective exploitation of “attention-grabbing, shareable reporting” to shape public perception, as in ‘clickbait’, lines up squarely with Joseph Schumpeter’s scepticism of individual consciousness in belief-formation. For Schumpeter, self-interested citizens are characterised by their “ignorance and lack of judgment” on public policy, because they do not hold a sense of public responsibility: such matters entail negligible private payoffs. If the online multitude has its beliefs manufactured by strategic spin-doctors, then expressed preferences become poor indicators of ‘true’ preferences, and decisions are made by the strategic few rather than by independent actors comprising the many. In this vein, Facebook’s corrective action in the ‘war on clickbait’ in 2014 may improve the deliberative nature of the community.
Facebook algorithms construct an informational world personalised for each user, privileging posts the user prefers on the basis of data collated on their connections and activity. Content is ranked chiefly by “recency, post type and the relationship between the poster and the end-user”, alongside other factors, combined by a complex “calculus… in constant flux.” The privileging of preferred posts, owing to inputs such as time spent reading posts of a particular nature and previous posts ‘Liked’ by the user, increases the interactions between like-minded individuals concerned with the same issues, who may form interest and action groups within the community. Jane Mansbridge might call these “protected enclaves,” whose members consider not only the good of the entire community but also what benefits them individually and as an interest group. These “groups of sequestered settings” can allow for shared experiences, mutual support, and the production of counter-discourse that allow for diverse perspectives within the polity and prevent a monopoly on public opinion. But Mansbridge points out that whereas enclaves work towards empowering disaffected minorities, they may also exacerbate sectarian tendencies insofar as members of the groups “speak only to one another” and “protect their insights from reasonable criticism.” This constitutes an ‘echo chamber’ effect whereby such members are blindly convinced of the truth of their belief by virtue of mutual assurance, exacerbated by social buttons on Facebook where ‘Likes’ constitute affirmation. For Iris Marion Young, segregation endangers democracy by preventing public access to private spaces, minimising encounters with those outside the in-group, and enabling the well-endowed to shut out those not similarly positioned, thereby reducing recognition of such relative privilege.
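The ranking logic described above can be caricatured in a few lines. The weights, factor names, and scoring functions below are illustrative assumptions only; Facebook’s actual calculus is proprietary and, as noted, in constant flux.

```python
from datetime import datetime, timezone

# Hypothetical weights for the three factors named in the text: recency,
# post type, and the poster-viewer relationship ('affinity'). These numbers
# are assumptions for illustration, not Facebook's real parameters.
WEIGHTS = {"recency": 0.5, "post_type": 0.2, "affinity": 0.3}
POST_TYPE_SCORES = {"photo": 1.0, "status": 0.6, "link": 0.4}

def rank_post(posted_at, post_type, affinity, now=None):
    """Toy relevance score: a higher score places the post earlier in the feed."""
    now = now or datetime.now(timezone.utc)
    hours_old = (now - posted_at).total_seconds() / 3600
    recency = 1 / (1 + hours_old)                      # newer posts score higher
    type_score = POST_TYPE_SCORES.get(post_type, 0.5)  # unknown types score neutrally
    return (WEIGHTS["recency"] * recency
            + WEIGHTS["post_type"] * type_score
            + WEIGHTS["affinity"] * affinity)          # affinity assumed in [0, 1]
```

Sorting a user’s candidate posts by this score would yield a personalised ‘newsfeed’; the enclave-forming tendency discussed above arises because the affinity term rewards interaction with the already like-minded.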
Consider a Facebook group maintained for hardcore environmentalist advocates working to prevent further encroachment on natural spaces by urban-dwellers. In an overpopulated city, such activists might take extreme and uncompromising policy positions, such as refusing any development of existing secondary forest, disregarding all other considerations such as population squeezes or infrastructure development for less well-off neighbourhoods. The echo-chamber effect produced by segregation makes such groups uncompromising and unconstructive, detracting from the openness to alternative viewpoints necessary for proper deliberation. Thus the formation of ‘enclaves’, while potentially empowering for silenced minorities, may nevertheless subtract from the openness required of interest groups.
Political equality means giving “equal consideration of political preferences” in aggregation, as in giving each person one vote when tallying. Facebook recognises the formal equality of individuals by allowing only one user account per person, but its aggregative algorithm may engender social stultification of irregular beliefs.
Facebook grants every account equivalent political ‘power’ in promoting the issues, statements, and public figures its user cares for. Facebook specifies in its Terms of Service that only “one personal account” is permitted per user. Administrators routinely remove ‘fake’ accounts that do not correspond to actual individuals, and require that accounts use individuals’ real names. These policies ensure that any individual can have only one account, and since every account is given a voice in expressing the user’s beliefs on a public forum and the ability to ‘Like’ and ‘Share’ any post, all Facebook users have a guaranteed ‘vote’. Facebook’s elimination of recorded online activity for deactivated and memorialised accounts ensures that political activity reflects present concerns. The standard mechanism for ensuring political equality is “equal voting power,” institutionally expressed as ‘one person one vote’ (OPOV). On Facebook, an individual’s ‘status update’ or a public figure’s ‘page’ can receive at most one ‘Like’ from any user, a mechanism of ‘one person one Like’ (OPOL). ‘Likes’ are “intentional affective reactions” that constitute a form of “[validation]… in the social.” Because the ‘Like’ “[comes] with a counter showing the total number of likes as well as the names of friends who clicked it,” it operates as a form of preference-aggregation, computing individual expressions of affection into a sum of public support. That the public nature of the ‘Like’ deviates from the secrecy of the ballot box does not subtract from its formal recognition of equality between users, even if it might detract from genuine preference-expression. On this reading, OPOL is comparable to non-compulsory voting in an aggregative democracy.
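The OPOL constraint amounts to making each user’s ‘Like’ idempotent, which can be sketched as a set of user identifiers. The class and attribute names here are hypothetical illustrations, not Facebook’s actual data model.

```python
class Post:
    """Toy model of 'one person one Like' (OPOL)."""

    def __init__(self):
        self._likers = set()  # storing user ids in a set caps each user at one Like

    def like(self, user_id):
        self._likers.add(user_id)      # repeated Likes have no further effect

    def unlike(self, user_id):
        self._likers.discard(user_id)  # withdrawing a Like is equally idempotent

    @property
    def like_count(self):
        return len(self._likers)       # the public counter aggregating support
```

The design choice mirrors the ballot analogy in the text: however enthusiastically a user clicks, their contribution to the public tally remains exactly one.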
Political equality in a democracy gives rise to attendant concerns. Carolin Gerlitz and Anne Helmond find that the “‘Like’ economy” changes the “visibility of links”: posts or pages with higher ‘Likes’ by the user’s Friends are preferentially placed on their ‘newsfeed’. This might correspond to major parties’ dominance of news coverage in a democratic system relative to minor parties. Such aggregative privilege resonates with Tocqueville’s worry about the ‘tyranny of the majority’, in which the masses collectively determine policy that may be to the detriment of a minority. Facebook does not, however, determine policy solely on user input; this lays the technical component of Tocqueville’s concern to rest, though only by denying effective political participation. The social-epistemic component of Tocqueville’s challenge rests in his observation that “the majority has enclosed thought with a formidable fence.” Here Tocqueville fears that opinion contrary to the majoritarian view will not be expressed for fear of social opprobrium, effecting a corrupting state of despotism embodied by men who are “strict slaves of slogans.” John Stuart Mill echoes this belief in his argument that “to [have opinion] be restrained in things not affecting their good, by their mere displeasure… dulls and blunts the whole nature.” To the degree that ‘Likes’ constitute social validation, and the aggregation of social validation reinforces individuals’ unwillingness to voice objections, Facebook’s aggregative mechanisms may enable despotism within the community.
Political participation is “behaviour… [by] the mass public directed at influencing” public policy. The inclusivity of Facebook, comprising freedom of association and minimal constraints on user expression, permits users to participate actively at their leisure; but effective participation is constrained because Facebook’s policies are not determined by its users.
In the discussion on deliberation, we considered how Facebook permits maximal information access within the constraints of individual privacy, and enables freedom of association with the like-minded. This encourages participatory behaviour insofar as users are not constrained from free expression of their ‘Likes’, which we compared to votes in a democratic polity. The ‘Reactions’ button, a revision of the ‘Like’ introduced in 2016, permits users to register a range of emotive responses, including surprise (‘Wow’), strong affection (‘Love’), unhappiness (‘Sad’), and objection (‘Angry’). This flexibility of participatory methods entails an expansion of preference-expression options beyond the vote, which better captures the beliefs of the responders. For example, if a friend shares an article on commercial seal hunting, animal-loving users could previously only ‘Like’ it to demonstrate support for the message to stop animal cruelty, or withhold a ‘Like’ to withhold support for the hunting practice (Fig 1). This binary mechanism is ambiguous, and does not express the depth or nature of the emotive response. The ‘Reactions’ mechanism instead allows such users to register being ‘Sad’ about the cruelty or ‘Angry’ at the practice. This expansion of the range and depth of expression allows for nuanced preference-aggregation, since the algorithm computes the sum of responses both by kind and in total (Fig 2). Because ‘one person one reaction’ conveys more information than ‘one person one vote’, Facebook captures individual preferences more accurately than, and therefore improves on, existing aggregative models.
Fig 1: a post against commercial seal hunts.
Fig 2: aggregation of reactions to the post, with breakdown by nature of response (16 responses: 8 ‘Sad’, 4 ‘Like’, and 4 ‘Angry’).
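The aggregation shown in Fig 2 amounts to a simple per-category tally. A minimal sketch, using the figure’s own counts (the function name is an illustrative assumption, not Facebook’s implementation):

```python
from collections import Counter

def aggregate_reactions(reactions):
    """Tally responses by kind and in total, mirroring Fig 2's breakdown."""
    by_kind = Counter(reactions)
    return by_kind, sum(by_kind.values())

# The 16 responses reported in Fig 2:
responses = ["Sad"] * 8 + ["Like"] * 4 + ["Angry"] * 4
breakdown, total = aggregate_reactions(responses)
```

Where OPOV collapses each citizen’s view into a single binary mark, this tally preserves the distribution of responses by kind, which is the informational gain claimed for ‘one person one reaction’.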
The fundamental problem with our democratic conceptualisation of Facebook lies in the necessary condition of political participation. Political participation requires that citizens operate to influence public policy. However, whereas the citizen-body (the user population) of Facebook is highly active on a broad range of questions, the ‘laws’ of the Facebook polity are best conceived as the rules set by administrators and the algorithms coded into the platform by software developers. The ‘public policy’ that Facebook as a community is concerned with pertains only to these ‘laws’, because Facebook constitutes a minimalist polity with few restrictions on usage. User accounts do not require sustenance, freedoms are cheaply provided for by the administrators, and this low cost of needs-provision engenders the minimal ‘state’. However, ‘state policy’ (the ‘laws’ of Facebook) is precisely where user input is indeterminate. Facebook responds to user feedback at its own discretion, and makes administrative decisions that are formally unaccountable to its users. In its 2015 “Statement of Rights and Responsibilities”, alternately known as the “Terms of Service”, the administrators defined the rights users might have, “[reserved] all rights not expressly granted to [the user],” and committed only to notifying users when amending the terms. Since political participation is a necessary condition for democracy, and Facebook has no effective participatory mechanism for its citizen-body, it cannot be a democracy, however democratic the community may seem!
By our metrics of deliberation, political equality, and participation, we find that Facebook’s democratic credentials are mixed. Policies are determined not by the people, but by a team of faceless administrators, who take user input into account at their discretion and reserve all rights to amend any rule. There are elements of Facebook that democratic polities in a technological age might emulate, including a state minimalism that promotes negative liberties, and the differentiated ‘response’ approach to preference-expression that can capture the public will more accurately than vote-aggregation models. However, insofar as such ideals have not been democratically realised at present, we should soberly consider the cautionary tales of the 20th century, when vanguards of administrators attempted to impose their purported utopias. The visible freedoms of Facebook might be but the infinite liberty of a clueless multitude, frolicking in the garden carefully cultivated by the vigilant nomenklatura.
I would like to sincerely thank:
- Professor Sandra Field, for her guidance and teaching;
- Feroz Khan ‘18 for proofreading this paper and providing helpful feedback;
- Darrel Chang ‘19 for permitting my use of his Facebook posting as a case study (Fig 1);
- Adila Sayyed ‘19, Aditi Kothari ‘19, Daryl Yang ‘18, Isabelle Li ‘19, Jerald Lim ‘19, Jillane Buryn ‘18, Justin Ong ‘19, Le Van Canh ‘19, Louis Ngia ‘19, Martin Vasev ‘18, Mehul Banka ‘19, Teo Xiao Ting ‘18, Teo Zhi Xin ‘19, Yang Yilin ‘19, Yong Yu Qi ‘19, and Zhong Rui Feng ‘19 for contributing assistance to the case study (Fig 2).
- Bai, Tongdong. “A Mencian Version of Limited Democracy.” Res Publica 14, no. 1 (2008): 19-34. doi:10.1007/s11158-008-9046-2.
- Chen, Yimin, Niall J. Conroy, and Victoria L. Rubin. “Misleading Online Content.” Proceedings of the 2015 ACM on Workshop on Multimodal Deception Detection – WMDD ’15, 2015. doi:10.1145/2823465.2823467.
- Fishkin, James S. When the People Speak: Deliberative Democracy and Public Consultation. Oxford: Oxford University Press, 2009.
- Gerlitz, Carolin, and Anne Helmond. “Hit, Link, Like and Share: Organizing the Social and the Fabric of the Web in a Like Economy.” Conference paper, 2011.
- Gerlitz, Carolin, and Anne Helmond. “The Like Economy: Social Buttons and the Data-Intensive Web.” New Media & Society 15, no. 8 (2013): 1348-65. doi:10.1177/1461444812472322.
- Mansbridge, Jane. “Using Power/Fighting Power.” In Democracy and Difference: Contesting the Boundaries of the Political, edited by Seyla Benhabib, 46-66. Princeton: Princeton University Press, 1996.
- Mill, John Stuart, and John Gray. On Liberty and Other Essays. Oxford: Oxford University Press, 1991.
- Schumpeter, Joseph. Capitalism, Socialism and Democracy. London: Routledge, 2013.
- Tocqueville, Alexis De, J. P. Mayer, and George Lawrence. Democracy in America. New York: Harper Perennial Modern Classics, 2006.
- Global Social, Digital & Mobile Statistics. Report. We Are Social, 2015.
- Luckerson, Victor. “Here’s How Your Facebook News Feed Actually Works.” Time. July 9, 2015. Accessed April 27, 2016. http://time.com/3950525/facebook-news-feed-algorithm/.
- O’Neil, Luke. “The Year We Broke the Internet.” Esquire. 2013. Accessed April 26, 2016. http://www.esquire.com/news-politics/news/a23711/we-broke-the-internet/.
- Oremus, Will. “The Onion’s New Website Is More Than Just a Hilarious BuzzFeed Parody.” Slate Magazine. 2014. Accessed April 26, 2016. http://www.slate.com/articles/technology/technology/2014/06/clickhole_the_onion_s_new_site_is_more_than_a_buzzfeed_parody.html.
- “Current World Population.” World Population Clock: 7.4 Billion People (2016). Accessed April 24, 2016. http://www.worldometers.info/world-population/#pastfuture.
- “How News Feed Works.” Facebook. Accessed April 27, 2016. https://www.facebook.com/help/327131014036297/.
- “What Names Are Allowed on Facebook?” Facebook. Accessed April 24, 2016. https://www.facebook.com/help/112146705538576.
- “Terms of Service.” Facebook. Accessed April 24, 2016. https://www.facebook.com/terms.
- “Making Page Likes More Meaningful.” Facebook for Business. Accessed April 24, 2016. https://www.facebook.com/business/news/page-likes-update.
- “Facebook Wages War on Click-bait.” The Sydney Morning Herald. Accessed April 26, 2016. http://www.smh.com.au/digital-life/digital-life-news/facebook-wages-war-on-clickbait-20140825-108dd8.
- “Reactions Now Available Globally | Facebook Newsroom.” Facebook Newsroom. Accessed April 26, 2016. http://newsroom.fb.com/news/2016/02/reactions-now-available-globally/.
 Tocqueville (2006) [1835, 1840] Vol II, Part I, Ch 6. 517.
 Tocqueville (2006) [1835, 1840] Vol II, Part I, Ch 7. 521.
 These are the democratic values defined by Fishkin, and described by him as a ‘trilemma’. Fishkin (2009). ‘The Trilemma of Democratic Reform,’ p1.
 Figure obtained from Global Social, Digital & Mobile Statistics for January 2015. The figure from the same report of the previous year (January 2014) was 1.184b, so that single year saw a leap of 182 million people, or 15.37% from the figure for 2014.
 Global population for 2015 taken as 7,349,472,099, from the World Population Clock. Calculation was done based on 2015 figures since 2016 data for Facebook usage has not been published.
 ‘Reactions’, introduced in 2016, allow users to express a range of emotive responses to particular posts. These are ‘Like’, ‘Love’, ‘Haha’, ‘Wow’, ‘Sad’, and ‘Angry’. (Facebook newsroom, 24 Feb 2016.)
 ‘Like’, introduced in 2009, allows a user to express generic support or affection for a post, page, note, or activity. ‘Share’, introduced in 2006, allows a user to highlight a particular post, page, note, or activity, and pin it to their own “wall” for their friends’ viewership. Gerlitz and Helmond (2013). 1351.
 Gerlitz and Helmond (2013). 1348.
 Fishkin (2009). ‘The Trilemma of Democratic Reform,’ p3-4.
 See Facebook newsroom: newsroom.fb.com.
 Fishkin defines information and a diversity of views as two of five criteria that determine the quality of deliberation. Fishkin (2009). ‘The Trilemma of Democratic Reform,’ p3-4.
 O’Neil (2013). Esquire (online source).
 The analogy between ‘clickbait’ and tabloid journalism is made by Chen, Conroy, and Rubin. Chen, Conroy, and Rubin (2015). 1.
 Oremus (2014). Slate (online source).
 Schumpeter (2013). Ch XXI, 261.
 Report by the Sydney Morning Herald, Digital Life section.
 “How News Feed Works.” Facebook.
 Luckerson (2015). Time (online source).
 Mansbridge (1996). 57.
 Mansbridge (1996). 58.
 Fishkin defines substantive balance as the extent to which arguments from alternative perspectives are offered, conscientiousness as the extent to which their merits are “sincerely weighed,” and equal consideration as the need to view the perspectives on their own merits regardless of the proponents. These are the other three criteria for meaningful deliberation developed, apart from information and diversity earlier discussed. Fishkin (2009). ‘The Trilemma of Democratic Reform,’ p4.
 Fishkin (2009). ‘The Trilemma of Democratic Reform,’ p15.
 “What Names Are Allowed on Facebook?” Facebook. Accessed April 24, 2016. https://www.facebook.com/help/112146705538576. There are exceptions to Facebook’s ‘real name’ policy (see the ‘Real Name’ controversy in 2014), granted where the individual’s safety is compromised by using their real name, but Facebook imposes checks to ensure that the individual can still operate only one Facebook account.
 Deceased users’ accounts are ‘memorialised’: left for posterity, but deactivated.
 “Making Page Likes More Meaningful.” Facebook for Business. Accessed April 24, 2016. https://www.facebook.com/business/news/page-likes-update.
 Fishkin (2009). ‘The Trilemma of Democratic Reform,’ p15.
 Bai (2008). 19.
 Gerlitz and Helmond (2011). 15.
 Gerlitz and Helmond (2013). 1352.
 The secret vote is a mechanism for obtaining the ‘genuine’ preference of voters, since there is no fear of reward or punishment according to how an individual votes.
 Gerlitz and Helmond (2011). 15.
 The ‘newsfeed’ is the user’s regular area for social updates from their ‘friends’, acquaintances, or public figures.
 Tocqueville (2006) [1835, 1840] Vol I, Part II, Ch 7. 250.
 Tocqueville (2006) [1835, 1840] Vol I, Part II, Ch 7. 255.
 Tocqueville (2006) [1835, 1840] Vol I, Part II, Ch 7. 258.
 Mill (1991). On Liberty, Ch. III, 70.
 Fishkin (2009). ‘The Trilemma of Democratic Reform,’ p17.