
Original Research Article

What's in the Social Graph?

An Enquiry into a Technical Protocol

Fernando N. van der Vlist

College of Humanities, University of Amsterdam, the Netherlands

Working paper

Published online: 13 July 2013

Abstract

In this paper I develop a historical and discursive understanding of the Social Graph. The paper pursues two lines of investigation: first, an epistemological debate is traced that attempts to resolve issues of the real-virtual divide. Based on this historical trajectory, a methodological pragmatism becomes apparent that also appears in graph theory as applied to the Social Graph. “Big Data” undermines these issues of a divide by relying on an epistemology of patterns. However, these patterns are ultimately grounded online as well. Second, this epistemological trajectory is connected to current social, economic, and scientific discourse – situating Big Social Data analysis, the Social Graph, and the Open Graph protocol. To this end, Facebook's Social Graph is used as a case study throughout. Ultimately, I argue that the Open Graph, as a supplementary and enforcing protocol to the Social Graph, represents Facebook's normalising strategic direction towards developing a cybernetic infrastructure that rewards those who conform to its commercial ideal, while punishing those who deviate from it.

Keywords

technical protocol, epistemology, discourse analysis, virtuality, Social Graph, Big Social Data

1. Introduction

In this essay, I am concerned with the implications of two particular technical protocols for real-world social, economic, and scientific discourse. In the Science and Technology Studies (STS) tradition, many authors have argued for the mutually constitutive relationship between technical objects and society more generally (Latour and Woolgar [1979] 1986; Bijker and Law 1992; Bowker 1992; Bowker and Star 1999). Extending this tradition to the digital domain (particularly in software and platform studies), Alexander R. Galloway has pointed to the importance of technical protocols in making possible network connections and disconnections (2004). Others have pointed to analytical problems in codecs (Mackenzie 2008), and still others have investigated computational algorithms (Bucher 2012; Rieder 2012) or programming languages (Cox and Ward 2008; Chun 2011). Here, I contribute to these efforts by investigating – from a historical and discursive point of view – some of the epistemological assumptions in Facebook's Social Graph, which is supplemented and enforced by the Open Graph protocol. This particular object is interesting as a case because it is currently among the most ambitious projects to represent social reality. As such, it is subject to some fundamental philosophical issues discussed here, which have real social, cultural, economic, and scientific consequences.

Throughout, I will be weaving together two separate trajectories. On the one hand, I trace an epistemological debate by looking at the social graph as a representational model. On the other hand, I relate this to current social and economic discourse – situating Big Social Data analysis, the Social Graph, and the Open Graph protocol. Let me stress that this is not a paper that condemns data science or graphs altogether. Graphs are extremely useful for many purposes, and data allows us to develop empirical insights and claims about otherwise obscured aspects of reality. Instead, I want to argue that we should be careful in selecting the right data, in knowing the full story behind how the data was captured, and in raising awareness of the epistemological and ontological issues that remain with these analyses.

In the first section, I focus specifically on the real-virtual divide and the epistemological issues it poses (Hayles 1993; Woolgar 2002; Rogers 2009). Second, I suggest some inherent limitations of using graphs as an epistemological device. This includes the critique that the Social Graph is neither social nor a graph (Fitzpatrick 2007; Ceglowski 2011), which also introduces the Social Graph on the level of the Facebook interface (Suchman 1985; Madrigal 2013). Third, I connect pattern epistemology (Alexander 1977; Hayles 1993; Dixon 2012) to “Big Data” analysis (boyd and Crawford 2011, 2012), relating the concept of apophenia – seeing meaningful patterns where there are none (Dixon 2012). Even with this epistemological foundation, it remains problematic to use the Internet as a very large scientific instrument for quantifying and studying the social, because patterns emerge specifically from a given system. Based on the discussed trajectories, I finally argue that Facebook is strategically positioning itself towards building and normalising (Foucault [1978] 1990) the commercially driven Web as a cybernetic system (Wiener [1948] 1961; Dušan 2011; Gerlitz and Helmond 2013).

2. An Epistemology for the Virtual

In the spirit of the cyber-libertarianism represented by John Perry Barlow's manifesto “A Declaration of the Independence of Cyberspace”, the awareness grew that cyberspace could be something radically different from physical space. Cyberspace refers to objects and identities that exist largely within the network itself, making it a place by and for the mind. But in the end, cyberspace is all just information (1990, 34–43; 1996). Today, this utopian metaphor assumes a more nuanced form, more in line with Foucault's heterotopia (1984, 46–49), as a space of non-hegemonic conditions that has meaning and identity only in relation to the real world. Instead of a substitute or replacement for the real, virtuality becomes an aspect of reality. For Internet research, however, the ontological distinction between the virtual and the real leads to problems in grounding claims about virtuality that are often overlooked. Following the historical trajectory, what we see as a result is a methodological pragmatism that avoids the problems of a singular ontology for both the virtual and the real.

2.1. Virtuality and Embodiment

In “Virtual Bodies and Flickering Signifiers”, N. Katherine Hayles located and traced a radical epistemological break that stems from debates on technological virtuality and cyberspace. As she argues, “different technologies of text-production also suggest different models of signification” (1993, 69). Fundamental to her argument is that information is never actually present, even though it's potentially everywhere. It is the basis of much of the actual world, but is never present in its authentic form. We see letters on pages in books, flickering pixels or characters on screens, but that is only its material manifestation. She conceptualises this as the embodiment of information, which is the result of the interplay between inscription – the body as conceptual abstraction prior to human perception – and incorporation – the particular embodiment that emerges from its specific context (Hayles 1999)1. Information, in that sense, is a virtual presence of patterns, and what we perceive as real is just the material embodiment of these patterns. This argument assumes that information is more abstract and “pure” than the world we perceive empirically through our human senses.2

Hayles develops the notion of flickering signifiers to refer to the production of virtuality. In the Saussurean tradition, signification is based on a dyadic relationship between signifier (form) and signified (mental concept). The relation between the two is a sociocultural product, which constitutes what Saussure called the “arbitrary nature of the sign” ([1916] 1965, 67–70). Hayles reads this as materiality; images and characters on a screen, or in a book, differ only in terms of the processes that produce them. Her notion of flickering signifiers takes Lacan's floating signifiers (Lacan 1975, 22, 35) a step further, because in informatics, a written piece of text is itself a code (Hayles 1993, 76). Lacan based his notion on the absence of a stable symbolic link between the signified and its signifier. Flickering signifiers, on the other hand, are layers of arbitrary signs functioning on top of each other. “A signifier on one level becomes a signified on the next higher level” (77). The dichotomy of presence/absence is replaced by one of pattern/randomness (78). The arbitrariness of the sign is used to constantly rearrange signs into different patterns; it is constantly changing, flickering, whereby “the informational structure emerges from the interplay between pattern and randomness” (76). Information only exists virtually, as a potential. This conception of signification and embodiment constitutes a radical epistemological turn, and can be seen as an attempt to develop an epistemological foundation for studying the virtual.

2.2. Five Rules of Virtuality

The Virtual Society? research programme (1997–2002)3 marks a second turning point. In an attempt to study the social dimensions of electronic technologies, this research programme set out to assess claims about how new electronic technologies would amount to a transition towards a virtual society (Woolgar 2002, 2). In “Five Rules of Virtuality”, Steve Woolgar, who was director of the programme, traces the genealogy of the question mark that emerged in the title of the programme. In his view, the question mark “signals the spirit of analytical scepticism that needs to run in concert with balder depictions of technological impact” (10). Adding this question mark signals a larger shift towards academic caution and was meant to reassess the “cyberbolic overtones” in the programme specification draft. This academic caution is also apparent in the work of Julian Dibbell and Steven Shaviro, who both show that it is problematic to consider the virtual as a separate realm (Dibbell 2005; Shaviro 2007). Instead, phenomenology in the case of Dibbell's description of a rape in cyberspace, and the economic logic of scarcity/abundance in the case of Shaviro, exemplify that virtual worlds extend conventions imposed by the real world.4 The deterministic understanding of electronic technology having “social impact” in particular therefore needed to be reassessed, which led to recommending a focus upon studying “implications” rather than blunt impacts. This would encourage discussion on the ways in which new technologies challenge existing social-scientific concepts (22).

The efforts of this programme ultimately resulted in the formulation of five “rules of virtuality” (14–21).5 These five broad analytical themes provide rules of thumb for evaluating some of the deterministic claims about the effects of new technologies, and later became known as the digital divide critique. These rules emphasise an unequal divide in terms of access (first rule), and the fears and risks associated with new technologies (second rule). With respect to the division between virtuality and the real, Woolgar argues that virtual technologies supplement rather than substitute for real activities (third rule), and also that the virtual can sometimes be more real than reality itself (fourth rule). The third rule in particular shows how the virtual as a separate realm is being problematised. The virtual tools provided by the Internet were used alongside other resources and became embedded in people's lives. Woolgar does, however, still acknowledge that this entails a transcendence of the boundaries between reality and the virtual (17). As a fifth rule, the global and the local become increasingly intertwined. Globalisation marks the death of physical distance, but instantiations of the global are at the same time highly local.6 Woolgar links this in particular to the formation of identity, which for him is fundamentally dependent on the local (read: reality) in order to transcend the global (read: virtual). If anything, these five rules indicate that the real and the virtual are interconnected in complex ways.

The “virtual methods” (Rogers 2009, 5) deployed to research these themes studied the Internet by using conventional methods from social sciences and humanities, and applying these to the new object of study. We see this for example in the work of Don Slater and Daniel Miller who localised their comparative ethnographic study of the Internet in Trinidad and Tobago, resulting in a set of interconnected dynamics that challenged the idea of cyberspace as a separate realm (Slater and Miller 2000). User studies based on surveys, observation, or interviews take the real and local as the baseline for doing research on virtuality, leaving the study of the medium and the data it generates generally untouched.

2.3. End of the Virtual

A third turning point does away with the virtual altogether. In his inaugural lecture “The End of the Virtual: Digital Methods”, Richard Rogers proposes to study the data of the new medium itself, through what he calls “digital methods” that follow the medium (10). In order to be able to develop claims about society and culture, this methodological turn required that the division between the real and the virtual be removed. Rogers therefore proposed “online groundedness” (5), which provides the ontological and epistemological basis for grounding claims made in research with digital methods. This is a radical epistemological break that enables studying culture and society with the Internet. The choice to call for the end of the virtual-real divide, then, is a methodological one, because it allows for the study of the Internet as a source of data about society and culture at large (29), offering opportunities for studying far more than online culture alone.

The division between the virtual and real, however, is still present in the ontological distinction between the study of digitised versus natively digital objects. It is generally accepted that the ontological status of a digitised object such as a scanned book or image is fundamentally different from one that is born digital, such as the link or the tag. This ontological distinction also becomes the basis for the division between two broad directions in the larger project of software studies: Cultural Analytics7 and Digital Methods8. What we see here is based on the same division between the real and virtual; the virtual is still a separate and disconnected realm, unless we acknowledge a synchronous relationship between the data of the digital medium and society and culture. More importantly, this implies that the distinction between virtual and real is to a certain extent produced9 and upheld on the institutional level.10 It is important to remember that this turn does not mark the end of the virtual; rather, our methods for studying virtuality have taken what we may call a pragmatic turn. Looking closer at this pragmatic direction, we may start to critically consider practices that make use of this methodological pragmatism to ground certain claims about reality. As introduced earlier, the case studied here is that of the Social Graph.

3. Methodological Pragmatism and the Social Graph

As the concept of the network started to appear in more diverse and immaterial contexts, it has become an epistemological device for knowing the relations of things, and for validating that knowledge. An example of this from (discrete) mathematics and computer science is graph theory, which uses graphs to model pairwise relations between objects. In the second half of the twentieth century, graph theory would become a “noble” endeavour and asset to the social sciences as well, where it proved particularly fertile for dealing with the problem space that emerged concerning social structure (Rieder 2012). Stanley Milgram, for example, conducted a famous experiment (1967)11 in which he measured the number of connections it would take for a chain letter to travel from Omaha, Nebraska, to Boston, Massachusetts. He discovered that the number of connections varied between two and ten, with a median of six. His discovery of this “universal” social network property has become known as the “small world phenomenon”, or “six degrees of separation”. However, this kind of study quickly becomes rather reductionist or even deterministic when it starts taking the common proverb “tell me whom you frequent and I will tell you who you are”12 as an all-embracing truth. The following points out some limitations of graph theory when applied to the Social Graph, as it has been popularised by Facebook.13 Specifically, I criticise those assumptions that take graphs to be a fit (i.e. epistemologically and ontologically robust) model for representing, defining, and structuring the semantic ambiguities of actual social networks.
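
To make the underlying model concrete, the following is a minimal sketch in Python – with entirely hypothetical names and data – of the graph-theoretic idea at work here: people as nodes, pairwise relations as edges, and “degrees of separation” as the length of the shortest path between two nodes.

```python
# Minimal sketch of a social graph as an adjacency list (hypothetical data).
from collections import deque

friends = {
    "ada": {"bob", "cem"},
    "bob": {"ada", "dee"},
    "cem": {"ada", "dee"},
    "dee": {"bob", "cem", "eva"},
    "eva": {"dee"},
}

def degrees_of_separation(graph, source, target):
    """Breadth-first search for the shortest chain of acquaintance."""
    seen, queue = {source}, deque([(source, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == target:
            return dist
        for neighbour in graph[node]:
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, dist + 1))
    return None  # no chain connects the two

print(degrees_of_separation(friends, "ada", "eva"))  # -> 3
```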

3.1. Arbitrariness of Graph Topology

The formal structure of a social network defines our understanding of relations and communities, and the graph, as a representational model behind this network, constitutes the basis of that structure. Graphs fundamentally consist of objects (nodes) and links (edges). Computer systems require formal and clear definitions of what these entities and relations represent, which means that designing a graph also forces such definitions. Bowker and Star's important analysis of classification systems and their consequences indicates the diverse range of social and political consequences of categories and names, which they understand as units in the larger infrastructures (1999, 35). By defining what a relation is, the system (by extension) also directs who becomes its hubs, which corners of the network become continents (Barabási 2002, 165–169), or how many “degrees of separation” separate x from y. Definition thus constitutes the network topology, which has real social, political, and economic consequences.14
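
As an illustration of this point, the following sketch – hypothetical data, not Facebook's – shows how the definition of what counts as a relation constitutes the topology: the same set of people yields different hubs depending on which ties the system decides to count as edges.

```python
# Hypothetical labelled ties between the same set of people.
edges = [
    ("ada", "bob", "close_friend"),
    ("ada", "cem", "close_friend"),
    ("bob", "dee", "colleague"),
    ("cem", "dee", "acquaintance"),
    ("dee", "eva", "colleague"),
]

def degree_centrality(edge_list, counted_relations):
    """Count, per node, only those ties the system defines as edges."""
    degrees = {}
    for a, b, relation in edge_list:
        if relation in counted_relations:
            degrees[a] = degrees.get(a, 0) + 1
            degrees[b] = degrees.get(b, 0) + 1
    return degrees

# Counting every tie makes "dee" the best-connected node ...
print(degree_centrality(edges, {"close_friend", "acquaintance", "colleague"}))
# ... but counting only "close_friend" ties makes "ada" the hub instead.
print(degree_centrality(edges, {"close_friend"}))
```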

Johan Ugander et al. published an article on the anatomy of Facebook's Social Graph, in which they studied numerous features of the graph, such as the number of users, friendships, degree distribution, and clustering (2011, 1). Their research mostly analysed specific network structures on Facebook, but in doing so it also emphasised the importance of features in the study of this graph. By tagging an image, for example, you also make that image instantly public to that person, plus all of their friends, and the friends of their friends.15 Social interactions lie at the core of the design of Facebook as an interface that is constitutive of this Social Graph, and according to Stephen Downes, “the design of the interaction is important and defines the nature of the community” (2007). To illustrate the latter, Cass Sunstein argued that specific communities also lead to specific consequences (2001, 54). He empirically studied and demonstrated the importance of the specialisation of websites. In the same way, the specialisation of communities matters for social fragmentation and for what he called cybercascades – an informational snowball effect of group polarisation closely related to social cascades. Consider, for example, how “strangeness” is religiously excluded from networks structured by similarity.16 Graph topologies are arbitrary, and if you can shape the conditions for social interaction to emerge, you also hold the power to direct that topology.

3.2. Coded Relationships and Situated Actions

Extending the argument above, graphs do not, and cannot, capture actual social relationships. In the context of digital platforms, relationships are coded. That is to say that complex social relations are converted into digital labels. In this regard, Maciej Ceglowski, extending the thoughts of Brad Fitzpatrick before him (2007), argued that the Social Graph is neither social, nor a graph (2011). In order to define relationships, common vocabularies are used in the form of predefined relations (acquaintances, close friends, restricted, and so on). But even if we defined all these connections with other people, the result would still not be expressive enough. As Fitzpatrick explains, in order to model something as a graph, you first need an understanding of what each node and edge represents. But this is highly problematic, for some types of relationships are defined by their secrecy or ambiguity (such as crushes), and defining them would kill the very category. Such examples are countless. Second, the Social Graph is not social either. As Ceglowski effectively put it, “there's just no way to tell if you'll get along with someone in my social circle, no matter how many friends we have in common” (2011). It is naive to think this is even remotely social.
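
A minimal sketch, with hypothetical names and labels, of what such coding amounts to in practice: before an ambiguous social tie can become an edge in the graph, it has to be flattened into one label drawn from a predefined vocabulary.

```python
# Hypothetical predefined vocabulary of relations on a platform.
from enum import Enum

class Relation(Enum):
    ACQUAINTANCE = "acquaintance"
    CLOSE_FRIEND = "close_friend"
    RESTRICTED = "restricted"

def code_relationship(user_a: str, user_b: str, label: Relation) -> dict:
    """Store an edge; whatever the tie 'really' is, only the label survives."""
    return {"from": user_a, "to": user_b, "relation": label.value}

# A secret crush, a frenemy, or an ex all have to pass through the same menu.
edge = code_relationship("ada", "bob", Relation.ACQUAINTANCE)
print(edge)  # {'from': 'ada', 'to': 'bob', 'relation': 'acquaintance'}
```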

This raises the question of why there is (still) a widespread belief in the Social Graph – and, by extension, in Big Social Data – as something that supposedly tells us exactly what is going on. One possible explanation may be that Facebook and other similar companies greatly profit from having us all believe that the data they collect is useful and valuable. Data has repeatedly been called the new oil; and like crude, it needs to be refined before it can really be used. This may very well be one of the reasons why Facebook tries to seduce its users into entering as much information as they can into the predefined fields.17 The interface acts as an “empty vessel” for your mind, which structures social life, and which Facebook tries to design and perfect (Madrigal 2013). This planning runs up against what Lucy Suchman called “situated actions” (1985), which – from an anthropological point of view – is a problematic aspect of human-computer interaction, because human action is constantly being reconstructed in relation to its circumstances.

There is necessarily a politics of reduction behind an interface, but as illustrated by Web designer Mike Monteiro: “Where I have issues with Facebook is that they're dishonest about who the customer is. They've built an enticing chair, and they let me sit in it for free, but they're selling my farts to the highest bidder” (qtd. in Madrigal 2013). Is their data really that useful? Facebook's recently introduced Graph Search18 has already been said to lead to many results that do not make sense.19 It turns out that the quality of the data is just not that good. What it comes down to is that you cannot accurately represent the social with a graph, but it can be used to link valuable news and offers to a person, in order to “sell you crap” (2011).20 Extending this analysis of the Social Graph, the following section concerns an underlying epistemological construct that validates the Social Graph as such.

4. Timelessness, Information and Patterns

As I have tried to argue, graphs are problematic as accurate representations of actual social networks. At the same time, however, we also see more and more research and companies turning to “Big Data” to develop robust claims about the social. These claims are economically valuable, because they allow companies to target specific groups or observe and predict changes in the social ecosystem (understood here as a set of interacting systems). These claims ultimately follow from the same Social Graph, but seem to validate knowledge in a different manner (see section 2). Instead of validating knowledge through “online groundedness” (Rogers 2009, 5) as a result of two incommensurable scientific paradigms (Kuhn [1962] 1996), they ground their claims much as Hayles did: by separating embodiment from information as pattern in order to be able to ground claims concerning both. The general assumption here is that patterns are more fundamental, underlying material embodiments like the Social Graph.

4.1. Undermining the Morphology of Network Structures

In “Analysis Tool or Research Methodology: Is There an Epistemology for Patterns?”, Dan Dixon argues for the importance of patterns in both hard and soft sciences, and notes that design and the digital humanities are also very concerned with patterns. Generally, patterns are used for studying relationships in data, shapes that emerge in graphs, statistical accounts of data sets, and the emergence of relationships (Dixon 2012, 191). As Dixon notes, “The study of patterns is therefore a morphology of these [emergent] structures, not just for their own sake, but to analyse underlying forces, the network structure, or the system that is actually of primary concern” (198). The idea of patterns as fundamental to emergent structures was also observed by the Vienna-born architect Christopher Alexander in his seminal work A Pattern Language: Towns, Buildings, Construction (1977). In this catalogue of design patterns, Alexander develops a language of patterns on different scales that can be combined in an endless number of combinations to build specific architectural structures. His underlying intention was primarily aesthetic, because as a modernist, his goal was to describe a framework for a timeless way of building.21 Such structures would be resistant to time, because the patterns capture underlying rules of the emergent structures (e.g. building typology). Patterns, like graphs or other diagrams, can teach us things about the relationships between the component and the system, or the node and the network. Still, it is important to realise that patterns emerge directly from the system being examined (Dixon 2012, 198). That is why the limitations of classification and graph theory are essential in this regard.

Dixon also introduces the notions of abductive reasoning and apophenia. The first is a concept borrowed from Charles Sanders Peirce, and refers to the method by which hypotheses are created or discovered. Dixon applies the term to mean the spotting of patterns and relationships in data sets (201). As he argues, abduction is the spotting of patterns, so apophenia becomes the spotting of patterns where there are none (202). This is an interesting concept for thinking about different types of patterns (Dixon identifies three of these: designed, evolutionary, and natural), and for critically assessing whether they are actually valid patterns. As he puts it: “There is still the problem of identifying useful and valid patterns, because not every shape or structure seen is useful or valid, and it is very easy to see patterns where there are none and end up in the realms of apophenia and numerological digressions” (201). Aligning this with Hayles's theory of information, pattern is also necessarily observed in its material embodiment. It is, therefore, still problematic to establish whether there is a causal relationship between information as pattern and the embodiment of that pattern. Prediction based on pattern (e.g. probability measures) is thus prone to apophenia as well. Before I engage further with cybernetics (see section 5), however, I first need to relate this epistemology of patterns to actual quantifiable values (i.e. data) that can be used as input and output (I/O) in such models.

4.2. “Big Data” as a Cultural, Technological, and Scholarly Phenomenon

Big Social Data analysis is one example of how networks and patterns are used as an epistemological device to study relations and to validate that knowledge: the data represents the network graph, and the graph supposedly represents the social. danah m. boyd and Kate Crawford offer some starting points for a critique of the “Big Data” mindset (2011, 2012). Among the myths they tackle is one that states, “with enough data, the numbers speak for themselves”. This assumption is based on the robustness of the pattern. But numbers do not speak for themselves, because they remain objects of human design, which makes them vulnerable to skews, gaps, and faulty assumptions (Crawford 2013). As one of its characteristics, they identify “mythology” in “Big Data”, by which they mean a widespread belief that larger data sets offer a higher form of truth, objectivity, and accuracy (2012, 663). “Big Data” is not just about having a large data set; it is also a cultural phenomenon (e.g. mythology), a technology (e.g. issues of classification, databases, algorithmic accuracy), and a form of analysis (e.g. tools, epistemology, ontology). On top of unrefined data sets, the algorithms that are used to analyse the data, or that created the data sets in the first place, are just as biased (i.e. reductionist). This interplay of different factors is essential to consider when applying this “Big Data” logic in practice. I will illustrate this interplay with a recent example.

Declan Butler wrote an article on Google's web-based method for tracking seasonal flu. Google Flu Trends22 used algorithms to estimate flu prevalence from flu-related search queries. But the US peak flu levels turned out to be drastically overestimated (2013), which had a large effect on public services and public policy. Epidemiologist John Brownstein therefore argues that such computational models regularly need to be recalibrated, because they do not exist in a vacuum (Crawford 2013). For this purpose, he developed a sentient system called Flu Near You23 where people volunteer to report the illness, resulting in insights about the spread of the illness.24 This model harnesses the advantages of the Web to collectively organise behaviour, as if we all belong to a large “hive mind” (Kelly 1995) with a collective intelligence and sense. The sentient system builds its dataset through a specific type of social network of reporting doctors or ordinary people. Graphs can only capture so much, and their scope is limited. They cannot represent the social in its totality, but can they represent a single aspect of it? We may be able to count the nodes, and infer things like the rate of spreading through the social network, but can that really teach us anything about the disease, its spread, or how people deal with it? How can we objectively claim something about the spread if we only count the number of carriers? “Big Data” seems to have more to do with rhetoric than ontology, and the only fever we are measuring may be a quantifying fever.25 If numbers can speak,26 they do so only for themselves.
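
Purely as an illustration of this epistemological point – and emphatically not Google's actual model – the following sketch shows the naive logic of query-based estimation: assume that incidence is proportional to the volume of flu-related searches, and the estimate inherits every bias in that volume.

```python
# Illustrative only: made-up weekly counts of flu-related searches and an
# assumed calibration factor; neither corresponds to any real data set.
weekly_queries = [120, 130, 150, 400, 420]
cases_per_query = 0.8  # assumed ratio of estimated cases to search queries

estimates = [round(q * cases_per_query) for q in weekly_queries]
print(estimates)  # [96, 104, 120, 320, 336]

# If the spike in the last two weeks reflects media attention rather than
# illness, the model overestimates peak levels: the pattern is grounded in
# the search system, not in the population it claims to describe.
```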

Using the Internet as some very large scientific instrument for quantifying and studying the social is problematic, and the rationale behind automation and objectivity is just as problematic (see section 5.1). It may be more fruitful to focus on discourse (following the pragmatic turn), and less on the inevitable ontological asymmetry between representation and reality. This means that we require awareness in recognising the art of “data rhetoric”27; in understanding how data is used as part of discourse in order to persuade, inform, motivate, or mobilise people and masses. This is particularly important now that data is routinely used as part of everyday practices, and issues are larger in their implications (such as in the Google Flu Trends example). The following section explores this further by reflecting on discourse surrounding the Social Graph.

5. Monetising Patterns, Cybernetics, and Normalisation

Besides scientific and statistical analysis, the Social Graph is heavily used for marketing purposes. In this final section I will develop an understanding of how the Open Graph protocol is actually a strategic addition to the Social Graph as commercial and normalising infrastructure. The Social Graph does not yet seem to have a robust epistemological foundation, but with the help of general commercial interest, it has nevertheless been able to produce a social reality of networked commerce around it. This raises more general questions about how we should treat the Social Graph; e.g. is it a community asset or is it just commodified social relations?28

5.1. Cybernetics and Commodified Social Relations

Companies like Facebook that can collect “Big Data” sets currently monetise these data sets through direct marketing and social commerce (Dušan 2011). Direct marketing is a form of advertising in which individuals are targeted directly, and as such, it relies heavily on information that enables businesses or non-profit organisations to effectively address members of this market. This addressability is typically based on a variety of metadata, which is then aggregated into profiles. Social commerce uses social networks to assist in the online buying and selling of products and services. A specific form of the latter is called f-comm (for Facebook commerce), which concerns the buying or selling of goods or services directly on Facebook or through the Facebook Open Graph (Marsden 2011). Both forms of monetisation recall the earlier point made by Ceglowski that the Social Graph can be used to “sell you crap” (2011). Even though the Social Graph does not seem to have a robust epistemological foundation, it could produce a specific socio-economic reality of social shopping and user-generated advertorial content, a techno-economic reality of the like economy (Gerlitz and Helmond 2013), and so on. As a consequence, the Facebook platform is effectively contributing to building the Web as a cybernetic system that can be used to predict valuable outcomes. The Social Graph and Open Graph protocol are both stepping-stones in creating that ecosystem.

By cybernetic system I refer particularly to the work of Norbert Wiener, who wrote Cybernetics, or Control and Communication in the Animal and the Machine (1948). As the title suggests, his interpretation of cybernetics applies to biological organisms and machines alike. Cybernetics as “the field of study concerned with communication and control systems in living organisms and machines” (OED) is therefore a useful way to think about a sociotechnical system as well. Essential to a cybernetic system is that it contains a circular-causal relationship. That is, the system being analysed is part of a closed signalling loop in which (every) action analysed is also returned as feedback, enabling it to serve as the new input. The idea of automating closed processing loops is closely related to the idea of objectivity as well, because causality would remain entirely immanent to the machine. This supposedly disconnects it from any subjective human influences during the process. Still, however, and as noted earlier, patterns emerge specifically from a given system (Dixon 2012, 198). Probability measures used to predict certain trends or shifts are only useful relative to the system that produced the data in the first place (this boils down to the notion of “online groundedness” as discussed in section 2). Conceiving of the Web as a cybernetic system is thus useful for pointing at the promise of (protocological) control over the network by those who govern it.
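
A minimal sketch – hypothetical, not any particular platform's implementation – of the circular-causal relationship described above: every output of the loop is fed back as its next input, so the process remains immanent to the machine once it has been set in motion.

```python
def feedback_loop(state: float, adjust, steps: int) -> float:
    """Closed signalling loop: each analysed output is returned as the next input."""
    for _ in range(steps):
        output = adjust(state)  # the system acts on its current state ...
        state = output          # ... and that action returns as feedback (new input)
    return state

# Example: a controller that nudges a value halfway towards a set-point of 1.0.
print(feedback_loop(0.0, lambda s: s + 0.5 * (1.0 - s), steps=5))  # ~0.969
```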

5.2. Normalisation as Strategic Direction

The Social Graph is necessarily a reductionist representation of reality. Graphs may be very useful in terms of figuring out power and the relationship to a larger social entity (cf. Chun 2011, 69), but they remain both ontologically and epistemologically problematic for grounding scientific claims. Slowly but surely, we see Facebook struggling to move to the background; that is, it tries to become an infrastructure rather than a platform. One clear example of this move is the Open Graph protocol29, which enables developers to integrate web objects into the Social Graph. This gives these web objects the same functionality as native graph objects, such as profile links or stream updates for connected users.30 The interesting thing here is that this protocol is an external extension to the platform itself. That is, it extends beyond the limits of the platform.31 In this attempt to link the platform to the rest of the web, Facebook is also taking over (social) searches from Google (McCarthy 2010).
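
For concreteness, the following sketch renders the kind of markup the Open Graph protocol expects: a web page declares itself as a typed object via meta tags in its head, so that it can be integrated into the Social Graph as a graph object. The four property names used here (og:title, og:type, og:url, og:image) are the protocol's required properties (see ogp.me); the page values are placeholders.

```python
# Render the required Open Graph <meta> tags for a hypothetical page.
def open_graph_tags(title: str, type_: str, url: str, image: str) -> str:
    properties = {
        "og:title": title,
        "og:type": type_,
        "og:url": url,
        "og:image": image,
    }
    return "\n".join(
        f'<meta property="{prop}" content="{value}" />'
        for prop, value in properties.items()
    )

print(open_graph_tags("An Example Article", "article",
                      "http://example.com/article",
                      "http://example.com/cover.jpg"))
```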

In light of Michel Foucault's concept of normalisation ([1978] 1990), the Social Graph can be understood as an attempt to make this representational model (i.e. ideal) “normal” as well. As Foucault used the term, normalisation is the process by which an idealised norm of conduct is constructed (Foucault [1977] 1995), and as such it is one of an ensemble of tactics for exerting what he calls disciplinary power. Normalisation is disciplinary because it rewards those who conform to the ideal, and punishes those who deviate from it. On Facebook, this effectively translates into the network effect: by virtue of inclusion you will be rewarded with the advantages that the network provides to its members. It is also interesting to note that the project is hosted on a public open source repository on GitHub.32 One might wonder to what extent this will help the Social Graph to become a normalised digital infrastructure. Put differently, one might ask how the “social integration” of new software is linked to open source and distributed development models. Any protocol, graph, system, or infrastructure necessarily frames and constrains possibilities, raising questions such as whether there is a correlation between the normalisation of an infrastructure and the amount of power that resides in controlling that infrastructure.

6. Conclusions

By developing a historical and discursive understanding of the Social Graph, this paper aims to contribute to our current understanding of the relationship between technical protocols and real-world social, economic, and scientific discourse. In the first section, I developed a brief historical trajectory of an epistemology for the virtual, tracing Rogers' current understanding of “online groundedness” back to the cyber-libertarian division between the virtual and the real. In the second section, this epistemological foundation was connected to the pragmatic notion of the network as an epistemological device. However, it was also problematised by the idea that graph topologies are necessarily arbitrary, and that relationships are turned into coded concepts in the context of the Social Graph. The Social Graph turned out to be neither social nor a graph, and should be the beginning, rather than the end, of the analysis. Based on these conclusions, I turned to a different facet of the practical application of the Social Graph, namely Big Social Data analysis. Criticising an epistemology of patterns in this context, I argued that these patterns still emerge specifically from a given system. As a result, claims remain relative to the “online groundedness” of the system, which means that power still resides in controlling that system. In the final section, this understanding was applied to Facebook as a commercial platform. The Open Graph, as a supplementary and enforcing protocol, was presented as a normalising strategic direction towards developing a cybernetic infrastructure that rewards those who conform to this commercial ideal, while punishing those who deviate from it. This raises the broader concern of how we should treat the Social Graph; e.g. should it be a community asset, or is it just commodified social relations? More generally, I suggest a level of caution when considering the relationship between methodological pragmatism and the discursive practices it allows to be produced.

7. Acknowledgements

I would like to thank Inge Ploum for her suggestions during the writing process, and Michael Stevenson for his inspiring enthusiasm in delivering his lectures on Code and Culture.

8. Endnotes

1. See also Ploum 2009.

2. On a practical note, it also means that for Hayles, studying medium-specificity is comparative between different media characteristics (e.g. hypertext) as they appear in other media (cf. Rogers 2009, 11).

3. See <http://www.virtualsociety.org.uk/>.

4. In reference to Dibbell's description of a rape in cyberspace, one might argue that from a medium-specific point of view it is real to the extent that this particular action has happened, the machine has processed the cyber rape, but it is only because we attribute (a socially-constructed) meaning to it that we think it should be punished. It is as if the rape is “digitised” in terms of discourse, as opposed to being “natively digital” (with its own internal discourse). Instead of pondering over the meaning of that action, we could ask different questions that follow the medium.

5. The economic inequality between different social groups in terms of access to, use of, or knowledge of information technologies would become known as the digital divide (cf. Rogers 2009, 6).

6. In business jargon, this phenomenon is captured by the concept glocalism (e.g. see Friedman 2005).

7. As represented by the US-based Software Studies Initiative, directed by Lev Manovich. The initiative studies digitised objects using computational methods. See also <http://lab.softwarestudies.com/>.

8. As represented by the Amsterdam-based Digital Methods Initiative, directed by Richard Rogers. See also <http://www.digitalmethods.net/>.

9. In the sense that it becomes “citational” (Derrida 1988, 18; Austin 1962, 104; see also Butler 1990), that is, it develops by iteration into a recognisable form.

10. As we learn from Geoffrey C. Bowker and Susan Leigh Star in their seminal work Sorting Things Out: Classification and its Consequences, these categories are very real in their consequences. Because classifications become infrastructures in practice, a name can suddenly make a huge difference in terms of comparability, visibility and control (1999, 231–232; ch. 7).

11. Although first published in 1967 in Psychology Today, a more rigorous report was published two years later in Sociometry co-authored by his student Jeffrey Travers.

12. Sometimes attributed to Johann Wolfgang von Goethe (1749–1832).

13. At the Facebook f8 conference in San Francisco, California, USA, on May 24, 2007.

14. See also Bucher 2012 for an analysis of Facebook's EdgeRank algorithm.

15. Considering that you only need an average of six “degrees of separation” to reach anyone, this greatly increases the level of publicness, which in turn shapes the topology of the network.

16. See also Bezaitis 2013.

17. Since the latest interface redesign, users are repeatedly asked to enter their “missing information” to enhance the Facebook experience. Empty fields are now presented in such a way that it allows the user to directly enter the data from the timeline view (as opposed to moving to the “About” section, clicking “Edit”, and finally entering the data).

18. “Introducing Graph Search.” Facebook. Facebook, Inc., 2013. Web. 15 May 2013. <https://www.facebook.com/about/graphsearch>.

19. For the time being, some actual graph searches can be found online. See Tom Scott. “Actual Facebook Graph Searches.” Tumblr. Tumblr, Inc., 23 Jan. 2013. Web. 3 June 2013. <http://actualfacebookgraphsearches.tumblr.com/>.

20. Ceglowski ends his argument with an interesting suggestion that collecting huge amounts of data can actually harm the freedom of figuring out the next stage in the development of the Web. Open data is not a solution here, because “it misses the point of having such a permanent record in the first place” (2011).

21. Alexander, Christopher W. The Timeless Way of Building. Vol. 1. New York: Oxford University Press, 1979. Print. Center for Environmental Structure Series.

22. Google Flu Trends. Google.org, 2008. Web. 15 May 2013. <http://www.google.org/flutrends/>.

23. Flu Near You. Healthmap, 2013. Web. 15 May 2013. <https://flunearyou.org/>.

24. For a French initiative, see also GrippeNet.fr. L'Unité Mixte de Recherche, 2012. Web. 15 May 2013. <https://grippenet.fr/>.

25. For example, Bruno Latour noted that “sociology has been obsessed by the goal of becoming a quantitative science” (2009; cf. boyd and Crawford 2012, 666).

26. An interesting choice of metaphor when considering the idea that speech is to thought as embodiment is to pattern.

27. Aristotle called rhetoric an art, because it ultimately appeals to the emotion.

28. See also Dušan 2011.

29. Originally created at Facebook, and currently also used by Google and mixi. See Open Graph Protocol. Open Web Foundation, 18 Nov. 2012. Web. 5 June 2013. <http://ogp.me/>.

30. “Open Graph.” Facebook Developers. Facebook, Inc., 5 June 2013. Web. 5 June 2013. <https://developers.facebook.com/docs/opengraph/>.

31. As Carolin Gerlitz and Anne Helmond argue, this has happened first with the introduction of the Like buttons, then with the Open Graph protocol, and now most recently also with new possibilities of app development (2013).

32. “open-graph-protocol.” GitHub. GitHub, Inc., n.d. Web. 5 June 2013. <https://github.com/facebook/open-graph-protocol>.

9. Bibliography

Alexander, Christopher W. The Timeless Way of Building. Vol. 1. New York: Oxford University Press, 1979. Print. Center for Environmental Structure Series.

Alexander, Christopher W., Sara Ishikawa, and Murray Silverstein. A Pattern Language: Towns, Buildings, Construction. Vol. 2. New York: Oxford University Press, 1977. Print. Center for Environmental Structure Series.

Aristotle. Rhetoric. Trans. W. Rhys Roberts. Mineola: Dover Publications, 2004. Print.

Austin, John L. How To Do Things With Words. Oxford: Oxford University Press, 1962. Print.

Barabási, Albert-László. “Viruses and Fads.” Linked: How Everything Is Connected to Everything Else and What It Means. New York: Perseus, 2002. 123–142. Print.

Barlow, John Perry. “A Declaration of the Independence of Cyberspace.” Electronic Frontier Foundation. Electronic Frontier Foundation, 8 Feb. 1996. Web. 3 May. 2013. <https://projects.eff.org/~barlow/Declaration-Final.html>.

———. “Being in Nothingness: Virtual Reality and the Pioneers of Cyberspace.” Mondo 2000 1990: 34–43. Print.

Barnes, John A. “Graph Theory and Social Networks: A Technical Comment on Connectedness and Connectivity.” Sociology 3.2 (1969): 215–232. Print.

Bezaitis, Maria. “The Surprising Need for Strangeness.” TED@Intel. Long Beach, California, USA. Apr. 2013. Conference Presentation. <http://www.ted.com/talks/maria_bezaitis_the_surprising_need_for_strangeness.html>.

Bijker, Wiebe E., and John Law, eds. Shaping Technology/Building Society: Studies in Sociotechnical Change. Cambridge, MA: The MIT Press, 1992. Print. Inside Technology.

Bowker, Geoffrey C., and Susan Leigh Star. Sorting Things Out: Classification and Its Consequences. Cambridge, MA: The MIT Press, 1999. 195–225. Print. Inside Technology.

Bowker, Geoffrey C. “What's in a Patent?” Shaping Technology/Building Society: Studies in Sociotechnical Change. Cambridge, MA: The MIT Press, 1992. 53–74. Print. Inside Technology.

boyd, danah m., and Kate Crawford. “Critical Questions for Big Data.” Information, Communication & Society 15.5 (2012): 662–679. Print.

———. “Six Provocations for Big Data.” A Decade in Internet Time: Symposium on the Dynamics of the Internet and Society, Oxford, 21 Sept. 2011. Print.

Bucher, Taina. “Want to Be on the Top?: Algorithmic Power and the Threat of Invisibility on Facebook.” New Media & Society 14.7 (2012): 1164–1180. Print.

Burgess, Jean, and Axel Bruns. “Twitter Archives and the Challenges of ‘Big Social Data’ for Media and Communication Research.” M/C Journal 15.5 (2012): n. pag. Web. 3 May 2013. <http://www.journal.media-culture.org.au/index.php/mcjournal/article/viewArticle/561>.

Butler, Declan. “When Google Got Flu Wrong: US Outbreak Foxes a Leading Web-Based Method for Tracking Seasonal Flu.” Nature. Nature Publishing Group, 13 Feb. 2013. Web. 12 May 2013. <http://www.nature.com/doifinder/10.1038/483520a>.

Butler, Judith. Gender Trouble: Feminism and the Subversion of Identity. London/New York: Routledge, 1990. Print. Thinking Gender.

Castells, Manuel. “A Network Theory of Power.” International Journal of Communication 5 (2011): 773–787. Print.

Ceglowski, Maciej. “The Social Graph Is Neither.” Pinboard Blog. Nine Fives Software, 8 Nov. 2011. Web. 3 May 2013. <http://blog.pinboard.in/2011/11/the_social_graph_is_neither/>.

Chun, Wendy Hui Kyong. Programmed Visions: Software and Memory. Cambridge, MA: The MIT Press, 2011. Print. Software Studies.

Cox, Geoff, and Adrian Ward. “Perl.” Software Studies: A Lexicon. Ed. Matthew Fuller. Cambridge, MA: The MIT Press, 2008. 207–212. Print. Software Studies.

Crawford, Kate. “Think Again: Big Data.” Foreign Policy. The Foreign Policy Group, 9 May 2013. Web. 12 May 2013. <http://www.foreignpolicy.com/articles/2013/05/09/think_again_big_data>.

Derrida, Jacques. Limited Inc. Evanston: Northwestern University Press, 1988. Print.

Dibbell, Julian. “A Rape in Cyberspace.” Village Voice. Village Voice, LLC, 18 Oct. 2005. Web. 29 Mar. 2013. <http://www.villagevoice.com/2005-10-18/specials/a-rape-in-cyberspace/full/>.

van Dijck, José F. T. M. The Culture of Connectivity: A Critical History of Social Media. Oxford: Oxford University Press, 2013. Print.

Dixon, Dan. “Analysis Tool or Research Methodology: Is There an Epistemology for Patterns?” Understanding Digital Humanities. Ed. David M. Berry. Basingstoke: Palgrave Macmillan, 2012. 191–209. Print.

Downes, Stephen. “Places to Go: Facebook.” Journal of Online Education 4.1 (2007): 1–5. Print.

Duling, Dennis. “The Jesus Movement and Social Network Analysis (Part I: The Spatial Network).” Biblical Theology Bulletin: A Journal of Bible and Theology 29.4 (1999): 156–175. Print.

Dušan, Barok. “Privatising Privacy: Trojan Horse in Free Open Source Distributed Social Platforms.” Rotterdam: Networked Media, Piet Zwart Institute, 2011. Print.

Fitzpatrick, Brad, and David Recordon. “Thoughts on the Social Graph.” Brad Fitz. 17 Aug. 2007. Web. 3 May 2013. <http://bradfitz.com/social-graph-problem/>.

Fonseca, Andrea Esperanza. Contemporary Network Theory: Concepts and Implications for Transportation Asset Management. Virginia Polytechnic Institute and State University, 2007. Print.

Foucault, Michel. Discipline & Punish. Trans. Alan Sheridan. New York: Vintage Books, 1995. Print.

———. “Of Other Spaces: Utopias and Heterotopias.” Trans. Jay Miskowiec. Architecture /Mouvement/ Continuité Oct. 1984: 46–49. Print.

———. The History of Sexuality: An Introduction. Vol. 1. 1978. London: Vintage Books, 1990. Print.

Friedman, Thomas L. The World Is Flat: A Brief History of the Twenty-First Century. New York: Farrar, Straus and Giroux, 2005. Print.

Galloway, Alexander R. Protocol: How Control Exists After Decentralization. Cambridge, MA: The MIT Press, 2004. Print. Leonardo Books.

Gerlitz, Carolin, and Anne Helmond. “The Like Economy: Social Buttons and the Data-Intensive Web.” New Media & Society (2013): 1–18. Print.

Hayles, N. Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press, 1999. Print.

———. “Virtual Bodies and Flickering Signifiers.” October 66 (1993): 69–91. Print.

Kelly, Kevin. Out of Control: The New Biology of Machines, Social Systems, & the Economic World. New York: Basic Books, 1995. Print.

Krebs, Valdis. It's the Conversations, Stupid!: The Link between Social Interaction and Political Choice. San Francisco: Commonweal Institute, 2005. Print.

Kuhn, Thomas S. The Structure of Scientific Revolutions. Chicago: University of Chicago Press, 1996. Print.

Lacan, Jacques. Le Séminaire, livre XX: Encore. Paris: Seuil, 1975. Print.

Latour, Bruno. “Tarde's Idea of Quantification.” The Social after Gabriel Tarde: Debates and Assessments. Ed. Matei Candea. London: Routledge, 2009. 145–162. Print.

———. Reassembling the Social: An Introduction to Actor-Network-Theory. 2005. Oxford: Oxford University Press, 2007. Print. Clarendon Lectures in Management Studies.

Latour, Bruno, and Steve Woolgar. Laboratory Life: The Social Construction of Scientific Facts. 1979. Princeton: Princeton University Press, 1986. Print.

Mackenzie, Adrian. “Codecs.” Software Studies: A Lexicon. Ed. Matthew Fuller. Cambridge, MA: The MIT Press, 2008. 48–54. Print. Software Studies.

Madrigal, Alexis C. “How Facebook Designs the ‘Perfect Empty Vessel’ for Your Mind.” The Atlantic. The Atlantic Monthly Group, 2 May 2013. Web. 3 May 2013. <http://www.theatlantic.com/technology/archive/2013/05/how-facebook-designs-the-perfect-empty-vessel-for-your-mind/275426/>.

Marsden, Paul. “The F-commerce FAQ [Download] All You Ever Wanted to Know About Facebook Commerce But Were Afraid to Ask.” Social Commerce Today. Social Commerce Today, 18 Apr. 2011. Web. 5 June 2013. <http://socialcommercetoday.com/f-commerce-faq-all-you-ever-wanted-to-know-about-facebook-commerce-but-were-afraid-to-ask/>.

McCarthy, Caroline. “Google vs. Facebook: Drawing the Battle Lines.” CNET. CBS Interactive, Inc., 5 Aug. 2010. Web. 5 June 2013. <http://news.cnet.com/8301-13577_3-20012839-36.html>.

Ploum, Inge J. “Understanding Embodiment in the Age of Virtuality.” Inge Ploum's Research Blog. WordPress Foundation, 17 Feb. 2009. Web. 6 June 2013. <http://ijploum.wordpress.com/2009/02/17/understanding-embodiment-in-the-age-of-virtuality/>.

Rieder, Bernhard. “What Is in PageRank? A Historical and Conceptual Investigation of a Recursive Status Index.” Computational Culture 2 (2012): n. pag. Web. 13 May 2013. <http://computationalculture.net/article/what_is_in_pagerank>.

Rogers, Richard A. The End of the Virtual: Digital Methods. Amsterdam: Vossiuspers UvA, 2009. Print.

de Saussure, Ferdinand. Course in General Linguistics. Trans. Wade Baskin. New York: Columbia University Press, 1965. Print.

Scott, John. “Social Network Analysis.” Sociology 22.1 (1988): 109–127. Print.

Shaviro, Steven. “Money for Nothing: Virtual Worlds and Virtual Economies.” Unpublished Manuscript, 2007.

Slater, Don, and Daniel Miller. “Conclusions.” The Internet: An Ethnographic Approach. Oxford: Berg, 2000. 1–13. Print.

Stone, Zak, Todd Zickler, and Trevor Darrell. “Autotagging Facebook: Social Network Context Improves Photo Annotation.” Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW ’08) (2008): 1–8. Print.

Suchman, Lucy A. “Do Categories Have Politics?” Computer Supported Cooperative Work (CSCW) 2 (1993): 177–190. Print.

———. Plans and Situated Actions: The Problem of Human-Machine Communication. Palo Alto: Xerox Corporation, 1985. Print.

Sunstein, Cass. “Fragmentation and Cybercascades.” Republic.com. Princeton: Princeton University Press, 2001. 51–88. Print.

“The Open Graph Protocol.” Open Graph Protocol. Open Web Foundation, 18 Nov. 2012. Web. 5 June 2013. <http://ogp.me/>.

Travers, Jeffrey, and Stanley Milgram. “An Experimental Study of the Small World Problem.” Sociometry 32.4 (1969): 425–443. Print.

Ugander, Johan, et al. “The Anatomy of the Facebook Social Graph.” Computing Research Repository (CoRR) (2011): 1–17. Print.

Wiener, Norbert. Cybernetics, or Control and Communication in the Animal and the Machine. 1948. 2nd ed. Cambridge, MA: The MIT Press, 1961. Print.

Wolfram, Stephen. A New Kind of Science. Champaign: Wolfram Media, 2002. Print.

———. “Data Science of the Facebook World.” Wolfram|Alpha Blog. Wolfram Alpha LLC, 24 Apr. 2013. Web. 3 June 2013. <http://blog.wolframalpha.com/2013/04/24/data-science-of-the-facebook-world/>.

Woolgar, Steve. “Five Rules of Virtuality.” Virtual Society? Technology, Cyberbole, Reality. Ed. Steve Woolgar. Oxford: Oxford University Press, 2002. 1–22. Print.
