Beyond Surveillance: How Do Markets and Algorithms "Think"?

Author: Bernhard Rieder (New Media and Digital Culture, University of Amsterdam)

Abstract

This paper draws on Foucault's work to inspire and inform a conceptual investigation into the relationship between economic thinking, in particular concerning markets, and contemporary computing. This investigation takes the form of three intellectual probes. The first of these probes concerns the computer's capacity to lower transaction costs, the second proceeds through the framing of markets (and algorithms) as places of truth, and the third draws on Deleuze's reading of Foucault to inquire into the notion of order and ordering. Together, the three lines of inquiry attempt to outline an encounter between Foucault's thinking and computing that moves beyond the question of surveillance in favor of an emphasis on questions of epistemology.

Keywords: computing, epistemology, markets, neoliberalism, order, Deleuze, Foucault

How to Cite: Rieder, Bernhard. "Beyond Surveillance: How Do Markets and Algorithms 'Think'?" Le foucaldien 3, no. 1 (2017): 1–20. DOI: https://doi.org/10.16995/lefou.30 [Note: In 2022, Le foucaldien relaunched as Genealogy+Critique.]

Published on 2017-09-01

Peer Reviewed

1. Introduction

The decades since World War II have seen several waves of "computerization" that involved considerable changes and extensions in terms of what computers can do and what they look like, as well as significant incursions of computing technology into almost all domains of life. Each wave has raised a different set of issues and prompted the interest of scholars from the social sciences and humanities in different ways. One of the threads that run through the various stages is the question of how computers affect human communication and, consequently, social relationships and forms of organization. While earlier concerns such as the computerization of work processes, the emergence of virtual communities, or the rise of user-generated content are still heavily discussed, the issue that has probably received the most attention in recent years can be summarized under the moniker "big data", which includes the broader movement of "datafication" – "the ability to render into data many aspects of the world that have never been quantified before"1 – and the associated proliferation of algorithmic techniques that are supposed to make sense of the torrents of available information. As in so many other domains, Foucault's work has proven inspirational and useful for interpreting these technological phenomena against the backdrop of broader cultural configurations and developments. One could argue, however, that the focus has been squarely on the more obviously applicable parts of Foucault's work on surveillance and less on the philosopher's later work on governmentality. The latter notion has certainly received significant attention in its own right, but the role of media and computing technologies in the emergence of neoliberalism has not been a major theme, with the consequence that "the governmentalities of such technologies remain under-theorized".2 Authors like Amoore,3 Cheney-Lippold,4 and others have shown, however, that an investigation of the specific technological features of computing through the lens of Foucault's later work allows for an appraisal of the more subtle ways algorithmic data processing intervenes in various cultural domains.

This paper will not attempt to summarize existing work proceeding along these lines, but rather launch three connected probes that draw on Foucault's work in order to develop possible conceptual connections between computing technology and a central notion of (neo)liberal economic and political thought, that is, the market. My goal is to approach this relationship through the angle of "technicity",5 that is, through a perspective that considers the technical properties of artifacts to be more than epiphenomena of their social, cultural, or ideological embeddings, to possess their own specific features, logics, and performativities, and, therefore, to be worthy of critical appraisal in their own right. These technicities, I will argue, can be seen as supporting the actual implementation of neoliberalism in a very concrete sense, but they can also prompt us to inquire into the deeper epistemological character of market economics itself.

2. Computing and the Market Form

The first line of inquiry I would like to develop connects to a central point Foucault makes in his reading of German Ordoliberalism, which, together with his discussion of the Chicago school, forms what could be called a genealogy of neoliberal thought. Here, Foucault argues that, in contrast to classic economic liberalism, these later currents see well-functioning markets not as the natural outcome of laissez-faire, but as something that needs to be assured through constant governmental action. The market has to be organized and protected to be able to operate efficiently. One needs to "govern for the market rather than because of the market"6 and, in that sense, the market is something that has to be produced. But how does one produce markets? There is obviously no singular answer to this question, but several authors have suggested that information technology facilitates organization and coordination through market forms of interaction.

This argument was initially brought forward by Ciborra,7 who argued that the two prevalent ways to theorize the role of computers, the "data view" and the "decision-making view", needed to be complemented by what he called the "transaction cost approach".8 This approach builds on Coase's9 foundational inquiry into the nature of the firm, which asks why firms exist in the first place when most economists think of the economic system as essentially being coordinated by the pricing mechanism, i.e. the encounter of supply and demand. Why would one create a complicated hierarchical structure organized around long-term employment if everything needed to produce a commodity or service can be bought or contracted on markets for goods and labor? According to Coase, firms exist because "there is a cost of using the price mechanism",10 namely the cost of discovering relevant prices, the cost of making contracts, and the cost of controlling outcomes and adjusting for changing conditions when contracts exceed a certain time span. In short, both firms and markets incur "transaction costs" when organizing economic activity, and in many situations the firm is the more efficient organizational form.
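
To make this reasoning concrete, consider a back-of-the-envelope comparison (a minimal sketch in Python with purely illustrative numbers, not drawn from Coase or Ciborra): coordination gravitates towards whichever form, market or firm, incurs the lower total transaction cost, and any technology that lowers search, contracting, and monitoring costs shifts the balance towards the market.

```python
def market_cost(n_transactions, search=4.0, contracting=3.0, monitoring=2.0):
    """Cost of coordinating n transactions through the price mechanism.

    The three per-transaction cost factors mirror Coase's list: discovering
    prices and partners, making contracts, controlling outcomes.
    """
    return n_transactions * (search + contracting + monitoring)

def firm_cost(n_transactions, fixed_overhead=100.0, marginal_admin=1.5):
    """Cost of coordinating the same transactions inside a hierarchy."""
    return fixed_overhead + n_transactions * marginal_admin

for n in (5, 50):
    print(n, "market:", market_cost(n), "firm:", firm_cost(n))
# With few transactions the market is cheaper; at scale the firm wins.
# Lowering the per-transaction costs (e.g. via information technology)
# moves the crossover point and expands the domain of the market form.
```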

Based on Coase's analysis and Williamson's extended formulation, Ciborra develops a perspective that frames information systems as "mediating technologies"11 and highlights their role "as a means to streamline exchange transactions".12 In direct reference to Coase's three cost factors for using the price mechanism, he argues:

The costs of organizing, i.e. costs of coordination and control, are decreased by information technology which can streamline all or part of the information processing required in carrying out an exchange: information to search for partners, to develop a contract, to control the behavior of the parties during contract execution and so on.13

While Ciborra is mainly interested in developing a conceptual and methodological matrix for analyzing and theorizing the role of information technologies in organizations, he prepares a broader discussion of social relevance by arguing that these technologies are, indeed, "a means for creating/expanding markets, by lowering search, contracting and control costs".14 Information technology can thus be seen as a means for supporting and enabling markets. The "ride sharing" company Uber can serve as the canonical example here. The central elements of Uber's business are the apps it offers to both riders and drivers as well as the data infrastructures these apps connect to. They match buyers and sellers in physical space, implement a contractual framework through terms of service, handle payment, and constantly monitor participants' behavior through a series of feedback mechanisms. Uber can thereby organize the economic activity of providing personal transportation on demand through a market rather than a firm, at least for the part concerning the actual driving.
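
A minimal sketch (hypothetical Python, not Uber's actual system) of what such a mediating technology does: matching, pricing, and contracting are collapsed into a single automated step, so that neither rider nor driver has to search for partners, discover a price, or negotiate a contract.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Driver:
    driver_id: str
    x: float  # position in an illustrative planar projection
    y: float
    available: bool = True

def match_and_price(pickup, dropoff, drivers, base_fare=2.5, per_km=1.2):
    """Match a ride request to the nearest available driver and quote a fare.

    This stands in for the search, contracting, and pricing steps that would
    otherwise generate transaction costs for both parties.
    """
    candidates = [d for d in drivers if d.available]
    if not candidates:
        return None
    driver = min(candidates, key=lambda d: hypot(d.x - pickup[0], d.y - pickup[1]))
    distance_km = hypot(dropoff[0] - pickup[0], dropoff[1] - pickup[1])
    fare = base_fare + per_km * distance_km
    driver.available = False  # the "contract" is concluded automatically
    return driver.driver_id, round(fare, 2)

drivers = [Driver("d1", 0.0, 0.0), Driver("d2", 3.0, 4.0)]
print(match_and_price((1.0, 1.0), (5.0, 5.0), drivers))  # ('d1', 9.29)
```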

Agre15 later significantly extends Ciborra's argument by showing, in detail, how information technology is capable of lowering transaction costs through "capture", that is, through forms of grammatization and disambiguation that apply when activities become mediated by computers. In the case of Uber, the apps and their backend structure the transaction in all respects, from the formalization of physical space through detailed maps and GPS functionality to the fully automated process of pricing and payment. Agre again emphasizes the resulting expansion of market forms of organization:

In other words, by imposing a mathematically precise form upon previously unformalized activities, capture standardizes those activities and their component elements and thereby prepares them […] for an eventual transition to market-based relationships.16

Agre's broader societal perspective goes as far as connecting the effects of computerization to themes often associated with neoliberal governance, such as the growth in temporary employment or the trend towards outsourcing of noncore functions,17 which also figure prominently in the work of authors analyzing the changing fabric of network economies.18

Looking beyond Uber at some of the most successful online ventures of the last decade, we notice that their common blueprint involves using information technology to create or support some kind of marketplace where units of various kinds are made available. The Web makes documents available. Amazon makes consumer goods available. Spotify and Netflix make music and audiovisual content available, respectively. Uber, indeed, makes units of transportation available, Airbnb units of housing. Facebook, OkCupid, Meetup, and Monster all make people available, even if they do so quite differently. Since these online marketplaces often dominate their specific niche and are generally much less limited in geographical and logistical terms than their offline equivalents, they tend to host large numbers of units and participants. Data collection, algorithmic filtering, and activity-oriented grammatization on the interface level are central to how they function and even necessary for them to function in the first place. Foucault's assessment that neoliberalism sees competition rather than exchange as the essential feature of markets19 holds for most online marketplaces as well. The units made available through the above-mentioned platforms are vying for attention, and the omnipresence of counters and rankings that measure and evaluate popularity, relevance, or authority is essential to reinforcing a state of permanent competition where everyone is constantly updated on where they stand. The goal is to activate the individual to become an "entrepreneur of himself" (entrepreneur de lui-même) through constant investment in their own human capital.20 Software, again, is used to formalize and disambiguate notions of value, and the resulting value signals are directed both to market participants and to ranking algorithms. The ratings Uber drivers receive from riders may certainly lead clients to cancel a trip or the company to cancel an account, but they could also be used as a factor in the automatic matching technique used to connect buyers and sellers, so that negative reviews lead to fewer pickup opportunities. Unsurprisingly, Uber provides a detailed list of tips for how drivers can keep their ratings high.21 In Foucault's terms, we could argue that this is where markets and disciplines intersect.
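
The speculation that ratings could be used as a factor in the automatic matching technique can be sketched as follows (a hypothetical scoring rule, not Uber's documented behavior): proximity and rating are folded into a single dispatch score, so that a lower rating systematically translates into fewer pickup opportunities.

```python
def dispatch_score(distance_km, driver_rating, rating_weight=1.0):
    """Rank candidate drivers for a pickup; lower is better.

    A poor rating inflates the effective distance, turning the disciplinary
    value signal into a market outcome: fewer matches for low-rated drivers.
    The weight is illustrative.
    """
    rating_penalty = (5.0 - driver_rating) * rating_weight
    return distance_km + rating_penalty

candidates = [("d1", 1.2, 4.9), ("d2", 0.8, 4.3)]  # (id, km away, rating)
best = min(candidates, key=lambda c: dispatch_score(c[1], c[2]))
print(best[0])  # "d1": the better-rated driver wins despite being farther away
```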

According to Gane, similar objectives are being applied to society at large "through the mobilization of new forms of governmental intervention that are designed to inject regulatory principles of competition into all forms of social life and culture – principles that are mobilized through new techniques of audit and classification".22 Assessments along these lines have become almost commonplace, but the role information technology plays in this context remains underexplored. As Ciborra's transaction cost approach shows, the focus on quantification and evaluation is not enough; we also need to consider the computer's capacity to enable, support, and orchestrate market-based relationships. There is a more fundamental "effect" on forms of social organization that, through the lens of Foucault's conceptual apparatus, opens a road towards a deeper analysis of how computers and markets mingle.

3. Places of Truth

The second probe approaches this relationship from a different angle and asks how thinking about markets as "places of truth" (lieux de vérité) can help us understand certain aspects of contemporary computing. In his lectures on the birth of biopolitics (Naissance de la biopolitique), Foucault approaches liberal and neoliberal governmentality not just on the level of economic normativity, but pays significant attention to the more fundamental epistemological underpinnings of these "arts of governing" (arts de gouverner). This means, first and foremost, that he studies them in a fashion similar to earlier projects as "regimes of truth" (régimes de vérité), that is, as series of discursive practices that form intelligible ensembles, which delineate objects and methods of inquiry, problem spaces, approaches to explanation, and the conditions for distinguishing true and false.23 Although he no longer uses the term épistémè – another concept concerned with the delineation of discourses or knowledge formations – at this point, we can connect the notion of governmentality without too much difficulty to Foucault's earlier investigation, to be found in The Order of Things,24 into the emergence of political economy as part of the transition from the classic to the modern regime of truth. Looking a little further into this background is useful for situating the concept of the market more sharply.

The birth of political economy – or simply of modern economics – around the end of the 18th century is part of the shift from the age of representation, organized around categorization and the attribution of difference and identity, to the modern épistémè, which "thinks" through epistemic forms that revolve around contingent forces unfolding over time. In the case of economics, this means moving from a science of wealth to one that identifies labor and production as the essential drivers of prosperity. Whereas the value of an item was previously based on an assessment of similarities and differences with other items, Smith and Ricardo identify the hours of work necessary to produce something as the "true" foundation of its value. According to Foucault,25 Ricardo fully shifts the center of attention to the process of production by considering not just the direct work going into a certain product, but also all the preparatory or auxiliary work that runs through integrated yet specialized instances of larger systems of production, such as the provision of primary materials, tools, and organizational structures. In this move, production – and through it economics as a discipline – acquires a certain depth or "thickness", in the sense that it is identified as having its own specific principles and regularities, its own process of contingent historical development.

We find a parallel shift in Security, Territory, Population,26 the lecture series that precedes the already mentioned Birth of Biopolitics and shares its preoccupation with questions of governmentality. Here, Foucault discusses an epistemological break in the art of governing that revolves around the emergence of the problem of the population. This break is again central to political economy since the population replaces the family as the central model for pondering the government of a nation.27 In a fashion similar to the "thickening" of the concept of production, the population is now attributed a specificity of its own, that is, its own irreducible laws, regularities, and dynamics, which manifest through phenomena like epidemics or the various self-reinforcing spirals observable in the economy. Certainly, Foucault attributes the "discovery" of the population to Physiocrats like Quesnay and Turgot rather than to Smith and Ricardo and singles out the nascent discipline of statistics as crucial to the conceptualization of this newly identified entity, but the two threads form a coherent whole.

The market becomes a place of truth at the precise moment when production and population are being understood through the lens of an economic "naturalism"28 as forces that have their own properties and "spontaneous mechanisms",29 since the market is the principal concept for conceiving how these forces play out over time. According to Foucault,30 the market thus emerges in liberal economics as the place where these mechanisms engage in their natural interactions, which must therefore not be altered, but also as the place where the interplay of supply and demand yields the "right" or "true" price.31 The market is thus not just seen as a place of exchange, but also as a "calculative device"32 that generates what could be called a "knowledge product", information that economists consider central to market participants' decision making concerning investment, production, and distribution. Connecting back to the previous section, it is the price mechanism that makes it possible for Coase and others to think about markets as systems for coordination in the first place.

I would like to argue that investigating markets as places of truth in this two-fold way, as both arenas for interactions and as calculative devices, opens up interesting ways to scrutinize both the realities and the imaginaries of contemporary computing. There would be much to learn from an exploration of the homology and cross-fertilization between economics and the different types of algorithms that generate models or calculate metrics based on the iterative processing of atomized entities. In previous work,33 I have tried to show how algorithms like PageRank draw on economic thinking to fuse the linking behavior of millions of web page creators into a singular measure of authority. But we see similar approaches in recommendation engines, price auction systems, machine learning techniques, and computer simulations. In all of these instances, we find both the calculation of some emergent property (e.g. a ranking or a classification) and a naturalizing narrative that justifies the calculative procedure. Much like in economics, these narratives are often either axiomatic or fundamentally vague. In the case of Uber, the justification for dynamic pricing of rides relies on a textbook explanation for how the pricing mechanism is thought to coordinate economic activity:

Dynamic pricing may cause fares to temporarily increase. This encourages more drivers to get on the road and head to areas of the city where demand for rides is higher than drivers' ability to accommodate all the ride requests happening at any moment.34
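
As a minimal illustration of the textbook logic quoted above (a hypothetical formula, not Uber's proprietary one), a surge multiplier can be derived directly from the ratio of open ride requests to available drivers:

```python
def surge_multiplier(open_requests, available_drivers, sensitivity=0.5):
    """Naive dynamic pricing: scale fares with excess demand.

    When requests outstrip drivers, the multiplier rises, which is supposed
    to both ration demand and pull more drivers onto the road. The
    sensitivity parameter is illustrative.
    """
    available_drivers = max(available_drivers, 1)  # avoid division by zero
    ratio = open_requests / available_drivers
    return max(1.0, 1.0 + sensitivity * (ratio - 1.0))

print(surge_multiplier(30, 10))  # 2.0 during a demand spike
print(surge_multiplier(5, 10))   # 1.0 when supply is ample
```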

While this is not explicit in Foucault's reading, the factors going into the price mechanism have been heavily debated and economists sometimes emphasize the cost of production on the supply side or subjective notions of utility on the demand side. But independently of how we think it really works, the market as mechanism always yields a price as long as there are market participants engaging in transactions. Truth, in that sense, has not only an epistemological, but also a logistical dimension. What unites markets and many contemporary algorithms is that they emphasize the production side of truth over its epistemological counterpart. In the most radical formulations of marginalist theory, the question of what the value of an item is ultimately based on becomes irrelevant. The right price is the one set by the market, because the market fuses all of the subjective and idiosyncratic reasons people might have to want to buy something into a single value.

In his attempt to supplement and extend Foucault's genealogy of neoliberalism, Gane35 rightly insists on the importance of Hayek's contribution, which helps to clarify the argument I am trying to develop. Hayek's endorsement of competitive markets as mechanisms for coordination revolves around "the unavoidable imperfection of man's knowledge and the consequent need for a process by which knowledge is constantly communicated and acquired".36 For Hayek, the price mechanism is the (only) process capable of bringing together the "dispersed bits of incomplete and frequently contradictory knowledge"37 and it is therefore preferable to any form of conscious direction. What we find both in neoliberal economics and in any area where decision-making is shifted from individuals to procedures and mechanisms is a deep distrust of the capacity of humans – as individuals or institutions – to make "good" and "fair" decisions. Algorithms, much like markets, are increasingly framed in these terms, with both epistemological and moral connotations. Based on their ability to consider very large numbers of signals, they are presented as capable of transcending individuals' "unavoidable imperfection", understood as both their limited knowledge and their self-interest.

To be clear, I am not implying that this view is "wrong". But it becomes increasingly clear that the extension of the price mechanism to more and more domains of life, invariably with the help of information technology, clashes with other systems of valuation. Uber's dynamic pricing, for example, has been repeatedly criticized,38 in particular in the context of emergency situations, when surging prices are seen as a troubling instance of immoral profiteering. And this is only the very tip of the iceberg when it comes to the many social consequences – including generalized precarity and permanent pressure to perform – that the spread of market forms of organization and competition seems to imply. The next section will address this question from yet another angle.

4. The Question of Order

My third attempt to link Foucault's thinking to contemporary issues concerning computing initially departs from the later work on governmentality and the market. Drawing on Deleuze's reading of Foucault, I would like to address the question of order and ordering. Besides the often cited Post-scriptum,39 which mentions computers quite explicitly, it is the Annexe to Deleuze's 1984 book Foucault40 that serves as the starting point for my argument. In this relatively short section, Deleuze provides a highly conceptual reading of the succession of épistémès Foucault constructs in The Order of Things, which I already alluded to above.

Here, the classic épistémè is read through the notion of "unfolding" (déplier) and coupled with the forces of infinity. The logic of representation incessantly produces two-dimensional tables that define the bounds of the order of things; concrete entities do not define this space, but are merely positioned within it through the attribution of identity and difference with other entities. The modern épistémè first appears as a perturbation of the classic order. There are irreducible and contingent forces – life, work, language – that break through the preset representational grids ordering the entities these forces are entangled with. In Darwin's work, for example, there is no predefined Regnum Animale that covers all living beings and their infinite variations. On the contrary, the Tree of Life starts with a single organism and the way it evolves is contingent and dependent on interactions with the environment. There is no eternal plan or order: life sprawls in different directions through successions of abundant yet finite variations. According to Deleuze, the modern épistémè is marked by an empiricism organized around the continuous "folding" (plier) of the forces of life, work, and language. The idea that both population and production are defined by such irreducible processes connects directly to this characterization. History is not simply variation on a constant theme, but a process of becoming. In this context, the market is understood as a place of emergence where the concrete interactions of market forces produce contingent outcomes.

Rather than stopping at this point, Deleuze attempts to address a question Foucault famously evokes at the end of The Order of Things and asks what comes beyond the modern épistémè. It makes sense to quote the central passage of Deleuze's argument in full:

It took biology leaping into molecular biology, or dispersed life gathering in the genetic code. It took dispersed work assembling or regrouping in machines of the third kind that are cybernetic and informatic. What would be the forces at play that the forces in man would now enter into relation with? This would no longer be the elevation to the infinite, nor finitude, but a fini-unlimited, to thus name every situation of force where a finite number of components yield a practically unlimited diversity of combinations.41

I would like to argue that this notion of the "fini-unlimited"42 provides a compelling way to deepen our understanding of both markets and contemporary algorithmic techniques. To make this argument, it is crucial to address the question of order. According to the Oxford English Dictionary, the term denotes "the arrangement or disposition of people or things in relation to each other according to a particular sequence, pattern, or method". Foucault's épistémès are not only connected to particular visual forms of arranging, such as the table or the tree, but they contain specific ideas about the nature of order itself. In the classic period, order is thought to be pre-given, a "God-form"43 that runs through the things themselves, constantly unfolding according to eternal, unchanging principles. The scholar observes, designates, and takes inventory; and although words and things are considered to be distinct, a well-built analytical language or taxonomy keeps them from falling apart by producing a correct account of a world "given without rupture to representation".44 In the modern period, however, order is an outcome, something that is produced by the processes of life, work, and language. In Hayek's "epistemological" neoliberalism, the market orders, that is, it creates the information that will allow and incite participants to make the best decisions concerning investment, production, and distribution. Uber's surge pricing is supposed to bring more drivers to the streets. For neoliberalism, the market is thus not just a calculative device, but an ordering device that coordinates activities and flows, that picks winners and losers, that identifies what is worthy of attention. The ways online platforms rank and filter connect directly to this principle.

How does the notion of the fini-unlimited lead to a third understanding of order? The crucial element, here, is the idea that a limited number of elements can yield an almost unlimited number of combinations. For anyone acquainted with contemporary techniques for data analysis, this should ring familiar. For any sufficiently complex dataset, the idea that "the data speak for themselves" is nonsensical; we select from a wide variety of mathematical and visual methods to make the data speak, to investigate them from different angles, to answer questions that orient how we look at them. There is a guiding interest against which the data are made meaningful. A similar argument can be made in reference to what is arguably one of the most important inventions in the field of computing, the relational database model.45 The central idea informing this model is to cut data into the smallest parts possible to allow for the dynamic recombination of information at retrieval time through a powerful query language that allows for a maximum number of "views". Outputs are ordered in reference to the question asked. To cite another example, machine learning techniques, e.g. Bayes classifiers used for spam email filtering, can be seen as practical techniques for creating such views, models, or combinations: by "showing" the filter which mails we consider spam, it "learns" to treat each word or feature in a corpus as an indicator of spamminess.46 And no two users' classifier profiles will be exactly the same, not only because they receive different emails, but also because they will have different definitions of what constitutes an "unwanted" message. In short, the fini-unlimited is the very domain in which computers excel. But what does this mean in the context of a discussion of markets?
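
A minimal naive Bayes sketch (far simpler than the production filters alluded to above) makes this interested reading tangible: the model is nothing but word counts accumulated from one user's labelling decisions, so two users training on different mail, or with different notions of what is "unwanted", end up with different classifiers.

```python
from collections import Counter
from math import log

class SpamFilter:
    """Toy naive Bayes classifier: each word becomes an indicator of 'spamminess'."""

    def __init__(self):
        self.counts = {"spam": Counter(), "ham": Counter()}
        self.totals = {"spam": 0, "ham": 0}

    def train(self, text, label):
        # accumulate word counts per class from the user's labelling decisions
        for word in text.lower().split():
            self.counts[label][word] += 1
            self.totals[label] += 1

    def is_spam(self, text):
        vocab = len(self.counts["spam"] | self.counts["ham"]) or 1
        scores = {}
        for label in ("spam", "ham"):
            total = self.totals[label]
            # log-likelihood with Laplace smoothing; class priors omitted for brevity
            scores[label] = sum(
                log((self.counts[label][w] + 1) / (total + vocab))
                for w in text.lower().split()
            )
        return scores["spam"] > scores["ham"]

f = SpamFilter()
f.train("win a free prize now", "spam")
f.train("meeting agenda for tomorrow", "ham")
print(f.is_spam("free prize"))  # True, given this user's training history
```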

For the liberal economists Foucault discusses, the market is an "unproblematic" concept, not in a political, but in an epistemological sense: a market exists wherever supply and demand meet, and there is little more to say about it. For the neoliberals, things are more complicated, because the market is under constant threat of "distortion" and the state therefore needs to govern "for the market" to guarantee that it functions well. But it remains a mechanism. Only more recently have economists begun to explicitly assert that markets are not created equal, and new fields such as "market design"47 have emerged as a consequence. Authors in these fields may still consider supply and demand to be "natural" forces, but their encounter and interactions are orchestrated by specific mechanisms that skew outcomes. The job market and the stock market may both connect supply and demand, but if we look at them in more detail, we notice that they work according to quite different rules. If we change these rules, outcomes will be affected. While Foucault's neoliberals already saw the market as something that needs to be supported by constant governmental action,48 market design starts from the idea that market mechanisms – or algorithms – are the result of implicit or explicit engineering and can take a wide variety of forms. We thus move from an understanding of markets as a singular mechanism to one that focuses on the specific ways different designs implement different mechanisms. In the case of Uber, for example, Dholakia suggests not only marketing surge pricing more effectively to users, but also capping the surge multiplier and designing the algorithm in a way that reduces fluctuation to create a clearer and more predictable price curve.49 The company has also begun to suspend surge pricing altogether in specific situations.50
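
Dholakia's two design suggestions can be sketched as a small modification of the naive surge rule above (parameter values are again illustrative): a hard cap on the multiplier and exponential smoothing that flattens fluctuations into a more predictable price curve.

```python
def designed_surge(raw_multiplier, previous, cap=2.5, smoothing=0.3):
    """A 'designed' surge rule: cap the multiplier and smooth its changes.

    Same supply-and-demand input, different mechanism, different outcomes,
    which is the point of the market-design perspective.
    """
    capped = min(raw_multiplier, cap)
    # exponential smoothing: move only part of the way towards the new value
    return previous + smoothing * (capped - previous)

surge = 1.0
for raw in [1.0, 3.5, 4.0, 2.0, 1.0]:  # a volatile demand spike
    surge = designed_surge(raw, surge)
    print(round(surge, 2))  # prints 1.0, 1.45, 1.77, 1.84, 1.59
```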

From this perspective, markets still appear as ordering devices, but they move from the natural domain into the domain of ethics and politics. And the same holds for the many domains where algorithms perform ordering functions online. If we simplify Facebook for a moment as a marketplace for ideas that circulate in the form of posts, we can argue that the filtering mechanism that decides what to show in our News Feeds could be based on principles other than some form of output maximization – or on maximizing an output other than advertising revenue. There is also no (technical) reason why Sunstein's call for implementing an element of serendipity51 into these filters could not be heeded. These and other choices are, indeed, choices.
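
To underline that these are indeed choices, a hypothetical feed ranker (not Facebook's actual system) could blend predicted engagement with a serendipity term along the lines Sunstein calls for, reserving a share of slots for items from outside the engagement-maximizing top of the ranking:

```python
import random

def rank_feed(posts, engagement_score, serendipity_share=0.2, seed=None):
    """Rank a feed by predicted engagement, but splice in a share of randomly
    chosen lower-ranked items, one possible 'element of serendipity'.
    """
    rng = random.Random(seed)
    ranked = sorted(posts, key=engagement_score, reverse=True)
    n_random = int(len(ranked) * serendipity_share)
    if n_random == 0:
        return ranked
    picks = rng.sample(ranked[n_random:], n_random)  # draw from below the top
    feed = [p for p in ranked if p not in picks]
    for i, p in enumerate(picks):
        feed.insert((i + 1) * 3, p)  # every third slot, purely illustrative
    return feed

posts = [f"post{i}" for i in range(10)]
scores = {p: i for i, p in enumerate(posts)}  # post9 is the most "engaging"
print(rank_feed(posts, scores.get, seed=1))
```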

The potential variation and multiplicity of views, combinations, or interaction designs puts the focus squarely on the mechanism/algorithm and the specific ways it describes and intervenes. This shift sidelines the onerous debate over whether algorithms are "objective" and highlights the more productive – and much more complicated – question of what a fair algorithm would look like in a specific context. We can thus return to Ciborra's transaction cost approach with fresh eyes and ask how systems design not only changes the cost of transactions, but also their dynamics, outcomes, and politics.

5. Conclusions

Although these three lines of inquiry require much more work to become more than intellectual probes, I hope to have shown how Foucault's work can inform and inspire an examination of the relationship between governmentality and computing that goes beyond the question of surveillance. While my first argument concerned the capacity of information technology to concretely support the expansion of market forms of organization, the latter two focused on the broader epistemological connections between economic thinking and computing. In all of this, my goal was not to frame computers as "agents" of neoliberalism, but to ask how we can understand computers as political machines in a way that treats them as more than tools in the hands of actors or ideologies. I want to suggest that Foucault's greatest lesson was to challenge us to ask how discourses "think" – to inquire into what they hold to be true about the world and into how they inform evidence, argument, and truth – in order to help us interrogate and critique the societies we live in. And these societies are more and more organized around market economies on the one hand and computer networks on the other. What I have tried to show is that these two developments hang together in various ways, that they share epistemological commitments and reinforce each other. But I have also highlighted that markets and market forms of organization supported by information technology now appear as neither natural nor singular forms. And markets are, increasingly, designed – preferably with the help of software. This means that their specific properties are produced and amenable to change. The debate about the limits of market logic therefore needs to be extended to include the question of which logics markets can and should implement in the first place.

Notes

  1. Kenneth Cukier and Viktor Mayer-Schoenberger: The Rise of Big Data: How It's Changing the Way We Think About the World, in: Foreign Affairs, 92/3 (2013), pp. 28–40, here p. 29. [^]
  2. Nicholas Gane: The Governmentalities of Neoliberalism: Panopticism, Post-Panopticism and Beyond, in: Sociological Review, 60/4 (2012), pp. 611–634, here p. 631. [^]
  3. Louise Amoore: Data Derivatives: on the Emergence of a Security Risk Calculus for Our Times, in: Theory, Culture & Society, 28/6 (2011), pp. 24–43. [^]
  4. John Cheney-Lippold: A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control, in: Theory, Culture & Society, 28/6 (2011), pp. 164–181. [^]
  5. Gilbert Simondon: Du mode d'existence des objets techniques, Paris: Aubier 1958. [^]
  6. "Il faut gouverner pour le marché, plutôt que gouverner à cause du marché." Michel Foucault: Naissance de la biopolitique, Paris: Gallimard Seuil 2004, p. 125f., author's translation. [^]
  7. Claudio U. Ciborra: Reframing the Role of Computers in Organizations. The Transaction Costs Approach, in: Lynn Gallegos, Richard Welke and James C. Wetherbe (eds.). Proceedings of the Sixth International Conference on Information Systems, Indianapolis IN, December 1985, pp. 57–69. [^]
  8. Most probably in direct reference to Oliver E. Williamson: The Economics of Organization: The Transaction Costs Approach, in: American Journal of Sociology, 87/3 (1981), pp. 548–577. [^]
  9. Ronald H. Coase: The Nature of the Firm, in: Economica, 4/16 (1937), pp. 386–405. [^]
  10. Coase: The Nature of the Firm, p. 390. [^]
  11. Ciborra: Transaction Costs Approach, p. 63. [^]
  12. Ciborra: Transaction Costs Approach, p. 57. [^]
  13. Ciborra: Transaction Costs Approach, p. 63. [^]
  14. Ciborra: Transaction Costs Approach, p. 63. [^]
  15. Philip E. Agre: Surveillance and Capture: Two Models of Privacy, in: The Information Society, 10/2 (1994), pp. 101–127. [^]
  16. Agre: Surveillance and Capture, p. 120. [^]
  17. Agre: Surveillance and Capture, p. 121. [^]
  18. For example, Manuel Castells: Materials for an Exploratory Theory of the Network Society, in: The British Journal of Sociology, 51/1 (2000), pp. 5–24. [^]
  19. Foucault: Naissance de la biopolitique, p. 122. [^]
  20. Foucault: Naissance de la biopolitique, p. 232. [^]
  21. URL: https://help.uber.com/h/f0934623-5fbc-4628-8dd0-565d5e451882. [^]
  22. Nicholas Gane: The Emergence of Neoliberalism: Thinking Through and Beyond Michel Foucault's Lectures on Biopolitics, in: Theory, Culture & Society, 31/4 (2014), pp. 3–27, here p. 20. [^]
  23. Foucault: Naissance de la biopolitique, p. 20f. [^]
  24. Michel Foucault: Les mots et les choses, Paris: Gallimard 1966. [^]
  25. Foucault: Les mots et les choses, p. 266. [^]
  26. Michel Foucault: Sécurité, Territoire, Population, Paris: Gallimard Seuil 2004. [^]
  27. Foucault: Sécurité, Territoire, Population, p. 107ff. [^]
  28. Foucault: Naissance de la biopolitique, p. 63. [^]
  29. Foucault: Naissance de la biopolitique, p. 63. [^]
  30. Foucault: Naissance de la biopolitique, p. 33f. [^]
  31. While Foucault references Alfred Marshall's work, his discussion of marginalism remains tentative. [^]
  32. Michel Callon and Fabian Muniesa: Peripheral Vision: Economic Markets as Calculative Collective Devices, in: Organization Studies, 26/8 (2005), pp. 1229–1250. [^]
  33. Bernhard Rieder: What is in PageRank? A Historical and Conceptual Investigation of a Recursive Status Index, in: Computational Culture, 2 (2012), URL: http://computationalculture.net/article/what_is_in_pagerank. [^]
  34. URL: https://help.uber.com/h/34212e8b-d69a-4d8a-a923-095d3075b487. [^]
  35. Gane, Emergence of Neoliberalism. [^]
  36. Friedrich A. Hayek: The Use of Knowledge in Society, in: The American Economic Review, 35/4 (1945), pp. 519–530, here p. 530. [^]
  37. Hayek: The Use of Knowledge in Society, p. 519. [^]
  38. Utpal M. Dholakia: Everyone Hates Uber's Surge Pricing – Here's How to Fix It, in: Harvard Business Review (December 21, 2015), URL: https://hbr.org/2015/12/everyone-hates-ubers-surge-pricing-heres-how-to-fix-it. [^]
  39. Gilles Deleuze: Post-scriptum sur les sociétés de contrôle, in: Pourparlers. 1972–1990, Paris: Les Éditions de Minuit 1990, pp. 240–247. [^]
  40. Gilles Deleuze: Foucault, Paris: Les Éditions de Minuit 1984. [^]
  41. "Il a fallu que la biologie saute dans la biologie moléculaire, ou que la vie dispersée se rassemble dans le code génétique. Il a fallu que le travail dispersé se rassemble ou se regroupe dans les machines de troisième espèce, cybernétiques et informatiques. Quelles seraient les forces en jeu, avec lesquelles les forces dans l'homme entreraient alors en rapport ? Ce ne serait plus l'élévation à l'infini, ni la finitude, mais un fini-illimité, en appelant ainsi toute situation de force où un nombre fini de composants donne une diversité́ pratiquement illimitée de combinaisons." Deleuze: Foucault, p. 140, author's translation. [^]
  42. While the common translation of "fini-illimité" as "unlimited finity" may be more elegant than "fini-unlimited", this amounts to a rather drastic change in emphasis. For a discussion of that notion from a different angle see Alex Galloway: Computers and the Superfold, in: Deleuze Studies, 6/4 (2012), pp. 513–528. [^]
  43. Deleuze: Foucault, p. 132. [^]
  44. Foucault: Les Mots Et Les Choses, p. 219. [^]
  45. Edgar F. Codd: A Relational Model of Data for Large Shared Data Banks, in: Communications of the ACM, 13/6 (1970), pp. 377–387. [^]
  46. For a (much) deeper examination of machine learning, see Bernhard Rieder: Scrutinizing an Algorithmic Technique. The Bayes Classifier as Interested Reading of Reality, in: Information, Communication & Society, 30/1 (2017), pp. 100–117. [^]
  47. A summary is beyond the scope of this article, but it is useful to mention the work of Alvin E. Roth and Lloyd Shapley, who received the Nobel Memorial Prize in Economic Science in 2012 for their contributions to the field of market design. [^]
  48. Foucault: Naissance de la biopolitique, p. 125f. [^]
  49. Dholakia: Everyone Hates Uber's Surge Pricing. [^]
  50. David Z. Morris: Uber Responds to Rage over Alleged 'Strikebreaking' During Immigration Protests, in: Fortune Magazine (January 29, 2017), URL: http://fortune.com/2017/01/29/uber-immigration-protests. [^]
  51. Cass R. Sunstein: #Republic: Divided Democracy in the Age of Social Media, Princeton NJ: Princeton University Press 2017. [^]

Bibliography

Agre, Philip E.. Surveillance and Capture: Two Models of Privacy. In: The Information Society, 10/2 (1994), pp. 101–127. DOI:  http://doi.org/10.1080/01972243.1994.9960162

Amoore, Louise. Data Derivatives: on the Emergence of a Security Risk Calculus for Our Times. In: Theory, Culture & Society, 28/6 (2011), pp. 24–43. DOI:  http://doi.org/10.1177/0263276411417430

Callon, Michel and Fabian Muniesa. Peripheral Vision: Economic Markets as Calculative Collective Devices. In: Organization Studies, 26/8 (2005), pp. 1229–1250. DOI:  http://doi.org/10.1177/0170840605056393

Castells, Manuel. Materials for an Exploratory Theory of the Network Society. In: The British Journal of Sociology, 51/1 (2000), pp. 5–24. DOI:  http://doi.org/10.1080/000713100358408

Cheney-Lippold, John. A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control. In: Theory, Culture & Society, 28/6 (2011), pp. 164–181. DOI:  http://doi.org/10.1177/0263276411424420

Ciborra, Claudio U.. Reframing the Role of Computers in Organizations. The Transaction Costs Approach. In: Lynn Gallegos, Richard Welke and James C. Wetherbe (eds.). Proceedings of the Sixth International Conference on Information Systems. Indianapolis IN, December 1985, pp. 57–69.

Coase, Ronald H.. The Nature of the Firm. In: Economica, 4/16 (1937), pp. 386–405. DOI:  http://doi.org/10.1111/j.1468-0335.1937.tb00002.x

Codd, Edgar F.. A Relational Model of Data for Large Shared Data Banks. In: Communications of the ACM, 13/6 (1970), pp. 377–387. DOI:  http://doi.org/10.1145/362384.362685

Cukier, Kenneth and Viktor Mayer-Schoenberger. The Rise of Big Data: How It's Changing the Way We Think About the World. In: Foreign Affairs, 92/3 (2013), pp. 28–40.

Deleuze, Gilles. Foucault. Paris: Les Éditions de Minuit 1984.

Deleuze, Gilles. Post-scriptum sur les sociétés de contrôle. In: Pourparlers. 1972–1990. Paris: Les Éditions de Minuit 1990, pp. 240–247.

Dholakia, Utpal M.. Everyone Hates Uber's Surge Pricing – Here's How to Fix It. In: Harvard Business Review (December 21, 2015), URL: https://hbr.org/2015/12/everyone-hates-ubers-surge-pricing-heres-how-to-fix-it.

Foucault, Michel. Les mots et les choses. Paris: Gallimard 1966.

Foucault, Michel. Naissance de la biopolitique. Paris: Gallimard Seuil 2004.

Foucault, Michel. Sécurité, Territoire, Population. Paris: Gallimard Seuil 2004.

Galloway, Alex. Computers and the Superfold. In: Deleuze Studies, 6/4 (2012), pp. 513–528. DOI:  http://doi.org/10.3366/dls.2012.0080

Gane, Nicholas. The Emergence of Neoliberalism: Thinking Through and Beyond Michel Foucault's Lectures on Biopolitics. In: Theory, Culture & Society, 31/4 (2014), pp. 3–27. DOI:  http://doi.org/10.1177/0263276413506944

Gane, Nicholas. The Governmentalities of Neoliberalism: Panopticism, Post-Panopticism and Beyond. In: Sociological Review, 60/4 (2012), pp. 611–634. DOI:  http://doi.org/10.1111/j.1467-954X.2012.02126.x

Hayek, Friedrich A.. The Use of Knowledge in Society. In: The American Economic Review, 35/4 (1945), pp. 519–530.

Morris, David Z.. Uber Responds to Rage over Alleged 'Strikebreaking' During Immigration Protests. In: Fortune Magazine (January 29, 2017), URL: http://fortune.com/2017/01/29/uber-immigration-protests.

Rieder, Bernhard. Scrutinizing an Algorithmic Technique. The Bayes Classifier as Interested Reading of Reality. In: Information, Communication & Society, 30/1 (2017), pp. 100–117. DOI:  http://doi.org/10.1080/1369118X.2016.1181195

Rieder, Bernhard. What is in PageRank? A Historical and Conceptual Investigation of a Recursive Status Index. In: Computational Culture, 2 (2012), URL: http://computationalculture.net/article/what_is_in_pagerank.

Simondon, Gilbert. Du mode d'existence des objets techniques. Paris: Aubier 1958.

Sunstein, Cass R.. #Republic: Divided Democracy in the Age of Social Media. Princeton NJ: Princeton University Press 2017. DOI:  http://doi.org/10.1515/9781400884711

Williamson, Oliver E.. The Economics of Organization: The Transaction Costs Approach. In: American Journal of Sociology, 87/3 (1981), pp. 548–577. DOI:  http://doi.org/10.1086/227496