Algorithmic Decision-Making, Spectrogenic Profiling, and Hyper-Facticity in the Age of Post-Truth

Author: Richard Weiskopf (University of Innsbruck)

Abstract

This paper investigates algorithmic decision-making and data-driven profiling as particular ways of producing truth by which "(wo)men govern themselves and others." It starts by problematizing some of the fundamental assumptions on which algorithmic decision-making relies. It then conceptualizes profiling as a "spectrogenic process" in which abstractions are produced that haunt the world, thereby generating material effects of sorting people in/out from a distance. In the final section, the paper discusses emerging forms of governance and the modes of subjectification associated with the current condition of multiple profiling machines. Paradoxically, in the context of post-truth, these forms produce a hyper-facticity that governs by circumventing reflexivity, grounding government in computational truth, and replacing ethico-political decisions with calculations.

Keywords: algorithmic decisions, profiling, post-truth, specters, hauntology

How to Cite: Weiskopf, Richard. "Algorithmic Decision-Making, Spectrogenic Profiling, and Hyper-Facticity in the Age of Post-Truth." Le foucaldien 6, no. 1 (2020): 1–37. DOI: https://doi.org/10.16995/lefou.62 [Note: In 2022, Le foucaldien relaunched as Genealogy+Critique.]

Published on 2020-03-09. Peer reviewed.

Power is in tearing human minds to pieces and putting them together again in new shapes of your own choosing. Do you begin to see, then, what kind of world we are creating? (George Orwell)1

1. Introduction

The recent scandal around Cambridge Analytica (CA) has brought to light the manipulative potential of Big Data analyses as well as the practices of monetizing personal information and data. The discussion has been dominated by issues of privacy, the (mis)use of data, and the role of Facebook and various data brokers in this context. CA is (or was) specialized in data mining, psychometrics, and the development of psychological profiles that allow the behavior of users to be predicted and provide a basis for (micro-)targeting people. According to its own (now defunct) website, CA "use(s) data to change audience behavior" in both commercial and political contexts.

A theoretical basis for CA's work can be found in a paper published in 2015.2 Here, the authors attempt to show "that computers' judgments of people's personalities based on their digital footprints are more accurate and valid than judgments made by their close others or acquaintances (friends, family, spouse, colleagues)."

Based on Big Data technologies, so the argument goes, personality assessment is not only more accurate than subjective judgments, but also more accurate than conventional psychological diagnoses based on expert knowledge or standardized procedures for measuring personality traits. A high practical relevance is attributed to this enhanced capacity to know individuals:

Automated, accurate, and cheap personality assessment tools could affect society in many ways: marketing messages could be tailored to users' personalities; recruiters could better match candidates with jobs based on their personality; products and services could adjust their behavior to best match their users' characters and changing moods, and scientists could collect personality data without burdening participants with lengthy questionnaires. Furthermore, in the future, people might abandon their own psychological judgments and rely on computers when making important life decisions, such as choosing activities, career paths, or even romantic partners. It is possible that such data-driven decisions will improve people's lives.3
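
By way of illustration only, the kind of procedure at stake can be sketched in a few lines of code. This is not Youyou et al.'s method (they trained regularized regression models on large sets of Facebook Likes); in this toy stand-in, every name, footprint, and score is invented, and a trait is simply predicted as a similarity-weighted average over a handful of known profiles.

```python
# Illustrative sketch only: a trait score is predicted from a digital footprint as a
# similarity-weighted average of known users' scores. All names and numbers are invented.

def jaccard(a: set, b: set) -> float:
    """Overlap between two sets of 'liked' items."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical digital footprints (sets of liked pages) and extraversion scores.
known_users = {
    "anna":  ({"hiking", "jazz", "board_games"}, 0.30),
    "ben":   ({"parties", "festivals", "jazz"},  0.80),
    "carla": ({"parties", "clubbing"},           0.90),
}

def predict_extraversion(footprint: set) -> float:
    """Similarity-weighted average of known trait scores (a toy 'computer judgment')."""
    weights = {name: jaccard(footprint, likes) for name, (likes, _) in known_users.items()}
    total = sum(weights.values())
    if total == 0:
        return 0.5  # fall back to the scale midpoint when nothing matches
    return sum(w * known_users[name][1] for name, w in weights.items()) / total

print(predict_extraversion({"parties", "jazz"}))  # leans toward the 'extraverted' profiles
```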

The example of CA illustrates the trust in the power of algorithmic procedures and data mining to produce evidence and better decisions. This trust is shared by theorists of Big Data. Mayer-Schönberger and Cukier, for example, argue that digitalized devices and Big Data procedures "liberate us from profiling's shortcomings" and hold the promise to make profiling "better, less discriminatory, and more individualized."4 But the CA example also illustrates what Youyou et al. note in passing: "knowledge of people's personalities can also be used to manipulate and influence them."5 As revealed by the whistleblower Christopher Wylie, profiles can be turned into a "psychological warfare tool"6 as well as into big business. More than this, the example demonstrates the power of profiles and profiling for constructing and shaping the social world. It illustrates how profiles can circulate and have intended and unintended effects that are inherently ethical and political, such as "sorting people according to their presumed economic and political value."7 Thus, the significance of profiling goes way beyond its potential to improve decisions. In fact, algorithmic decisions and data-driven profiling are part and parcel of a new regime of truth which Rouvroy has termed "data-behaviourism"8 and a new form of governmentality which she characterizes as "spectral."9

In this paper I problematize elements of this regime of truth and provide an account of its spectral character. While the critical debate on post-truth and post-facticity mostly focuses on the apparent loss of facts or the production of "bullshit," which is often used as a "catch-all word to cover misrepresentation, half-truths and outrageous lies alike,"10 I take inspiration from Foucault, who defined his "problem (as) to see how men govern (themselves and others) by the production of truth."11 From this perspective, the critical point is not that certain things or claims are false, but rather that they are accepted as true. I argue that (data-driven) profiling and algorithmic decision-making are new ways of producing truth by which "(wo)men govern themselves and others." It is thus not the loss of facticity that is at stake here, but the production of a new kind of facticity that I call hyper-facticity. As such, it implies a new way of "defining the relations between the manifestation of truth and the exercise of power."12 It governs behavior by circumventing reflexivity, by grounding government in computational truth rather than ethical-political debate, and ultimately by replacing ethical-political decisions with calculations.

I start with a discussion of "algorithmic decision-making"13 in the context of Derrida's view on decisions that deserve the name. This opens the space for problematizing assumptions on which algorithmic decision-making and profiling rely. In transforming decisions into calculations, they create "ghostly demarcations"14 that discriminate and sort in/out based on abstractions and make the ethics and politics involved in these processes disappear. In the following section I look at the example of profiling, an important technology of organizing that is increasingly used for creating knowledge about individuals and populations and for informing decisions in various contexts. While traditional profiling generated knowledge about individuals or groups, based on predefined criteria and expert knowledge, "new profiling"15 increasingly generates a specific type of knowledge, based on detecting and discriminating patterns in huge data sets, using algorithmic procedures. I argue that algorithmic profiling, which is designed to create evidence/knowledge/transparency in (preemptive) decision-making, can be understood as a "spectrogenic process"16 in which abstractions are produced that haunt the world. Independently of whether profiles accurately represent individuals or groups, they circulate and have material effects. The application of profiles and profiling has a spectral structure, generating effects in various contexts, which cannot be mastered in advance. In the final section, I reflect on the emerging forms of governance and the modes of subjectification associated with the condition of multiple profiling (machines).

2. Big Data, Algorithmic Decision-Making, and the Ghost of the Undecidable

In recent years, quantification and measurement have been intensively discussed in different areas.17 Quantification has taken on a new dimension with the movement of "datafication," which implies transforming the flux and flow of real-life events—behaviors, actions, motions, bodies, etc.—into a quantified format that allows them to be tabulated and analyzed. The label "Big Data," often characterized by the "4Vs" (huge volume, diverse variety, high velocity, and veracity of data), captures a development that was made possible by growing computational power and technological advancements in data collection ("harvesting"), processing, storing, and "mining." The transformation of human experience into a datafied and machine-readable form is driven and intensified by the economic imperatives of "surveillance capitalism."18 Zuboff has argued that this "mutation of capitalism" relies on transforming human experience into data, providing material for the production of "prediction products," which are "designed to forecast what we will feel, think and do: now, soon, and later."19 Here, "machine intelligence"20 (a broad term referring to a wide range of computational operations, such as machine learning, "classical" algorithmic production, predictive analytics, or artificial intelligence) is the central means of production, which in turn relies on extracting masses of data to create accurate behavioral predictions as a source of profit.

Predictions and Evidence-based Decisions

Today, Big Data analyses and data-driven profiling are applied in many different areas. In general, "(Big) data captured through digitalized devices are processed by algorithms aimed at predicting what a person will do, think and like on the basis of their current (or past) behaviors."21 Such algorithms are used, for example, to predict security threats and risks, to forecast crimes in predictive policing, to screen applicants in insurance and employment, or to establish credit scores that predict the likelihood of loan default. In managerial contexts, data and numbers are seen as an essential basis for organizational steering and transparency. Big Data analyses are used in the areas of marketing and consumer research. Increasingly, psychometrical categories are used to create clusters of consumer types (profiles) and to predict consumer behavior.22 The idea of "predicting people" has also become central in Human Capital Management.23

In all of these areas, Big Data analyses promise to deliver a basis for "evidence-based decisions." Data are frequently presented as neutral forms of information that represent objects in the world and provide insight into social, economic, and environmental phenomena. The assumption is that (big) data comprehensively map reality, while algorithmic procedures (computer-based data mining) make it possible to identify and "discover" the underlying structures and patterns of this reality. McAfee and Brynjolfsson formulated the conviction of Big Data proponents succinctly: "Data-driven decisions are better decisions—it's as simple as that. Using big data enables managers to decide on the basis of evidence rather than intuition."24 Critical researchers in various fields have instead questioned this assumption on the grounds that (big) data and algorithms might be inherently biased, creating, reproducing, or even intensifying unfair discriminations and established inequalities.25 In addition, Kitchin reminds us that (big) data are neither simply given nor representing the world. Instead, "data are captured from the world, but in turn do work in the world."26
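
The concern about bias can be made concrete with a deliberately crude sketch. All records, fields, and thresholds below are invented; the point is only that a rule "learned" from historically biased outcomes reproduces the disparity rather than removing it, while presenting itself as evidence-based.

```python
# Illustrative sketch only (invented data): a 'data-driven' hiring rule learned
# from historically biased outcomes reproduces that bias rather than removing it.

history = [  # (qualification_score, group, hired) - past human decisions
    (8, "A", True), (8, "B", False), (7, "A", True), (7, "B", False),
    (9, "A", True), (9, "B", True),  (6, "A", False), (6, "B", False),
]

def learned_threshold(group: str) -> float:
    """'Learn' the lowest score at which members of a group were hired in the past."""
    hired_scores = [q for q, g, hired in history if g == group and hired]
    return min(hired_scores)

def algorithmic_decision(qualification: float, group: str) -> bool:
    """An 'evidence-based' decision that simply mirrors the historical pattern."""
    return qualification >= learned_threshold(group)

# Equally qualified candidates, different historical treatment of their groups:
print(algorithmic_decision(8, "A"))  # True
print(algorithmic_decision(8, "B"))  # False - the past disparity is reproduced
```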

Algorithmic Decisions, Algorithms, and the Ghost of the Undecidable

For Derrida, a decision that deserves the name presupposes "undecidability" and requires going through the "ordeal of the undecidable." Algorithmic decisions are in his sense not decisions at all, since "a decision that didn't go through the ordeal of the undecidable […] would only be a programmable application or unfolding of a calculable process."27 So-called decision trees that characterize algorithmic decisions effectively automate responses and force a yes/no or if/then choice. In this sense, the agonism and radical uncertainty of a true decision is annulled or rather denied. There is no hesitation or fear, no struggling heart, no ambiguity. Similarly, an "apparent decision taken on the basis of what is 'seen' evidently, via the calculation of experts, or in the screened results of algorithmic visualization, is not a decision at all."28 In fact, algorithmic procedures that produce a decision (or rather a yes/no switch) as the outcome of a calculated series of steps (e.g. decision trees) reframe ethical and political questions as technical problems of systems engineering. Operating through logical and mathematical procedures, they become "supercarriers of formal rationality"29 that promise to overcome the "bounded rationality" of humans. While proponents of algorithmic decision-making assume that the inherent undecidability, which traverses human systems, can be—or ideally is—removed, Derrida reminds us that undecidability is "not a moment to be traversed and overcome." It continues to inhabit the decision "as an essential ghost."30 The programmed decision and the objects, divisions, and demarcations it creates both haunt the world and are haunted by their own shadow. In the context of data-driven decisions, the ghost(ly) and the notion of haunting draw attention to all that exceeds measurement and meaning, to the "noise" in the system, to all that cannot be captured and computed, to that which occasionally and unexpectedly interrupts systems and sometimes also allows for novelty, creativity, and the creation of the new.31 As Cheney-Lippold has demonstrated, even "algorithmic identities"32 inferred upon individuals can never finally remove the undecidability and fix the subjects they refer to. On the contrary, they are pervaded by undecidability. For example, the ratio of maleness to femaleness, like other characterizations, is constantly changing according to the feedback data; definitions of maleness/femaleness can shift according to the logic of the algorithm.
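
A minimal, invented sketch of such a decision tree may illustrate the point: every input is forced through if/then branches toward a yes/no outcome, and nothing in the procedure can register hesitation or undecidability.

```python
# Illustrative sketch only: a hand-written decision tree of the kind the text describes.
# Every case is forced into a yes/no branch; nothing is left undecidable.
# Fields and thresholds are invented for illustration.

def credit_decision(applicant: dict) -> str:
    if applicant["score"] >= 700:
        return "approve"
    if applicant["income"] >= 50_000 and applicant["defaults"] == 0:
        return "approve"
    return "reject"  # every remaining case, however ambiguous, ends here

print(credit_decision({"score": 699, "income": 49_000, "defaults": 0}))  # 'reject'
```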

Algorithms can be understood as "series of generalized procedures for turning disorganized data inputs into manageable outputs through a series of logical rules."33 As such, algorithms are organizers that are both socially produced and socially productive. They organize and structure the reality-producing process as they select, filter, and frame information and create facticity. Hence, no algorithm is neutral. "Pattern discrimination,"34 the ability to filter information from data, is never innocent. It is inherently tied to value judgments, and it often reproduces or amplifies the very phenomenon it seeks to describe or represent. Values are thus not external to the algorithm; they are folded into it. Even the (seemingly) trivial acts of naming, of storing data, of "deciding where to make cuts in the system,"35 contain value judgements. Moreover, algorithms that are "inherently framed and shaped by all kinds of decisions, politics and ideology" are created for purposes that "are often far from neutral: to create value and capital; to nudge behaviour and structure preferences in a certain way; and to identify, sort, classify people."36 Algorithms codify assumptions and are socially productive. A data-mining algorithm, for example, that is designed to find or "discover" valuable data and patterns sorts, selects, and filters data based on specific assumptions about what makes data "valuable" and for whom. In Hacking's terms, it is not just an "engine of discovery"—as the term "Knowledge Discovery in Databases" (KDD) suggests—but at the same time an "engine of making up people."37 Algorithms are thus performative. They generate objects of knowledge that loop back and shape the world. There are many different—highly complex and sophisticated—technical methods of data mining (e.g. decision trees, cluster analysis, neural nets, text mining, anomaly detection, and many others). What is distinctive about these methods is the specific interested perspective that guides the search for "valuable" data and patterns.38 From these data, a "person of interest" can be constructed for different actors, institutions, and organizations, for different reasons. A hiring algorithm, for example, mines data from the interested perspective of an employer and creates an image of the attractive/unattractive candidate; a security algorithm makes up an image of the potential security risk; in consumer profiling, types of more or less profitable consumers are created and sorted; in credit scoring, an image of the more or less creditworthy or trustworthy person is produced. While it is often assumed that the outcomes represent a preexisting and underlying reality in a neutral way, critical researchers have argued that the outcomes are rather "carefully crafted fictions"39 that help us overlook or forget that the underlying data are often messy, incomplete, and full of gaps and errors, as well as the fact that algorithms may work in biased or even systematically distorted ways. However, such fictions are not simply illusions, but rather social objects and artefacts that have concrete material effects in the world. This will be theorised and illustrated in the following section on profiling.
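
Before turning to profiling, the folding of value judgments into code can be illustrated with a minimal sketch. The fields, weights, and records are invented; what the sketch shows is that the "attractive" candidate is nothing but the contestable weights written into the algorithm by its author.

```python
# Illustrative sketch only: what counts as a 'valuable' pattern is written into the
# algorithm itself. The weights below are value judgments of a hypothetical employer,
# not properties of the candidates. All fields and numbers are invented.

ATTRACTIVENESS_WEIGHTS = {          # folded-in assumptions about the 'good' candidate
    "years_experience": 0.4,
    "gap_months":      -0.6,        # career gaps are penalized - a contestable choice
    "night_posts":     -0.2,        # inferred from social media - another choice
}

def hiring_score(candidate: dict) -> float:
    """Turn scraped data points into a single ranking number (a 'person of interest')."""
    return sum(w * candidate.get(feature, 0) for feature, w in ATTRACTIVENESS_WEIGHTS.items())

candidates = [
    {"name": "x1", "years_experience": 6, "gap_months": 0,  "night_posts": 2},
    {"name": "x2", "years_experience": 6, "gap_months": 12, "night_posts": 0},
]
# The ranking reflects the encoded assumptions, not a neutral 'discovery':
print(sorted(candidates, key=hiring_score, reverse=True)[0]["name"])  # 'x1'
```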

3. Profiles, Data-doubles, and the Spectrogenic Process

Profiling is a technology and practice that is increasingly "used to make decisions, sometimes even without human intervention."40 Profiles and profiling have a long history. Genealogically, they can be traced back to their use in police work and behaviorist psychology of the early 20th century.41 Harcourt has shown how, during the second half of the 20th century, actuarial methods were increasingly used in the context of criminal justice, creating profiles from statistics.42 Koopman's genealogy of the "informational person" points to various systems of "human bookkeeping" (registration systems that define "who the person is"), personality testing, psychometrics and trait measuring, as well as to (racialized) credit scoring systems in the early 20th century as precursors of modern profiling.43 Thus, long before the emergence of Big Data, profiles were used as a knowledge tool in a wide range of human sciences. Today, it is no longer only about deviant subjects in the psychological, legal or moral sense, but above all about the identification of (potentially) "valuable" subjects. In modern marketing, for example, profiles are created for classifying customers according to their potential worth. Equally, in Human Resource Management (HRM), profiling employees and classifying them according to their performance and their "potential to perform" has a long tradition. Identifying "high potentials" and future "stars" and separating them from "question marks," "cash-cows," and "deadwood" has been proposed as a tool for supporting decisions in "strategic HRM."44 Increasingly, the profiling of prospective and/or current employees draws on online data and algorithmic procedures to generate knowledge concerning their "fitness for the job."45 Similarly, in the context of marketing and consumer profiling, researchers have observed a shift from classifying and segmenting customers according to demographic criteria to the use of psychometric profiling for the purpose of matching a product, price or marketing strategy to particular individuals or target groups.46 In all of these areas it is assumed that decision-making dramatically improves, since it is "based on highly informed prediction of future behavior."47

Theorizing Profiles and Profiling

All profiles are abstractions. In the process of profiling, images of the person are created for the purpose of diagnosis or prediction. A profile is never a representation of an individual but a selective image that is constructed for specific purposes. In the process of profiling, the complexity of the person is reduced to a finite number of traits, indicators, etc. Profiles are in this sense "model(s) or figure(s) that organize […] multiple sources of information to scan for matching or exceptional cases."48 Bogard locates these "models or figures" between fiction and reality, at the intersection of virtual and actual worlds. Such figures may be fictions (in the sense of not representing an underlying reality), but they are "a fiction difficult to separate from fact, whose effects are indiscernibly real."49 Fictions create mental images and expectations. They are operationally effective, as they influence decisions and shape and intervene in the world. Consequently, profiles should not be understood as descriptive, but as performative.

Harding has taken a similar perspective and argued that profiles that are assembled and derived from data should not be understood as representations of the self, but as "mere specters of our actual selves—something metaphorically akin to the theatrical illusion called Pepper's Ghost."50 Understanding profiles as "mere specters of our actual selves" reminds us that they should not be confused with the embodied human being; yet, at the same time, they are more than simply illusions. I argue that such categories may not be able to capture the world, but they may nevertheless have material effects in the world. In her seminal book on "ghostly matters," Gordon51 has introduced the "ghost" as a social figure that helps in understanding and theorizing phenomena that are somehow located between the dead and the living, between the abstract and the concrete; phenomena which are absent but nevertheless make their presence felt, phenomena which are illusory or unreal but at the same time real in the sense of producing material effects. The figure of the "ghost" and the "ghostly" has also been taken up by organization scholars52 and data scientists53 for theorizing a "spectral working" of forces that destabilize and contaminate any self-sufficient present, for disrupting linear concepts of time, and for attending to how policies or data become "haunted" when their implementation disrupts local cultures, traditions, and histories. Continuing this work and taking some inspiration from Derrida's analysis of the ghost and the "ghost effect" in Specters of Marx, I suggest that profiles can be understood as ghosts that are the outcome of a "spectrogenic process."54 Derrida describes this as a process in which thoughts, ideas, data, etc. are extracted from the living body and integrated into a more abstract body. "Ghost effects" are produced with every step of abstraction. Transferred to our context, this means that ghost effects are engendered on multiple levels: first, when human experience is abstracted and turned into "information"; second, when data are extracted, resulting in what Haggerty and Ericson called a "decorporealized body, a 'data double' of pure virtuality."55 The data-double as the aggregation of personal information already represents a high level of abstraction; an even higher level is reached when the data-double is "mined" and selective information is assembled and reassembled into a profile that allows institutions and organizations to make discriminations and separations among people. In the process of application, institutionally and technically formed and shaped data return to the world of real-life events, where they haunt those with whom the profiles are associated. As Gordon put it: "[i]n haunting, organized forces and systemic structures that appear removed from us make their impact felt in everyday life in a way that confounds the social separations themselves."56

While all profiles are abstractions, forms of profiling can be distinguished, first, by the way these abstractions are produced and, second, by the way these abstractions (profiles) are applied in decision-making processes. I elaborate on this in the following.

Production of Profiles

There are different forms of profiling. Traditional profiling is a process of classification, which is based on ex-ante categorizations of individuals. It assumes that the individual can be characterized by a set of traits that can be identified by professional/scientific methods. A specific type of knowledge is created that typically results from experience or the tradition of expert knowledge. Typically, such knowledge is deductive and derived from specific predefined hypotheses. In the traditional sense, the profile results from targeted questioning based on specific hypotheses. Profiling involves the definition of categories that are assumed to characterize the individual and the establishment of methods for measuring the individual along these categories. Measurement makes it possible to identify differences and to determine the distance from the norm. Profiling makes it possible to position individuals in a normative matrix of behavior and to make them comparable to others.
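
What such measurement along predefined categories might look like can be sketched as follows; the trait names, norms, and scores are invented, and the sketch only illustrates how the individual is rendered as a set of distances from a norm.

```python
# Illustrative sketch only: traditional profiling measures an individual along
# predefined categories and locates them relative to a norm. Trait names, norms,
# and scores are invented.

NORMS = {"conscientiousness": (3.5, 0.6), "sociability": (3.0, 0.8)}  # (mean, std dev)

def trait_profile(answers: dict) -> dict:
    """Express each predefined trait as a distance from the population norm (z-score)."""
    return {trait: round((answers[trait] - mean) / sd, 2)
            for trait, (mean, sd) in NORMS.items()}

print(trait_profile({"conscientiousness": 4.1, "sociability": 2.2}))
# {'conscientiousness': 1.0, 'sociability': -1.0} - the person as deviation from the norm
```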

Traditional profiling resembles the classical disciplinary technology of examination. This "tiny operational schema that has become so widespread (from psychiatry to pedagogy, from the diagnosis of diseases to the hiring of labour)," which Foucault saw at the heart of the procedures of discipline, "establishes over individuals a visibility through which one differentiates them and judges them."57 It is the scientific or quasi-scientific fixing of differences and their representation in numbers, charts, and reports, which provides the basis for administrative decision-making.

As an administrative technology, described by Foucault as a "scientific or administrative inquisition which determines who one is,"58 classical profiling works as a productive technology of power. It creates objects of knowledge, allows classifying and ranking these objects, and turns individuals into "cases" that can be handled and managed in particular ways. Emerging profiles are inherently normative constructions that have intrinsic moral value since they are "kinds that people want to be or not."59 Like stereotyping, profiling works as an a priori ordering that tells us who "is" good/bad, trustworthy/untrustworthy, healthy/unhealthy, suspicious/un-suspicious, profitable/not-profitable, etc. Profiles also provide those who are profiled with an image of themselves. They allow subjects to reflect on themselves and to monitor their behavior in the light of normative expectations. In this sense, profiles create an imaginary ideal ("idéal speculative") that works as a subjectifying force.60

In "new profiling,"61 profiles are often the result of automated and algorithmically enhanced systems of pattern recognition and -discrimination.62 This presupposes the transformation of the flux, flow, and mess of real-life events into data that are readable by machines. Knowledge in this case is inductive. It is shaped by the algorithms, which direct attention, focus on specific points in the data set and cancel out other data. Profiles in this sense are less about measurement but about "detection." In contrast to traditional profiling, categorization emerges ex-post, from clustering detected patterns. As Hildebrandt explains, "Pattern recognition, based on 'blind' correlations (i.e., correlations that do not derive form predefined hypotheses and do not necessarily imply causes or reasons) allows those that use the ensuing profiles to anticipate the state or behavior of the objects or subjects that are profiled."63

Traditional profiling follows the logic of inspection. It seeks to identify the presence of certain predefined characteristics in the individual and to categorize him or her accordingly. In contrast, new profiling follows the forward-looking logic of prospection. It is based on the assumption that detected patterns and correlations allow the creation of "predictive transparency,"64 which provides the basis for preemptive decision-making and anticipatory governing. Predictions are derived from patterns in past behavior, or they are derived from similar patterns of "groups" or "neighbors." Categorizations thus depend not only on an individual's actions, behaviors, and histories, but also on those of others who are similar to him or her.65 The generated knowledge is based on probability rather than on the construction of causalities. In short, while traditional profiling seeks to determine who or what the person is, new profiling "render(s) us transparent in a rather counterintuitive manner. We become transparent in the sense that the profiling software looks straight through us to 'what we are like,' instead of 'what or who we are.'"66
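
The logic of judging by "neighbors" can be sketched in a few lines of invented code. It is a crude nearest-neighbor stand-in for far more elaborate techniques: the prediction attached to an individual is simply the majority behavior of the most similar records in the database, that is, of "people like them."

```python
# Illustrative sketch only: prediction by 'neighbors' rather than by who the person is.
# The behavior of others with a similar data pattern is projected onto the individual.
# All records and labels are invented.

records = [  # (clicks_per_day, nightly_activity, defaulted_on_loan)
    (120, 0.9, True), (115, 0.8, True), (30, 0.1, False), (25, 0.2, False),
]

def predict_default(clicks: float, nightly: float, k: int = 3) -> bool:
    """Majority vote of the k most similar past records ('what we are like')."""
    # nightly activity is rescaled so that both features carry comparable weight
    dist = lambda r: ((r[0] - clicks) ** 2 + (100 * (r[1] - nightly)) ** 2) ** 0.5
    neighbors = sorted(records, key=dist)[:k]
    return sum(r[2] for r in neighbors) > k / 2

# The individual is judged by the pattern of 'people like them', not by their history:
print(predict_default(118, 0.85))  # True
```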

New profiling, based on pattern recognition, constructs or assembles a person of interest by connecting data-points and visualizes the person, for example, on a screen, a scoring sheet, or a risk map. Algorithms function as means for directing attention and for focusing on specific data while neglecting or cancelling out all other data. In the case of new profiling, they make up people in the form of "data doubles" or "data derivatives."67 Such objects of knowledge can be generated from a variety of different and heterogeneous sources (e.g. discrete digital acts like searching, messaging, blogging, purchasing, liking, tweeting, posting, etc.). Profiles are composed by combining these actions and connecting bits and pieces of data. They are not representations of the 'real' person or of 'individuals,' but selective constructions which themselves consist of sub-individual elements, the fragments of registered behavior, which are extracted from the flow of data for specific purposes. They resemble the coded "dividual"68 which, according to Deleuze, replaces the undivided in-dividual in the "society of control." The dividual consists of separate data-points, which can be scrutinized, calculated, circulated, assembled, and reassembled for specific purposes. As Hildebrandt explains, "these elements or properties are not given; they are attributed by whoever writes the algorithms of data analytics; taking into account that whatever dividuals are sought after they must be inferred from machine-readable data by machine-readable algorithms."69

The aim is not to produce exhaustive knowledge about a particular individual and his or her intrinsic characteristics, but to create a basis for acting on similar individuals. These profiles are not a reflection of a given identity, but a projection of possible future behavior. Amoore stresses that the digital alter ego that emerges from algorithmic profiling "is precisely a projected person: an image of a potential future person as yet to come,"70 e.g., the potential terrorist, the potential criminal, the potential buyer, the potential toxic employee, the potential credit risk, etc.

While traditional profiling fixes the individual in a relatively stable normative matrix, profiles that emerge from clustering patterns "create momentary groupings that might disappear into the white noise of the database in the next moment."71 They therefore do not constitute a stable and fixed norm that defines and disciplines bodies according to a predefined model, but rather create a fluctuating network of categories that modulate behavior. In Deleuze's terms, such "modulation" functions "like a self-transmuting molding continually changing from one moment to the next, or like a sieve whose mesh varies from one point to another."72 In such dynamic networks, subjects can be assessed and sorted for specific purposes on a continuous basis. Categorizations are themselves evolving and shifting depending on feedback data and the logic embedded in the algorithm.

What is called "autonomic profiling"73 goes one step further. Here, algorithmic functions are not only applied for the purpose of detecting patterns and generating profiles, but also for adapting the algorithms to changing environments and for automatic sorting and treatment of profiles, for example in the form of a differentiated or "(micro)targeted" response that is machine-generated.
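
A minimal, invented sketch of such an "autonomic" loop: each behavioral trace updates a running profile, and the response, here a micro-targeted message, is generated by a threshold rather than by anyone's decision. Categories, events, and numbers are illustrative assumptions, not features of any actual system.

```python
# Illustrative sketch only of 'autonomic' profiling: the profile is updated from each new
# data point and the response (a micro-targeted message) is generated without human
# intervention. Categories, thresholds, and messages are invented.

profile = {"anxiety_signal": 0.2}   # running score inferred from behavior

def ingest(event: str) -> None:
    """Update the profile from a behavioral trace (a crude exponential moving average)."""
    signal = 1.0 if event in {"searched_insomnia", "late_night_scrolling"} else 0.0
    profile["anxiety_signal"] = 0.7 * profile["anxiety_signal"] + 0.3 * signal

def machine_response() -> str:
    """Machine-generated targeting: no one decides, the threshold does."""
    return "show_ad:sleep_aid" if profile["anxiety_signal"] > 0.5 else "show_ad:default"

for event in ["searched_insomnia", "late_night_scrolling", "checked_mail"]:
    ingest(event)
    print(round(profile["anxiety_signal"], 2), machine_response())
```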

An organizational example that comes close to autonomic profiling can be seen in concepts of "people analytics."74 The individual here is a "walking data generator."75 Sensor technologies automatically extract data from moving bodies. The algorithm built into the technology mines the data and autonomously transforms extracted data into a virtual profile, which is continuously updated and modified. People analytics visualizes individuals and teams and makes them comparable, measurable, and manageable. In this version, data-traces left by the moving bodies are calibrated as "derivatives." Amoore has described the specific form of abstraction that characterizes this form of creating profiles: "The data derivative comes into being from an amalgam of disaggregated data—reaggregated via mobile algorithm-based association rules and visualized in 'real time' as risk map, score or colour coded flag."76 In Table 1 the main differences are summarized.

Table 1

Traditional and Data-driven Profiling.

Traditional/Disciplinary | 'New profiling'/Data-driven
Deductive | Inductive
Causality | Correlation
Inspection | Prospection
Fixing of difference | Momentary patterns
Relatively stable matrix of identities | Web of fluctuating categories
Examination | Continuous assessment
Administrative decision-making | Preemptive decision-making

In sum: data-driven profiling based on (algorithmic) pattern recognition, in particular, illustrates how emerging profiles are increasingly decoupled from the social relations from which they are drawn. Both the process of arriving at categories and the multiple decisions about what becomes visible or invisible, what to select and what to exclude within the system, become black-boxed and disappear in the created image. Such profiles emerge as "artefactual bodies" that can be and are disconnected from the individual. They can be combined with other profiles, and they can be used for different purposes in different contexts. They can be continuously modified, and they are extremely mobile, particularly in their electronic form. Thus, profiles become "prediction products"77 that circulate in cyberspace, can be sold and traded, and can be turned into capital.
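
Amoore's description of the data derivative, re-aggregated from disaggregated fragments into a risk map, score, or colour-coded flag, can likewise be illustrated with a deliberately simple sketch; the sources, association rules, and thresholds below are invented.

```python
# Illustrative sketch only of a 'data derivative': fragments from unrelated sources are
# re-aggregated by association rules into a single colour-coded flag.
# Sources, rules, and thresholds are invented.

fragments = {            # disaggregated data points gathered from different contexts
    "ticket_paid_cash": True,
    "one_way_trip": True,
    "new_phone_number": False,
}

ASSOCIATION_RULES = [    # (condition, added risk weight) - written by the profiler
    (lambda f: f["ticket_paid_cash"] and f["one_way_trip"], 0.5),
    (lambda f: f["new_phone_number"], 0.3),
]

def risk_flag(f: dict) -> str:
    score = sum(weight for rule, weight in ASSOCIATION_RULES if rule(f))
    return "red" if score >= 0.5 else "amber" if score >= 0.3 else "green"

print(risk_flag(fragments))  # 'red' - a projection, not a description of a person
```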

Application of Profiles

As pointed out, profiles are not descriptive but performative. They are not only a product of worldly and material processes; in turn, they do work in the world, where they have material effects. They generate a frame for decision-makers, produce the "evidence" to which decision-makers can refer, instruct decision-makers on what to see and what to ignore, and "act back on those with whom data are associated, informing us who we are, what we should desire or hope for, including who we should become."78 Application is the process in which profiles perform their work of informing or shaping decision-making in various places. Application is often seen as a technical, linear process in which context-specific 'solutions' or decisions are derived from abstract rules or concepts, or, in our case, from profiles. By contrast, Derrida has suggested "a new concept of application which is in agreement with dissemination" and implies "a total spectral structure."79 Such a concept of application leads beyond merely technical or instrumentalist understandings and considers application as a creative and performative process, which generates something unpredictable in "contexts which nobody can master in advance."80

In this process, profiles travel through time and space and have expected and unexpected effects. As ghosts, they are not bound by time and space. They can be created now, but return later; they can be produced here, but return elsewhere. Similarly, Amoore reminds us:

Data are not only gathered in the strict sense of data collection, but they assemble themselves with multiple other elements and things. Data do unexpected things with unanticipated effects. Stop the wrong person at the border, fail to stop the right person, gather false positives and then let them loose into the world, invite some intuitions and inferences from observers and banish others.81

As data and profiles move from context to context, they can be interpreted and framed in different ways. Profiles can mean different things in different contexts. A regular buyer of antidepressant drugs can be an attractive customer for the pharmaceutical industry, but can be classified as an employment risk in other cases; a credit score can become a proxy for the trustworthiness of persons, etc. Even though any interpretation or sensemaking is context-bound, the context cannot be fixed. Every profile is a sign or a mark that signifies some social status—creditworthiness, employability, riskiness, profitability, etc. As such, it can be removed from its "original" context. It can break from a given context and be redeployed in new contexts; data collected for one specific purpose can subsequently be used for another authorized or unauthorized purpose, a phenomenon that has been described as "function creep."82 As is frequently demonstrated, profiles can easily flow between so-called "data sharing partnerships" that are created for optimizing the user experience. In moving from context to context, profiles do not move in a vacuum; they can be, and actively are, transferred and transformed in the context of other profiles, actors, traditions, and institutions. The de/re-contextualization of profiles may change not only the content (the data-body) but also the ascribed meaning of the profile. Thus, when profiles travel from context to context, they may also be reframed according to specific interests and ideologies. Profiles are scrutinized, assembled, and reassembled for specific purposes in a variety of distributed "centres of calculation,"83 which—like Cambridge Analytica—seek to develop a knowledge base for commercial or political strategies for influencing and shaping the behavior of customers or voters or, more generally, for nudging people toward decisions which decision architects consider good decisions.
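
How the same fragment of data can be read differently as it travels between contexts can be shown with a minimal sketch. The record and the two "readings" are invented: one interested perspective treats the regular buyer of antidepressants as a valuable customer, another treats the identical record as a risk marker.

```python
# Illustrative sketch only: the same data fragment is read differently as it travels
# between contexts. The two 'readings' below are invented interested perspectives.

record = {"person": "p-4711", "purchases_antidepressants_per_year": 6}

def pharma_marketing_view(r: dict) -> str:
    """Context 1: a loyal, high-value customer."""
    return "target_with_offers" if r["purchases_antidepressants_per_year"] >= 4 else "ignore"

def employment_screening_view(r: dict) -> str:
    """Context 2: the very same fragment becomes a risk marker."""
    return "flag_as_risk" if r["purchases_antidepressants_per_year"] >= 4 else "no_flag"

print(pharma_marketing_view(record), "|", employment_screening_view(record))
# target_with_offers | flag_as_risk - one profile, two incompatible meanings
```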

In the context of "surveillance capitalism,"84 profiles are the form of capitalizing data. Digital traces are mined and transformed into profiles that can be bought and sold, packaged and repackaged for specific purposes. Since every piece of data is of potential value for someone it might end up in some profile that is useful and can be sold to somebody. The monetization is the business of various data brokers, direct-marketers and other organizations and companies that sell and resell profiles to interested parties.85 In these conditions the emerging profiles are further removed from their primary source or context. As the legal scholar Frank Pasquale says, there are endless permutations:

Profiling might begin with the original collectors of information, but it can be elaborated by numerous data brokers, including credit bureaus, analytic firms, catalog co-ops, direct marketers, list brokers, affiliates, and others. Brokers combine, swap, and recombine the data they acquire into new profiles, which they then can sell back to the original collectors or to other firms.86

What Pasquale has called "runaway profiles" illustrates the spectral logic of profiling, in which data are decoupled and torn loose from their context in ways that make it impossible to understand how data from various sources are transformed into a profile, on the basis of what kinds of decisions, assumptions, and prerogatives the constructed object has been formed, and how it has been transformed and modified according to technical, economic, or political considerations. While the decision process itself becomes obscure and black-boxed, profiling and profiles—particularly when produced by "data-intelligence"—produce and reproduce the "illusions of transparency,"87 the promises of predictability and governability that support a (neoliberal) regime of truth and justify decisions and demarcations on the basis of seemingly rational and neutral calculations.

Profiles as Ghostly Demarcations

When abstractions circulate and are materialized (i.e., when profiles are 'applied'), they have powerful effects. They serve to allow or deny access to services and benefits, they generate credibility or suspicion, and they provide the basis for a differential treatment of people. In particular, runaway profiles "can lead to cascading disadvantages as digital alchemy creates new analogous realities."88 Once a person is attributed the label of credit risk, toxic worker, marginal consumer, potential dropout, or high-risk traveler, that attribute may appear with decisional force on various occasions throughout the social field. Profiles—regardless of whether we are talking about traditional (disciplinary) profiling or about data-driven profiling based on pattern recognition, the mining of data, etc.—are discriminatory technologies that work as "social sorting."89 Profiling sorts people into social categories, so that the classification allows opportunities and risks to be distributed. While algorithmic and data-driven profiling is promoted as a means to overcome human irrationality, to optimize judgmental accuracy, and to produce more fine-grained and more objective analyses, it is perhaps better understood as a movement in which the mechanisms that generate demarcations become increasingly opaque and incomprehensible for those who are the objects of profiling.

In the context of security governance, Leese has identified a triple "loss" associated with data-driven profiling (in comparison to traditional profiling).90 First, there is a "loss of traceability." With data-driven profiling, it is increasingly difficult to determine where the data from which profiles are assembled come from, since they can literally come from everywhere. It is increasingly difficult to know and understand how categories are defined and how classifications are produced. Second, there is a "loss of visibility." Profiling increasingly operates out of sight. Data-mining procedures produce "artificial categories," and the individual is likely not even to notice when or how he or she becomes categorized as risky, profitable, suspicious, etc. Consequently, most of the time people will not even notice if or how they have been discriminated against. Third, there is a "loss of accountability." In many cases, the human decision-maker is "rendered likely to comply with the truth claims of the algorithm"91 even if "autonomy" is attributed to him or her. The ability of individuals to reflect on or challenge decisions vanishes to the degree that the process of making them becomes impenetrable. An inherent opacity characterizes predictive algorithms, particularly in the case of machine learning.92 Even experts are often overwhelmed by the complexity of the internal procedures of algorithmic systems. In many cases, it is difficult, if not impossible, to give an account of what a profile means and how it has been constructed.

I would add that there is—paradoxically—also a loss of predictability. While data-driven profiling seeks to make the actions, thoughts, feelings, and behavior of citizens (customers, voters, debtors, employees, etc.) predictable, it becomes highly unpredictable how, when, and for what reason the created profiles will affect those who are profiled. Occasionally, we realize that we are in the visor of specific organizations and are targets of specific profiling machines. This happens, for example, when we are sent an individualized or personalized message, when we are offered a tailored product, are singled out as a risky passenger at the airport, or when we are denied access to public spaces, insurance, employment, credit, etc. because we (or our profiles) do not fit the criteria defined in advance by anonymous agencies. Moreover, there is a loss of response-ability. The ability to respond to the concrete other is a precondition for exercising or enacting moral responsibility. In his analysis of the Holocaust, Bauman93 showed how bureaucratic procedures and abstract classifications work as "moral sleeping pills." According to Bauman, dehumanization starts when people are objectified and reduced to a set of quantitative measures. In data-driven profiling, where people and social relations are transformed into bits of information and electronic impulses, dehumanization works on a more abstract level. When the person appears on screen as the result of algorithmic visualization, the Other as a "face" (in Levinas' sense of "an authority without force"94 that demands a moral response) disappears completely, and moral responsibility for the Other is suspended and rendered ineffective.

To sum up: profiling increasingly creates an "invisible visibility" that is associated with what Derrida has called the "visor effect: we do not see who looks at us."95 We are observed while increasingly being denied the possibility to understand, question, or challenge how, why, when, by whom, and by what criteria we are observed and evaluated. This also has consequences for the modes of subjectification and for the specific ways in which truth and power are linked to each other.

4. Multiple Profiling (Machines): Proto-paranoiac Subjectification and Hyper-facticity

Today, profiles and profiling are used in many different contexts: customer profiling, profiling for employment screening, credit scoring, criminal investigations, immigration policy, healthcare management, forensic biometrics, etc. The emerging arrangement of multiple profiles and "profiling machines"96 that create specific forms of visibility of subjects modifies the "play of light and darkness" which Foucault related to the transparency ideal embodied in the panopticon.97 Hierarchical observation and normalizing judgement characterize its diagram. Activities are captured in reports and registers, which provide the basis for comparisons and judgements. The context of multiple profiling machines displays a new diagram, in which the hierarchical and fixing panoptic gaze that oversees the field from above is replaced by a multiplicity of gazes, coming from a multiplicity of observing agencies that are, moreover, movable and continuously changing their position. This is the condition emerging from multiple profiling machines, in which subjects appear in the light of various and changing searchlights. In the contemporary "digital prism"98 of the datafied world, each of these searchlights constitutes subjects as persons of interest for specific purposes. While the panoptic gaze fixes individuals and objectifies them by pigeonholing the observed into predefined boxes (categories), "new profiling" generates observational categories from below, by mining and detecting patterns in data traces and footprints. Kitchin has noted that these footprints and shadows are "dispersed, divided across dozens of organizations and servers, and are subject to integration and division. […] At best they constitute oligopticons—limited views from partial vantage points from fixed positions with defined view sheds."99 Ellerbrok has described a similar, but perhaps more dynamic, scenario of "multiple variable visibilities"100 generated by new media technologies that render individuals visible to diverse interests simultaneously, on multiple levels that interact dynamically.

The multiplication of profiling (machines) and the proliferation of data-driven decision-making fundamentally affect processes of subjectification and the way the truth-game is linked to the "art of governing."

Subjectification

Rouvroy has argued that algorithmic forms of profiling affect people in so many different contexts, yet without addressing them as persons directly. In her view, data-driven profiling circumvents subjectivity and subjectification, and in some sense even works "without subjectivity."101 Algorithmic governmentality, she says, "bypasses consciousness and reflexivity, and operates on the mode of alerts and reflexes."102 Yet it can also be argued that the emerging regime of truth creates conditions that are related to a specific mode of subjectification which Žižek has called "proto-paranoiac":

If, then, today, in the guise of detailed databases [profiles, R.W.] that circulate in the corporate cyberspace […] we are […] 'interpellated' by institutions even without being aware of it, one should nevertheless insist that this 'objective interpellation' actually affects my subjectivity only by means of the fact that I myself am well aware of how, outside the grasp of my knowledge, databases circulate which determine my symbolic identity in the eyes of the social 'big Other'. My very awareness of the fact that 'the truth is out there', that files on me circulate which, even if they are factually 'inaccurate', none the less performatively determine my socio-symbolic status, is what gives rise to the specific proto-paranoiac mode of subjectification characteristic of today's subject: it constitutes me as a subject inherently related to and hassled by an elusive piece of database in which, beyond my reach, 'my fate is writ large'.103

But the conditions of subjectification are even more paradoxical. On the one hand, there is the phenomenon that Žižek describes. On the other hand, these data practices are widely regarded as normal, necessary, or beneficial. There are many incentives for making ourselves visible to various others, or for (actively) profiling ourselves. In order to prove our creditworthiness or our attractiveness to employers or potential partners, for example, we provide our data and actively create our profiles. This is a new form of truth obligation that modifies the confession, which according to Foucault "became one of the West's most highly valued techniques for producing the truth"104 and which constitutes subjects in the double sense of the word. In the digital context, one confesses in the form of providing data. Often this is not forced, but happens out of ignorance, laziness, or indifference. The promise here is not salvation in the other world, but access to relevant services (e.g. credit, employment, friends, etc.) and pleasures in this world. In the digital context, the obligatory act of speech in many cases turns into a desiring self-exposure and exhibition of intimate details.105

Governing According to the Rules of Evidence

Data-driven profiling provides the basis for a new mode of governing or, in Foucault's sense, a new "way of linking the art of governing and the game of truth."106 It is a way of linking them which at first sight seems paradoxical:

It is the idea that, if in actual fact the government governs not through wisdom in general, but through the truth, that is to say through the exact knowledge of the processes that characterize the reality […], then it will have to govern even less. The more it pegs its action to the truth, the less it will have to take decisions that have to be imposed from above, in accordance with more or less uncertain calculations on people who will accept them more or less well.107

Foucault traces this way of linking the art of governing and truth to the (physiocratic) idea of "govern(ing) according to the rules of evidence." According to this idea, it would ideally be "things in themselves, rather than men, that govern." This, of course, implies a utopian moment in which the exercise of power is nothing more than an "indicator of the truth."108

Today, in the context of the regime of practices which Zuboff109 has called "surveillance capitalism," a similar utopian moment is present in the work of the prophets of the new data-driven society, who believe that total knowledge is within reach. They argue that "Big data give us a chance to view society in all its complexity, through the millions of networks of person-to-person exchanges […] we could potentially arrive at a true understanding of how society works and take steps to fix our problems."110 "Evidence" (i.e., truth) is derived from a multiplicity of data (Big Data), drawn from a huge variety of sources and brought into a specific form by algorithms. It is thus, in Foucault's terms, a specific form of evidence that is linked to a specific "modality of veridiction."111 Evidence here is not produced by or attributed to personal experience or courageous truth-telling (the parrhesiastic modality); nor is it produced with reference to some general form of knowledge (the modality of wisdom) or by referring to professional expertise and (causal) knowledge (the technical modality). Rather, evidence is associated with the "objectivity" of data and the effectiveness of computational procedures. In this context, there is a multiplicity of generators of evidence, which provide the basis for a form of governance that rests on the assumption that "computational truth must necessarily replace politics as the basis of instrumentarian governance."112 In the context of ubiquitous computing, the idea that things in themselves govern takes on a new form that goes beyond Foucault's analyses. The more the things that surround us become "intelligent"—from cars to refrigerators, from cellphones to Alexa, etc.—the more, Zuboff remarks, the "things that we have" are turned into "things that have us."113 Increasingly, driven by the utopia of "N = all," the whole spectrum of our experience, our homes and our bodies, our movements and our emotions, is transformed into objects that can be calculated, processed, and managed in the name of profit. Equipped with what Zuboff calls "actuation capability," technologies like smart sensors are interlinked and can register and analyze any kind of human behavior and then actually figure out how to change it. Automatically generated impulses and "nudges" condition and incentivize behavior and bypass human reflexivity. Governing the conduct of individuals, that is, influencing, shaping, and forming conduct, may ultimately become automated and freed from the incertitude and strain of interpretation, negotiation, and responsibility. This is ultimately hyper-facticity at work.

5. Conclusion

Rouvroy speaks of an emerging "algorithmic governmentality," which allows the exercise of a specific form of power based on knowledge generated by data-driven profiling. She characterized this specific regime of power as "spectral."114 In this paper I have added Derrida's analysis of the ghost and the "spectrogenic process" as a supplement for illuminating the spectral character of this form of governmentality and for understanding the power effects of algorithmic profiling in particular. I have shown how algorithmic profiling creates "ghostly demarcations" that haunt our present condition. The specific modus operandi of profiling signifies a shift in the exercise of power. While classical profiling follows the disciplinary logic of positing an ideal model and defining a norm against which reality is measured and corrected, new profiling follows the logic of security,115 which is based on the calculation of probabilities. The norm is derived from the observation of the real, using statistical methods and data-processing systems, to generate a dynamic and fluctuating network of categories that organize observation from various interested perspectives. Power in this sense is neither prohibitive (like the law) nor prescriptive (like discipline). It rather has the character of anticipatory regulation and preemption. In Foucault's words, it is a matter of "allowing circulations to take place, of controlling them, sifting the good and the bad, ensuring that things are always in movement, constantly moving around, continually going from one point to another, but in such a way that the inherent dangers of this circulation are cancelled out."116

In the introduction I mentioned the example of Cambridge Analytica. Such companies have built their business model on optimizing the flow of populations and on developing and selling predictive profiles that make it possible to "sift the good and the bad" and to influence and even produce specific forms of behavior and subjectivity ("We use data to change audience behavior"). In the final statement of a YouTube video in which Alexander Nix, the former CEO of CA, celebrates the range of possibilities and promotes methods for influencing and shaping the "target audience," Nix insinuates that it was one candidate in particular who made use of these methods in the 2016 US election campaign. Retrospectively, it is remarkable that it was exactly this candidate who has become the epitome and symbolic figure of post-facticity and post-truth: Donald Trump. It seems that post-facticity—the syndrome of lies, bullshit, simulated truth-telling, and "alternative facts"—and hyper-facticity go very well together. Perhaps post-facticity is even the ghost that haunts hyper-facticity.

Notes

  1. George Orwell, Nineteen Eighty-Four (Essex: Heinemann, 1949/1990), 206. [^]
  2. Wu Youyou, Michal Kosinski, and David Stillwell, "Computer-Based Personality Judgments Are More Accurate Than Those Made by Humans," PNAS 112, no. 4 (2015): 1036–40. [^]
  3. Youyou, Kosinski, and Stillwell, "Computer-Based Personality Judgments Are More Accurate Than Those Made by Humans," 1039. [^]
  4. Viktor Mayer-Schönberger and Kenneth Cukier, Big Data. A Revolution That Will Transform How We Live, Work and Think (London: John Murray, 2013), 161. [^]
  5. Youyou, Kosinski, and Stillwell, "Computer-Based Personality Judgments Are More Accurate Than Those Made by Humans," 1039. [^]
  6. Carole Cadwalladr, "The Cambridge Analytica Files. 'I Made Steve Bannon's Psychological Warfare Tool': Meet the Data War Whistleblower," The Guardian, March 18, 2018, https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump. [^]
  7. Oscar H. Gandy, The Panoptic Sort: A Political Economy of Personal Information. Critical Studies in Communication and in the Cultural Industries (Boulder: Westview Press, 1993). [^]
  8. Antoinette Rouvroy, "The End(s) of Critique. Data-Behaviourism Vs. Due-Process," in Privacy, Due Process and the Computational Turn. Philosophers of Law Meet Philosophers of Technology, ed. Mireille Hildebrandt and Katja de Vries (London: Routledge, 2013), 146. [^]
  9. Rouvroy, "The End(s) of Critique. Data-Behaviourism Vs. Due-Process," 152. [^]
  10. James Ball, Post-Truth. How Bullshit Conquered the World (London: Biteback, 2017), 5. [^]
  11. Michel Foucault, "Questions of Method," in The Foucault Effect, ed. Graham Burchell, Colin Gordon, and Peter Miller (London: Harvester Wheatsheaf, 1991), 79. [^]
  12. Michel Foucault, On the Government of the Living. Lectures at the Collège de France 1979–1980 and Oedipal Knowledge (New York: Picador, 2016), 15. [^]
  13. Dirk Lindebaum, Mikko Vesa, and Frank den Hond, "Insights from 'The Machine Stops' to Better Understand Rational Assumptions in Algorithmic Decision Making and Its Implications for Organizations," Academy of Management Review 45, no. 1 (2020): 247–263; Sue Newell and Marco Marabelli, "Strategic Opportunities (and Challenges) of Algorithmic Decision-Making: A Call for Action on the Long-Term Societal Effects of 'Datafication'," Journal of Strategic Information Systems 24 (2015). [^]
  14. Michael Sprinker, ed. Ghostly Demarcations. A Symposium on Jacques Derrida's Specters of Marx (London: Verso, 1999). [^]
  15. Mireille Hildebrandt, "Defining Profiling: A New Type of Knowledge," in Profiling the European Citizen, ed. Mireille Hildebrandt and Serge Gutwirth (Berlin: Springer, 2008). Mathias Leese, "The New Profiling: Algorithm, Black Boxes, and the Failure of Anti-Discriminatory Safeguards in the European Union," Security Dialogue 45, no. 5 (2014). [^]
  16. Jacques Derrida, Specters of Marx. The State of the Debt, the Work of Mourning, & the New International (New York, London: Routledge, 1994). [^]
  17. Steffen Mau, Das metrische Wir. Über die Quantifizierung des Sozialen (Frankfurt a.M.: Suhrkamp, 2017). [^]
  18. Shoshana Zuboff, The Age of Surveillance Capitalism. The Fight for a Human Future at the New Frontier of Power (New York: Public Affairs, 2019). [^]
  19. Zuboff, The Age of Surveillance Capitalism, 96. [^]
  20. Zuboff, The Age of Surveillance Capitalism, 65. [^]
  21. Newell and Marabelli, "Strategic Opportunities (and Challenges) of Algorithmic Decision-Making: A Call for Action on the Long-Term Societal Effects of 'Datafication'," 4. [^]
  22. John Cheney-Lippold, "A New Algorithmic Identity. Soft Biopolitics and the Modulation of Control," Theory, Culture & Society 28, no. 6 (2011): 167–72. [^]
  23. Martin Edwards and Kirstin Edwards, Predictive HR Analytics: Mastering the HR Metric (London: Kogan Page, 2019). [^]
  24. Andrew McAfee and Erik Brynjolfsson, "Big Data: The Management Revolution," Harvard Business Review (October 2012): 5, https://hbr.org/2012/10/big-data-the-management-revolution. [^]
  25. Newell and Marabelli, "Strategic Opportunities (and Challenges) of Algorithmic Decision-Making: A Call for Action on the Long-Term Societal Effects of 'Datafication';" Wendy Hui Kyong Chun, "Queering Homophily," in Pattern Discrimination. In Search of Media, ed. Clemens Apprich et al. (Minneapolis: University of Minnesota Press, 2019). [^]
  26. Rob Kitchin, The Data Revolution. Big Data, Data Infrastructures & Their Consequences (Los Angeles: Sage, 2014), 21. [^]
  27. Jacques Derrida, "Force of Law: The 'Mystical Foundation of Authority'," in Deconstruction and the Possibility of Justice, ed. Drucilla Cornell, Michel Rosenfeld, and David Gray Carlson (London: Routledge, 1992), 24. [^]
  28. Louise Amoore, "Lines of Sight: On the Visualization of Unknown Futures," Citizenship Studies 13, no. 1 (2009): 29. [^]
  29. Lindebaum, Vesa, and den Hond, "Insights from 'The Machine Stops' to Better Understand Rational Assumptions in Algorithmic Decision Making and Its Implications for Organizations," 248. [^]
  30. Derrida, "Force of Law: The 'Mystical Foundation of Authority'," 24. [^]
  31. Lisa Blackman, Haunted Data. Affect, Transmedia and Weird Science (London: Bloomsbury, 2016). [^]
  32. Cheney-Lippold, "A New Algorithmic Identity. Soft Biopolitics and the Modulation of Control," 171. [^]
  33. Mikkel Flyverbom and Anders K. Madsen, "Sorting Data Out: Unpacking Big Data Value Chains and Algorithmic Knowledge Production," in Die Gesellschaft der Daten. Über die digitale Transformation der sozialen Ordnung, ed. F. Süssenguth (Bielefeld: transcript, 2015). [^]
  34. Clemens Apprich et al., Pattern Discrimination (Minneapolis: University of Minnesota Press, 2019). [^]
  35. Geoffrey C. Bowker and Susan Leigh Star, Sorting Things Out (Cambridge, Mass.: MIT Press, 2000), 4. [^]
  36. Kitchin, The Data Revolution. Big Data, Data Infrastructures & Their Consequences, 9–10; Rob Kitchin, "Thinking Critically About and Researching Algorithms," Information, Communication and Society 20, no. 1 (2017): 18. [^]
  37. Ian Hacking, "Kinds of People: Moving Targets," in Proceedings of the British Academy, Volume 151, 2006 Lectures (2007). [^]
  38. Bernhard Rieder, "Scrutinizing an Algorithmic Technique: The Bayes Classifier as Interested Reading of Reality," Information, Communication and Society 20, no. 1 (2017). [^]
  39. Kitchin, "Thinking Critically About and Researching Algorithms," 17. [^]
  40. Hildebrandt, "Defining Profiling: A New Type of Knowledge," 18. [^]
  41. Andreas Bernard, The Triumph of Profiling. The Self in Digital Culture (Cambridge: Polity Press, 2019). [^]
  42. Bernard E. Harcourt, Against Prediction. Profiling, Policing, and Punishing in an Actuarial Age (Chicago: University of Chicago Press, 2007). [^]
  43. Colin Koopman, How We Became Our Data. A Genealogy of the Informational Person (Chicago: The University of Chicago Press, 2019). [^]
  44. George S. Odiorne, Strategic Management of Human Resources (San Francisco: Jossey Bass, 1984), 66. [^]
  45. Paula McDonald and Paul Thompson, "Social Media(tion) and the Reshaping of the Public/Private Boundaries in Employment Relations," International Journal of Management Reviews 18 (2016): 69. [^]
  46. John Cheney-Lippold, "A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control," Theory, Culture & Society 28, no. 6 (2011): 164–81. [^]
  47. Hildebrandt, "Defining Profiling: A New Type of Knowledge," 11. [^]
  48. William Bogard, The Simulation of Surveillance. Hypercontrol in Telematic Societies (Cambridge: Cambridge University Press, 2010), 27. [^]
  49. Bogard, The Simulation of Surveillance, 75, emphasis in original. [^]
  50. James M. Harding, Performance, Transparency and the Cultures of Surveillance (Ann Arbor: University of Michigan Press, 2018), pos. 4061. [^]
  51. Avery F. Gordon, Ghostly Matters: Haunting and the Sociological Imagination (Minneapolis: University of Minnesota Press, 1997), 19. [^]
  52. Justine Grønbæk Pors, Lena Olaison, and Birke Otto, "Ghostly Matters in Organizing," Ephemera. Critical Dialogs on Organization 19, no. 1 (2019). [^]
  53. Blackman, Haunted Data. Affect, Transmedia and Weird Science. [^]
  54. In Specters of Marx, Derrida (1994, 126) explains that the production of the ghost, or the "constitution of the ghost effect" is more than the autonomization of thoughts, ideas, data, etc. "For there to be ghost, there must be a return to the body, but to a body that is more abstract than ever. The spectrogenic process corresponds therefore to a paradoxical incorporation. Once ideas or thoughts (Gedanken) are detached from their substratum, one engenders some ghost by giving them a body. Not by returning to the living body from which ideas and thoughts have been torn loose, but by incarnating the latter in another artifactual body, a prosthetic body". [^]
  55. Kevin D. Haggerty and Richard V. Ericson, "The Surveillant Assemblage," British Journal of Sociology 51, no. 4 (2000), 611. [^]
  56. Gordon, Ghostly Matters: Haunting and the Sociological Imagination, 19. [^]
  57. Michel Foucault, Discipline and Punish. The Birth of the Prison, trans. Alan Sheridan (London: Penguin Books, 1977), 184. [^]
  58. "Afterword by Michel Foucault: The Subject and Power," in Michel Foucault. Beyond Structuralism and Hermeneutics, ed. Hubert L. Dreyfus and Paul Rabinow (Chicago: University of Chicago Press, 1983), 212. [^]
  59. Hacking, "Kinds of People: Moving Targets," 367. [^]
  60. Judith Butler, The Psychic Life of Power. Theories in Subjection (Stanford: Stanford University Press, 1997), 90. [^]
  61. Hildebrandt, "Defining Profiling: A New Type of Knowledge;" Leese, "The New Profiling: Algorithm, Black Boxes, and the Failure of Anti-Discriminatory Safeguards in the European Union." [^]
  62. Apprich et al., Pattern Discrimination. [^]
  63. Mireille Hildebrandt, "Who Is Profiling Who? Invisible Visibility," in Reinventing Data Protection, ed. Serge Gutwirth (Berlin: Springer, 2009), 241. [^]
  64. Hans Krause Hansen, "Numerical Operations, Transparency Illusions and the Datafication of Governance," European Journal of Social Theory 18, no. 2 (2015): 203. [^]
  65. Chun, "Queering Homophily." [^]
  66. Mireille Hildebrandt, "Profile Transparency by Design? Re-Enabling Double Contingency," in Privacy, Due Process and the Computational Turn. The Philosophy of Law Meets the Philosophy of Technology, ed. Mireille Hildebrandt and Katja de Vries (Milton Park: Routledge, 2013), 221 (original emphasis). [^]
  67. Louise Amoore, "Data Derivatives: On the Emergence of a Security Risk Calculus for Our Time," Theory, Culture and Society 28, no. 6 (2011): 24–46. [^]
  68. Gilles Deleuze, Negotiations. 1972–1990 (New York: Columbia University Press, 1995), 180. [^]
  69. Hildebrandt, "Profile Transparency by Design? Re-Enabling Double Contingency," 227. [^]
  70. Amoore, "Lines of Sight: On the Visualization of Unknown Futures," 18. [^]
  71. Leese, "The New Profiling: Algorithm, Black Boxes, and the Failure of Anti-Discriminatory Safeguards in the European Union," 508. [^]
  72. Deleuze, Negotiations. 1972–1990, 179. [^]
  73. Mireille Hildebrandt, "Profiling: From Data to Knowledge," DuD. Datenschutz und Sicherheit 30, no. 9 (2006): 550; "Defining Profiling: A New Type of Knowledge," 27. [^]
  74. Benjamin Nathan Waber, People Analytics: How Social Sensing Technology Will Transform Business and What It Tells Us About the Future of Work (Upper Saddle River, N.J.: FT Press, 2013). [^]
  75. McAfee and Brynjolfsson, "Big Data: The Management Revolution," 5. [^]
  76. Amoore, "Data Derivatives: On the Emergence of a Security Risk Calculus for Our Time," 27. [^]
  77. Zuboff, The Age of Surveillance Capitalism. The Fight for a Human Future at the New Frontier of Power, 19. [^]
  78. David Lyon, "Surveillance, Snowden, and Big Data: Capacities, Consequences, Critique," Big Data & Society (July–December 2014): 7. [^]
  79. Jacques Derrida, As If I Were Dead. An Interview with Jacques Derrida/Als ob ich tot wäre. Ein Interview mit Jacques Derrida, trans. Ulrike Oudée Dinkelsbühler et al. (Vienna: Turia + Kant, 2000), 26. [^]
  80. Derrida, As If I Were Dead, 28. [^]
  81. Louise Amoore, The Politics of Possibility (Durham: Duke University Press, 2013), 147. [^]
  82. Ariane Ellerbrok, "Empowerment: Analysing Technologies of Multiple Variable Visibility," Surveillance & Society 8, no. 2 (2010). [^]
  83. Bruno Latour, Science in Action (Milton Keynes: Open University Press, 1987), 179–257. [^]
  84. Zuboff, The Age of Surveillance Capitalism. The Fight for a Human Future at the New Frontier of Power. [^]
  85. Matthew Crain, "The Limits of Transparency: Data Brokers and Commodification," New Media & Society 20, no. 1 (2018). [^]
  86. Frank Pasquale, The Black Box Society. The Secret Algorithms That Control Money and Information (Cambridge, Massachusetts: Harvard University Press, 2015), 32. [^]
  87. Hansen, "Numerical Operations, Transparency Illusions and the Datafication of Governance," 206. [^]
  88. Pasquale, The Black Box Society. The Secret Algorithms That Control Money and Information, 32. [^]
  89. David Lyon, "Surveillance as Social Sorting. Computer Codes and Mobile Bodies," in Surveillance as Social Sorting, ed. David Lyon (London, New York: Routledge, 2003), 20. [^]
  90. Leese, "The New Profiling: Algorithm, Black Boxes, and the Failure of Anti-Discriminatory Safeguards in the European Union," 504–05. [^]
  91. Leese, "The New Profiling: Algorithm, Black Boxes, and the Failure of Anti-Discriminatory Safeguards in the European Union," 505. [^]
  92. Paul B. De Laat, "Algorithmic Decision-Making Based on Machine Learning from Big Data: Can Transparency Restore Accountability?," Philosophy & Technology 31 (2018). [^]
  93. Zygmunt Bauman, Modernity and the Holocaust (Cambridge: Polity Press, 1991). [^]
  94. Bauman, Modernity and the Holocaust, 214. [^]
  95. Derrida, Specters of Marx. The State of the Debt, the Work of Mourning, & the New International, 7. [^]
  96. Greg Elmer, Profiling Machines: Mapping the Personal Information Economy (Cambridge, MA: MIT Press, 2003). [^]
  97. Foucault, Discipline and Punish. The Birth of the Prison. [^]
  98. Mikkel Flyverbom, The Digital Prism. Transparency and Managed Visibilities in a Datafied World (New York: Cambridge University Press, 2019). [^]
  99. Kitchin, The Data Revolution. Big Data, Data Infrastructures & Their Consequences, 167. [^]
  100. Ellerbrok, "Empowerment: Analysing Technologies of Multiple Variable Visibility." [^]
  101. Rouvroy, "The End(s) of Critique," 144–45. [^]
  102. Rouvroy, "The End(s) of Critique," 153. [^]
  103. Slavoj Žižek, The Ticklish Subject: The Absent Centre of Political Ontology (London: Verso, 1999), 260 (original modified, Žižek uses the term "mode of subjectivization"). [^]
  104. Michel Foucault, The History of Sexuality. Volume 1 (London: Penguin Books, 1981), 59. [^]
  105. Bernard E. Harcourt, Exposed. Desire and Disobedience in the Digital Age (Cambridge, Mass.: Harvard University Press, 2015). [^]
  106. Foucault, On the Government of the Living. Lectures at the Collège de France 1979–1980 and Oedipal Knowledge, 13. [^]
  107. Foucault, On the Government of the Living. Lectures at the Collège de France 1979–1980 and Oedipal Knowledge, 13. [^]
  108. Foucault, On the Government of the Living. Lectures at the Collège de France 1979–1980 and Oedipal Knowledge, 14. [^]
  109. Zuboff, The Age of Surveillance Capitalism. The Fight for a Human Future at the New Frontier of Power. [^]
  110. Alex Pentland, "The Data-Driven Society," Scientific American 309, no. 3 (2013): 10. [^]
  111. Michel Foucault, The Courage of Truth. The Government of Self and Others II. Lectures at the Collège de France 1983–1984 (Palgrave Macmillan, 2011), 15–31. Foucault here distinguishes four basic modalities of truth-telling (veridiction). [^]
  112. Zuboff, The Age of Surveillance Capitalism. The Fight for a Human Future at the New Frontier of Power, 433. [^]
  113. Zuboff, The Age of Surveillance Capitalism. The Fight for a Human Future at the New Frontier of Power, 254. [^]
  114. Rouvroy, "The End(s) of Critique. Data-Behaviourism Versus Due-Process." [^]
  115. Michel Foucault, Security, Territory, Population. Lectures at the Collège de France 1977–78 (London: Palgrave Macmillan, 2007), 56–59. [^]
  116. Foucault, Security, Territory, Population, 65. [^]

References

Amoore, Louise. "Data Derivatives: On the Emergence of a Security Risk Calculus for Our Time." Theory, Culture and Society 28, no. 6 (2011): 24–43. DOI:  http://doi.org/10.1177/0263276411417430

Amoore, Louise. "Lines of Sight: On the Visualization of Unknown Futures." Citizenship Studies 13, no. 1 (2009): 17–30. DOI:  http://doi.org/10.1080/13621020802586628

Amoore, Louise. The Politics of Possibility. Durham: Duke University Press, 2013. DOI:  http://doi.org/10.1215/9780822377269

Apprich, Clemens, Wendy Hui Kyong Chun, Florian Cramer, and Hito Steyerl. Pattern Discrimination. Minneapolis: University of Minnesota Press, 2019.

Ball, James. Post-Truth. How Bullshit Conquered the World. London: Biteback, 2017.

Bauman, Zygmunt. Modernity and the Holocaust. Cambridge: Polity Press, 1991.

Bernard, Andreas. The Triumph of Profiling. The Self in Digital Culture. Cambridge: Polity Press, 2019.

Blackman, Lisa. Haunted Data. Affect, Transmedia and Weird Science. London: Bloomsbury, 2016.

Bogard, William. The Simulation of Surveillance. Hypercontrol in Telematic Societies. Cambridge: Cambridge University Press, 2010.

Bowker, Geoffrey C., and Susan Leigh Star. Sorting Things Out. Cambridge, Mass.: MIT Press, 2000.

Butler, Judith. The Psychic Life of Power. Theories in Subjection. Stanford: Stanford University Press, 1997.

Cadwalladr, Carole. "The Cambridge Analytica Files. 'I Made Steve Bannon's Psychological Warfare Tool': Meet the Data War Whistleblower." The Guardian, March 18, 2018, https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump.

Cheney-Lippold, John. "A New Algorithmic Identity. Soft Biopolitics and the Modulation of Control." Theory, Culture & Society 28, no. 6 (2011): 164–81. DOI:  http://doi.org/10.1177/0263276411424420

Chun, Wendy Hui Kyong. "Queering Homophily." In Pattern Discrimination. In Search of Media, edited by Clemens Apprich, Wendy Hui Kyong Chun, Florian Cramer and Hito Steyerl. Minneapolis: University of Minnesota Press, 2019.

Crain, Matthew. "The Limits of Transparency: Data Brokers and Commodification." New Media & Society 20, no. 1 (2018): 88–104. DOI:  http://doi.org/10.1177/1461444816657096

De Laat, Paul B. "Algorithmic Decision-Making Based on Machine Learning from Big Data: Can Transparency Restore Accountability?" Philosophy & Technology 31 (2018): 525–541. DOI:  http://doi.org/10.1007/s13347-017-0293-z

Deleuze, Gilles. Negotiations. 1972–1990. New York: Columbia University Press, 1995.

Derrida, Jacques. As If I Were Dead. An Interview with Jacques Derrida/Als ob ich tot wäre. Ein Interview mit Jacques Derrida. Translated by Ulrike Oudée Dinkelsbühler et al. Vienna: Turia + Kant, 2000.

Derrida, Jacques. "Force of Law: The 'Mystical Foundation of Authority'." In Deconstruction and the Possibility of Justice, edited by Drucilla Cornell, Michel Rosenfeld and David Gray Carlson, 3–67. London: Routledge, 1992.

Derrida, Jacques. Specters of Marx. The State of the Debt, the Work of Mourning, & the New International. New York, London: Routledge, 1994.

Edwards, Martin, and Kirstin Edwards. Predictive HR Analytics: Mastering the HR Metric. London: Kogan Page, 2019.

Ellerbrok, Ariane. "Empowerment: Analysing Technologies of Multiple Variable Visibility." Surveillance & Society 8, no. 2 (2010): 200–20. DOI:  http://doi.org/10.24908/ss.v8i2.3486

Elmer, Greg. Profiling Machines: Mapping the Personal Information Economy. Cambridge, MA: MIT Press, 2003. DOI:  http://doi.org/10.7551/mitpress/5614.001.0001

Flyverbom, Mikkel. The Digital Prism. Transparency and Managed Visibilities in a Datafied World. New York: Cambridge University Press, 2019. DOI:  http://doi.org/10.1017/9781316442692

Flyverbom, Mikkel, and Anders K. Madsen. "Sorting Data Out: Unpacking Big Data Value Chains and Algorithmic Knowledge Production." In Die Gesellschaft der Daten. Über die digitale Transformation der sozialen Ordnung, edited by F. Süssenguth. Bielefeld: transcript, 2015.

Foucault, Michel. "Afterword by Michel Foucault: The Subject and Power." In Michel Foucault. Beyond Structuralism and Hermeneutics, edited by Hubert L. Dreyfus and Paul Rabinow, 208–28. Chicago: University of Chicago Press, 1983.

Foucault, Michel. Discipline and Punish. The Birth of the Prison. Translated by Alan Sheridan. London: Penguin Books, 1977.

Foucault, Michel. On the Government of the Living. Lectures at the Collège de France 1979–1980 and Oedipal Knowledge. New York: Picador, 2016.

Foucault, Michel. "Questions of Method." In The Foucault Effect, edited by Graham Burchell, Colin Gordon and Peter Miller, 73–86. London: Harvester Wheatsheaf, 1991.

Foucault, Michel. Security, Territory, Population. Lectures at the Collège de France 1977–78. London: Palgrave Macmillan, 2007.

Foucault, Michel. The Courage of Truth. The Government of Self and Others II. Lectures at the Collège de France 1983–1984. Palgrave Macmillan, 2011. DOI:  http://doi.org/10.1057/9780230274730

Foucault, Michel. The History of Sexuality. Volume 1. Translated by Robert Hurley. London: Penguin Books, 1981.

Gandy, Oscar H. The Panoptic Sort: A Political Economy of Personal Information. Critical Studies in Communication and in the Cultural Industries. Boulder: Westview Press, 1993. DOI:  http://doi.org/10.1080/15295039309366849

Gordon, Avery F. Ghostly Matters: Haunting and the Sociological Imagination. Minneapolis: University of Minnesota Press, 1997.

Hacking, Ian. "Kinds of People: Moving Targets." In Proceedings of the British Academy, Volume 151, 2006 Lectures, 285–318, 2007. DOI:  http://doi.org/10.5871/bacad/9780197264249.003.0010

Haggerty, Kevin D., and Richard V. Ericson. "The Surveillant Assemblage." British Journal of Sociology 51, no. 4 (2000): 605–22. DOI:  http://doi.org/10.1080/00071310020015280

Hansen, Hans Krause. "Numerical Operations, Transparency Illusions and the Datafication of Governance." European Journal of Social Theory 18, no. 2 (2015): 203–20. DOI:  http://doi.org/10.1177/1368431014555260

Harcourt, Bernard E. Against Prediction. Profiling, Policing, and Punishing in an Actuarial Age. Chicago: University of Chicago Press, 2007. DOI:  http://doi.org/10.7208/chicago/9780226315997.001.0001

Harcourt, Bernard E. Exposed. Desire and Disobedience in the Digital Age. Cambridge, Mass.: Harvard University Press, 2015. DOI:  http://doi.org/10.4159/9780674915077

Harding, James M. Performance, Transparency and the Cultures of Surveillance. Ann Arbor: University of Michigan Press, 2018.

Hildebrandt, Mireille. "Defining Profiling: A New Type of Knowledge." Chap. 2 In Profing the European Citizen, edited by Marielle Hildebrandt and Serge Gutwirth, 17–30. Berlin: Springer, 2008. DOI:  http://doi.org/10.1007/978-1-4020-6914-7_2

Hildebrandt, Mireille. "Profile Transparency by Design? Re-Enabling Double Contingency." In Privacy, Due Process and the Computational Turn. The Philosophy of Law Meets the Philosophy of Technology, edited by Mireille Hildebrandt and Katja de Vries, 221–46. Milton Park: Routledge, 2013. DOI:  http://doi.org/10.4324/9780203427644

Hildebrandt, Mireille. "Profiling: From Data to Knowledge." DuD. Datenschutz und Sicherhet 30, no. 9 (2006): 548–52. DOI:  http://doi.org/10.1007/s11623-006-0140-3

Hildebrandt, Mireille. "Who Is Profiling Who? Invisible Visibility." In Reinventing Data Protection, edited by Serge Gutwirth, 239–52. Berlin: Springer, 2009. DOI:  http://doi.org/10.1007/978-1-4020-9498-9_14

Kitchin, Rob. The Data Revolution. Big Data, Data Infrastructures & Their Consequences. Los Angeles: Sage, 2014. DOI:  http://doi.org/10.4135/9781473909472

Kitchin, Rob. "Thinking Critically About and Researching Algorithms." Information, Communication and Society 20, no. 1 (2017): 14–29. DOI:  http://doi.org/10.1080/1369118X.2016.1154087

Koopman, Colin. How We Became Our Data. A Genealogy of the Informational Person. Chicago: The University of Chicago Press, 2019. DOI:  http://doi.org/10.7208/chicago/9780226626611.001.0001

Latour, Bruno. Science in Action. Milton Keynes: Open University Press, 1987.

Leese, Mathias. "The New Profiling: Algorithm, Black Boxes, and the Failure of Anti-Discriminatory Safeguards in the European Union." Security Dialogue 45, no. 5 (2014): 494–511. DOI:  http://doi.org/10.1177/0967010614544204

Lindebaum, Dirk, Mikko Vesa, and Frank den Hond. "Insights from 'The Machine Stops' to Better Understand Rational Assumptions in Algorithmic Decision Making and Its Implications for Organizations." Academy of Management Review 45, no. 1 (2020): 247–263. DOI:  http://doi.org/10.5465/amr.2018.0181

Lyon, David. "Surveillance as Social Sorting. Computer Codes and Mobile Bodies." In Surveillance as Social Sorting, edited by David Lyon. London, New York: Routledge, 2003.

Lyon, David. "Surveillance, Snowden, and Big Data: Capacities, Consequences, Critique." Big Data & Society July–December (2014): 1–13. DOI:  http://doi.org/10.1177/2053951714541861

Mau, Steffen. Das metrische Wir. Über die Quantifizierung des Sozialen. Frankfurt a.M.: Suhrkamp, 2017.

Mayer-Schönberger, Viktor, and Kenneth Cukier. Big Data. A Revolution That Will Transform How We Live, Work and Think. London: John Murray, 2013.

McAfee, Andrew, and Erik Brynjolfsson. "Big Data: The Management Revolution." Harvard Business Review (October 2012): 1–16. https://hbr.org/2012/10/big-data-the-management-revolution.

McDonald, Paula, and Paul Thompson. "Social Media(tion) and the Reshaping of the Public/Private Boundaries in Employment Relations." International Journal of Management Reviews 18 (2016): 69–84. DOI:  http://doi.org/10.1111/ijmr.12061

Newell, Sue, and Marco Marabelli. "Strategic Opportunities (and Challenges) of Algorithmic Decision-Making: A Call for Action on the Long-Term Societal Effects of 'Datafication'." Journal of Strategic Information Systems 24 (2015): 3–14. DOI:  http://doi.org/10.1016/j.jsis.2015.02.001

Odiorne, George S. Strategic Management of Human Resources. San Francisco: Jossey Bass, 1984.

Orwell, George. Nineteen Eighty-Four. Essex: Heinemann, 1949/1990.

Pasquale, Frank. The Black Box Society. The Secret Algorithms That Control Money and Information. Cambridge, Massachusetts: Harvard University Press, 2015. DOI:  http://doi.org/10.4159/harvard.9780674736061

Pentland, Alex. "The Data-Driven Society." Scientific American 309, no. 3 (2013): 78–83. DOI:  http://doi.org/10.1038/scientificamerican1013-78

Pors, Justine Grønbæk, Lena Olaison, and Birke Otto. "Ghostly Matters in Organizing." Ephemera. Critical Dialogs on Organization 19, no. 1 (2019): 1–29.

Rieder, Bernhard. "Scrutinizing an Algorithmic Technique: The Bayes Classifier as Interested Reading of Reality." Information, Communication and Society 20, no. 1 (2017): 100–17. DOI:  http://doi.org/10.1080/1369118X.2016.1181195

Rouvroy, Antoinette. "The End(s) of Critique. Data-Behaviourism Versus Due-Process." In Privacy, Due Process and the Computational Turn. Philosophers of Law Meet Philosophers of Technology, edited by Mireille Hildebrandt and Katja de Vries, 143–68. London: Routledge, 2013.

Sprinker, Michael, ed. Ghostly Demarcations. A Symposium on Jacques Derrida's Specters of Marx. London: Verso, 1999.

Waber, Benjamin Nathan. People Analytics: How Social Sensing Technology Will Transform Business and What It Tells Us About the Future of Work. Upper Saddle River, N.J.: FT Press, 2013.

Youyou, Wu, Michal Kosinski, and David Stillwell. "Computer-Based Personality Judgments Are More Accurate Than Those Made by Humans." PNAS 112, no. 4 (2015): 1036–40. DOI:  http://doi.org/10.1073/pnas.1418680112

Žižek, Slavoj. The Ticklish Subject: The Absent Centre of Political Ontology. London: Verso, 1999.

Zuboff, Shoshana. The Age of Surveillance Capitalism. The Fight for a Human Future at the New Frontier of Power. New York: Public Affairs, 2019.