From the July/August 2014 issue
INTERVIEW: Rob Horning
Rob Horning is executive editor of the online platform The New Inquiry and author of the Marginal Utility blog. He has written
extensively on social media and the sociology of consumption. Here he discusses the impact of technology on our identities, and
the ways in which social media make possible new forms of subjectivity.
Zach Blas, Facial Weaponization Suite: Militancy, Vulnerability, Obfuscation, tableau vivant, June 7, 2013, San Diego, CA [courtesy of the artist, photo: Tanner Cook]
Dan Weiskopf: The idea used to be that each of us has an authentic self, defined by control over its boundaries,
privacy, and autonomy. How do networked social practices subvert this ideology of individual authenticity?
Rob Horning: Authenticity matters only when people are onstage having their behavior evaluated. Social media make it
more obvious that authenticity is a set of practices instead of a state of being. In many ways, social media provide more
control over staging the self rather than less, revealing authenticity as an on-demand project rather than a spontaneous
expression of inner truth. Social media also broaden the ways in which confirmation of "authenticity" can be delivered. It becomes
a matter of metrics—the quantified response we get to our mediated gestures, and also the way the media feeds we consume are
reshaped according to our behavior.
DW: In our everyday, small-group, face-to-face interactions, we always search for hints of whether we are being
socially included. Did people smile at what I said? Are they looking at me or turning their backs? Digital interactions
have an astonishing ability to co-opt these pre-existing social reward systems. Doesn't it seem that our sense of how well
we are succeeding at "performing ourselves" is indifferent to the channel over which these affirmations come, whether
it's a "like" or a "retweet" online or a thumbs up in real life?
RH: It may be that "reward signals" are fungible and have no essential content, but are instead produced by systems
of engagement. In-person social interaction is one system for yielding rewards, but not necessarily the "real" one that
gets co-opted by other systems. Arguably, social media and other pseudosocial reward-delivery systems are more psychologically
significant than face-to-face interaction, because the presence of the other limits the degree to which we can solipsistically
plug in to our own internal reward mechanisms.
DW: The prospect of constant evaluation is central to most of these social platforms. The little status box is always
sitting there like a hungry mouth, and if you feed it the system can potentially dispense tingles of approval at any time.
It's always on, and always empty—which sort of mirrors the vague emptiness we feel in the absence of any external guidance
about how we ought to be. It's fascinating: if you just give people these little boxes, they will type literally anything into
them. Zygmunt Bauman has even suggested that "we seem to experience no joy in having secrets" anymore. The pleasure of privacy
has been eroded by the pleasures of constant disclosure.
RH: Yes, I think that is right. An internal emptiness is evoked and confirmed by the emptiness of profiles, which are
also experienced as an open-ended "freedom" from responsibility to others. Yet it is precisely that responsibility which
ultimately limits the emptiness, shapes it through constraints, and makes it something within which we can account for
ourselves. No one can see themselves in limitlessness,
in the infinite. But I don't know if this has anything to do with secrecy or privacy; the sense of limits is not a matter of keeping
things back from others or developing a negative theology of the self that says the "real" me is what can't be shared in social media.
It's more that the self becomes real, the individual becomes delimited, through discrete social interactions that permit the individual
to emerge from the social backdrop necessary to foster individuality.
DW: Our classifications are rarely inert: often, people's behavior is reshaped by being assigned to a particular social category
that has a label and a set of expected behaviors and traits associated with it. So, what happens when the classifications invented for
human beings are not created by other people—clinicians, statisticians, and bureaucrats—but rather by algorithms? This is an unprecedented
kind of interaction, and the categories that these systems come up with may correspond to ones that no person ever created, because they're
products of predictive algorithms operating over amounts of data so vast no person could survey them. Who knows how these systems see
the world, what categories they operate with, or how these classifications could shape us?
RH: That reminds me of Netflix's efforts to create user-specific genres using algorithms that analyze one's viewing and browsing
history. It's a way to make taste—which is inherently social, like the experience of individuality—into something that can function
without any shared experiences with others. You have a genre that's only for you; no one else understands its conventions, and it
forms no collective audiences with shared expectations. Instead, it isolates. What Netflix accomplishes by trying to atomize taste
is to give users a different, nonsocial game to play with their identity, one that consists of manipulating decontextualized toy
blocks into effigies of the self—catering to the dream, if anyone really wants this, of a nonsocial self that can be built and
entertainingly toyed with, without the risk of others' judgment. Algorithms stand in for the approving other in the circuit of self-production.
DW: A self shaped by inscrutable categories embodied in software seems even more insidious than conscious attempts to
craft a personal brand!
RH: I'm not sure which is worse. Self-branding demands that one inhabit an anxious, defensive sort of subjectivity, whereas
the subjectivity that derives from being immersed in an algorithmically shaped universe at least offers pleasures, albeit passive ones.
It seems better to consume oneself as a product rather than struggle to make oneself into a product.
Dan Weiskopf is Associate Professor of Philosophy and an associate member
of the Neuroscience Institute at Georgia State University. His research focuses
on the nature of representation in cognition, science, and art.
Read the feature article that accompanied this interview:
Picturing the Self in the Age of Data
by Dan Weiskopf