Reimagining Privacy Online Through A Spectrum of Intimacy
November 12, 2019
Caroline Sinders and Hyphen-Labs
This essay was written in conjunction with Higher Resolution, an exhibition by Hyphen-Labs and Caroline Sinders for the Tate museum’s Tate Exchange program. The exhibition was on view in September 2019.
AI is affecting society in a myriad of ways, and it’s changing the face of technology. What makes AI important is that it’s not just one thing, but rather a combination of data, human input, product design, and the machine learning algorithms used. Put simply, AI is the combination of a certain kind of technology (an algorithm) and a large data set. Crucially, that data often comes from people.
Data is a variety of inputs that people make, such as what they choose to like online, what they view or buy, how often they use something, and when they use something. This data is fed into AI and machine learning systems that exist within products, social networks, platforms, and websites.
The Higher Resolution exhibition, which ran for two weeks in September 2019 at the Tate museum, sought to make AI more understandable by breaking it down into the pieces of a system. Part of the exhibition was an original essay that we physically installed in the space. This essay uses physical structures as metaphors to describe the spectrum of privacy and intimacy in digital spaces, and it explores how users’ thoughts and conversations play out in communication apps and social networks.
At this moment in software design, privacy settings are strictly binary (public or private), even as our conversations and emotions exist within gradients of intimacy and a myriad of in-betweens. Not every conversation we have is simply public or private; the intimacy of a conversation changes with the context of where that conversation exists or what the conversation is about. What we say in a classroom can differ from what we say at a town hall, in a public protest, with a partner, or with a family member. The lack of privacy gradients in the design of our social networks, online communication platforms, and apps facilitates everything from harassment to violations of user privacy.
Privacy manifested through better-designed channels of intimacy—like platform designs that allow smaller groups to engage (e.g., private Slack channels or Facebook groups) or settings that let users share information with fewer people—is as important for protecting users as privacy and security protocols. Not all thoughts or conversations should be entirely public. There should be spaces for private and semi-private thoughts. The physical structures of the exhibition—the town hall, the park bench, the living room, and the loo—are designed as metaphors for describing the different areas of online intimacy, from the most public to the most private.
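The difference between a binary setting and a privacy gradient can be sketched as a small data model. This is an illustrative sketch only—the level names mirror the exhibition's metaphors and do not correspond to any real platform's API:

```python
from enum import IntEnum

class Audience(IntEnum):
    """A privacy gradient rather than a public/private binary.

    Levels loosely mirror the essay's metaphors, from the town hall
    (fully public) down to the loo (most intimate). Hypothetical
    names, not any platform's real settings.
    """
    TOWN_HALL = 4    # anyone at all
    PARK_BENCH = 3   # anyone on the platform
    LIVING_ROOM = 2  # a chosen group
    LOO = 1          # one or two trusted people

def can_view(post_audience: Audience, viewer_relation: Audience) -> bool:
    """A viewer sees a post only if their relationship to the author
    is at least as intimate as the post's audience requires."""
    return viewer_relation <= post_audience

def binary_can_view(is_public: bool, is_friend: bool) -> bool:
    """The binary model collapses the whole gradient into two buckets."""
    return is_public or is_friend
```

In the gradient model, a post shared at the living-room level is visible to living-room and loo relationships but invisible to strangers, whereas the binary model can only offer "everyone" or "friends."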
We’re not the first to find offline metaphors a helpful aid in illustrating how to design for communities and individuals in digital spaces. During her talk at Hardwired NYC in March, Michelle Cortese described a similar theory for designing in VR and AR, which she will explore further in a forthcoming book co-written with Andrea Zeller. Cortese and Zeller maintain that harassment is a burgeoning problem in VR and that the problem is amplified by the design of VR spaces.
As Cortese explained in her talk, one challenge in addressing harassment in VR is that things feel extremely real. Similarly, digital spaces or social networks like Twitter and Facebook take on a certain “realness” as well. Our online lives are as real as our offline lives—we fall in love online, we make friends online. Our offline and online lives are incredibly intertwined. Cortese highlights proxemics as a way to think about users and designing for the spaces around those users. Proxemics, a term coined by anthropologist Edward T. Hall, defines the relationships between a person and their identity, their surroundings, and the social norms of the community around them. There are four zones in proxemics: the intimate, the personal, the social, and the public.
Cortese and Zeller coined the term “body sovereignty” to guide VR creators in conceptualizing how to design safe and inclusive spaces. In essence, they are defining a need for user agency and user consent in digital spaces. Body sovereignty also explains the necessity for users to be able to move between the variations of public and private in digital spaces, be it a social network, a massively multiplayer online game (MMO), or a VR setting.
Cortese and Zeller have a similar theory to ours: that by looking at offline consent mechanisms, we can design better online consent mechanisms in open VR spaces. Or, as our Higher Resolution exhibit shows, using non-digital spatial design and architecture as metaphors can act as guidance towards thinking about privacy in online spaces and how to create privacy gradients and user consent.
Our privacy and intimacy metaphors:
The town hall
is the most public digital gathering place, somewhat like Twitter. This is where we shout our thoughts or share things we don’t mind thousands of people seeing. The town hall is a public square for speaking loudly and deliberately. Your thoughts can spread virally; they will be heard, amplified, and sometimes misinterpreted.
The park bench
is semi-public. Like walking down the street in conversation with a coworker or friend, or having a discussion on the tube or in a pub, it is a space where anyone can hold a conversation between two or a few people, but that conversation takes place in public. Those in the conversation can control who hears it by lowering their voices or walking to a less populated area. This setting is like Facebook: the content you put on Facebook cannot be accessed outside of Facebook, unlike on Twitter, Sina Weibo, and others. This little bit of friction creates a higher level of intimacy than the town square, and the result is that it feels slightly more private. Depending upon a user’s settings, content or conversations can be accessed only by people on Facebook (quite a large group), only by a user’s friends, or only by their friends’ friends.
The next metaphor,
the living room
, highlights a shift into the “privacy” end of the spectrum, with the town hall and park bench being “public” entities. It’s semi-private, but can also host large groups and conversations that are designed to be public, private, or in-between. This setting allows for more intimacy because it allows for a smaller group. This design functions much like a salon or a group gathered for lively debate. The living room is a metaphor for a closed Facebook group or a WhatsApp chat group.
The loo
is the most private of the intimacy metaphors, and the most intimate place for conversations and activities. This is like a private DM or a text message between one or two friends or family members. It is a space to share your thoughts. Secrets are welcomed, and comfortably kept. One can also think of this metaphor as the “bedroom,” an equally intimate space where only a few people are invited in.
Our metaphors will not work as a literal guidepost for solving every problem within digital conversations, but we offer these as provocations for looking at how the form and design of a space creates the affordances and functions in that space.
To start creating solutions for online harassment, tracking, and targeting in social networks, and to create better protections for users online, communication apps and online technology need to treat privacy not just as a security protocol, but as an intimate setting—something that is already an organic part of our lives. This privacy needs to be designed into how conversations unfold. In practice, this could mean better privacy filters to create small and large groups easily, the ability to turn off comments or replies, the ability to easily share posts or content with a handful of people, and security protocols that protect users’ data and online behavior.
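The per-post controls described above can be sketched as a minimal settings object. The names here are hypothetical, invented for illustration rather than taken from any platform:

```python
from dataclasses import dataclass, field

@dataclass
class PostSettings:
    """Hypothetical per-post privacy controls, sketching the ideas
    above: share with a handful of named people, and optionally
    turn off replies. Not any real platform's API."""
    audience: set = field(default_factory=set)  # empty set = public
    replies_enabled: bool = True

    def visible_to(self, user: str) -> bool:
        # A public post is visible to everyone; otherwise only
        # to the chosen handful of people.
        return not self.audience or user in self.audience
```

A post could then default to public while letting the author narrow it to a living-room- or loo-sized audience per conversation, rather than choosing once between "public" and "private" for an entire account.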
We need town halls, park benches, living rooms, and loos online in every platform and every piece of technology that hosts social interactions.
Caroline Sinders is a machine learning design researcher and artist. For the past few years, she has been focusing on the intersections of natural language processing, artificial intelligence, abuse, online harassment, and politics in digital, conversational spaces. Caroline is the founder of Convocation Design + Research, a design and research agency focusing on the intersections of machine learning, user research, designing for public good, and solving difficult communication problems. As a designer and researcher, she’s worked with groups like Amnesty International, Intel, IBM Watson, and the Wikimedia Foundation, among others.
Caroline has held residencies and fellowships with Google’s PAIR (People and Artificial Intelligence Research lab) as a writer in residence, the Yerba Buena Center for the Arts, Eyebeam, the Studio for Creative Inquiry, and the International Center of Photography. Her work has been featured at MoMA PS1, the Houston Center for Contemporary Art, Slate, Quartz, and the Channels Biennale, among others. Caroline holds a master’s degree from New York University’s Interactive Telecommunications Program.
Hyphen-Labs is an international collective working at the intersection of technology, art, science, and the future. Through their global vision and multi-disciplinary backgrounds they are driven to create engaging ways to explore planetary-centered design. In the process they challenge conventions and stimulate conversations, placing collective needs and experiences at the center of evolving narratives.