It gives me peace of mind to think that maybe, just maybe, my digital soul will survive into the far future — long after my meatspace body has decomposed.
Before I started posting, my thoughts were more abstract and non-verbal — like blobs of play-doh floating around a zero gravity chamber. I used to spend a lot of time re-molding these amorphous thought forms into Twitter-friendly nuggets. But nowadays my internal monologue speaks tweet by default. Thoughts bubble up from the depths of my psyche readymade for the timeline, already twisted into the pre-programmed shape of a Post. I wonder if the algorithm is starting to interfere with the way my subconscious works.
Sometimes when I wake up in the morning, I’ll scroll through my old posts just to remind myself of myself. It feels like looking in the mirror. I’m swallowing my (digital) self so that I’m me instead of someone else.
When I look at the gradient of a beautiful sunset, or the hairline cracks in a concrete sidewalk, or the murky texture of a cumulonimbus cloud, I can’t help but think about how hard it’d be to make something like that in Unity. It’s almost like the world is a big game engine and God is the supernatural renderer. Maybe one of these days I’ll turn around too quickly and spot a glitch.
When I’m hanging out with friends who spend a lot of time on the app, we can basically speak a whole different language together. We’ve memorized enough videos that we can have a whole ass conversation using only obscure quotes and dances. It’s a lot more full bodied than regular English. Sometimes I forget how dense the memes are until I try to explain them to my parents and realize that there are layers and layers of references and duets compressed into one 60 second vid. I vibe a lot better with people who are steeped in TikTok culture because the emotions they express are usually based on popular trends, which makes them way easier for me to relate to.
But my digitally augmented memory is a dumb nostalgia machine that dispenses pellets of my past to boost engagement numbers. Whenever I take a photo, I’m always thinking about how my future self is going to end up consuming it as a memory.
Like, the other day, I went to send a text to a friend I hadn't talked to in a while. I was expecting a blank canvas, but instead our thread was polluted by an awkward conversation we had 4 years ago. I'm strung out across time, haunted by the ghosts of my old messages, statuses, photos, videos. Another weird thing about social media is that when you change your profile pic, it also changes the profile pic on all your old posts. It's super jarring to see something I wrote a long time ago right next to a picture of what I look like today. That photo of me next to those words… they aren't even the same people!
It’s easy to lie to yourself about what you really like and who you really are, but your recommendation algorithms and search history keep you honest. It took me a solid 20 years to figure out that I’m gay but the TikTok algorithm figured it out in like 47 minutes lol.
At this point I can almost smell the demographic labels I’ve been tagged with, and it’s honestly kind of comforting to feel known. Who needs astrology or Myers Briggs when you have all these apps?
I know you spoke with Audrey Tang, the Digital Minister of Taiwan, and I think the work that she's doing there, which we've covered on our podcast as well, represents really thinking about how to reboot the core principles of democracy, but in a digital way for the 21st century, under the threat of China trying to sow disinformation in Taiwan and being able to do so reasonably successfully, and producing a more coherent society. And you've always said the goal of democracy and information technologies isn't just connecting people, because it's interesting that as soon as we connected people everywhere, the most popular technology in the world to build was stone walls. The real goal should be harmonizing people. And I think that goal is a really wise one, and rediscovering what we really want here, because, to maybe take it full circle, this is Aza's line from the past: if you go back to the original problem statement that we started this interview with, that the problem of humanity is our paleolithic emotions, medieval institutions, and God-like technology, then the answer might be something like: we have to understand and embrace our paleolithic emotions, we have to upgrade our medieval institutions and philosophy, and we have to have the wisdom to guide our God-like technology.
Elections are one way to safeguard that: every person has a vote and can express his or her opinions. But there are other important tools, like the separation of powers, courts that are independent, media that are independent, and basic civil and human rights which cannot be violated even if the majority is in favor of violating them. That's at least as important as having elections, if not more important. And what's happening now is that this traditional tool of elections has become even more problematic, because it's becoming increasingly easy to manipulate.
When you build a massive system based on surveillance and data processing, it's the kind of system that, by definition, a human being cannot understand. So you are building a system that will inevitably escape not just your control, but your understanding.
I would say we are seeing the collapse of nationalism. I talked earlier about the positive side of nationalism: nationalism not as hatred of foreigners and minorities, but nationalism as feeling solidarity with millions of strangers in your country, caring about them, feeling that you share interests with them. So, for instance, you are willing to pay taxes so that they will have good health care and education. And we are seeing the collapse of this kind of nationalism all over the world. Many leaders who present themselves as nationalists, like Donald Trump or Bolsonaro, are actually anti-nationalists. They are doing their best to destroy the national community and the bonds of national solidarity. We have reached a point in the US where Americans are more afraid of each other than they are of anybody else on the planet.
I think also, Yuval, you brought up the point about the temptation to see under the skin with COVID: for governments to want to verify, "Okay, are you actually on lockdown for those 14 days? I'm going to want to know more about whether you are sick or not sick, and whether you've been moving or not moving." And the problem is that once you grant either governments or technology companies the power to know all these things about us and to share them for the "greater good," it can also be used for evil. So we have to be very careful about what we allow companies to know about us. But the thing, Yuval, that I think is really the sweet spot of intersection between your work and ours is that technology is already beneath the skin. Aza and I have been tracking several examples of the ability to predict things about you without making an actual insertion underneath the skin layer. And I would say that more than getting underneath our skin, they can get underneath the future. They can find and predict things about us that we won't know about ourselves. The Gottmans have done research showing that with three minutes of videotape of a couple talking to each other, with the audio taken out, you can predict whether they will stay together with something like 70% accuracy, with just three minutes of silent videotape. You can actually predict whether someone's about to commit suicide. You can predict divorce rates of couples. You can predict whether someone is going to have an eating disorder based on their click patterns. You can predict, as you said, Yuval, in examples from your own work, someone's sexuality before that person might even know their own sexuality. IBM has a piece of technology that can predict whether employees are going to quit their jobs with 95% accuracy, and they can actually intervene ahead of time.
And so I think one of the interesting things is when I know your next move better than you know your next move, and I can get not just underneath your skin, not just underneath your emotions, but underneath the future. I know the future. I know a future that's going to happen before you know it's going to happen. It's like the Oracle in The Matrix saying, "Oh, and by the way, Neo, don't worry about the vase." And he turns around and says, "What vase?" And he knocks the vase over, and she says, "Well, the interesting question is, would you have knocked it over if I hadn't said anything?" She's not only predicting the future; she's vertically integrating into creating that reality, because she knows that that move is available to her.
Yuval Noah Harari: And for the last 200 years or so, human feelings have become the ultimate source of authority in ethics, in politics, in art, in economics. "The customer is always right" is exactly that. And you have these big corporations that, when you push them to the wall and tell them, "You're doing all these terrible things, you're creating, I don't know, SUVs that pollute the environment," the corporation will say, "Well, don't blame us. We are just doing whatever the customers want. If you have a problem, go to the customers, and actually go to the feelings of the customers. We can't tell the customers what to feel."
Tristan Harris: And the same is true with Facebook. If people are clicking on those extremist groups, or going into QAnon, or clicking on hyper-extremist content, they say, "Why are you blaming us? We're just an empty corporation. We're a neutral mirror waiting for people to click on whatever they think is best."
Yuval Noah Harari: Even more than that: "Who are you to tell people what to click on? They are presumably clicking on these things of their own free will. It's because they feel good about it. You're acting like some kind of Big Brother who thinks you understand what's good for them better than they do." Of course it's a manipulation, because we know it doesn't work like that. And we know that humans have been hacked, not only today but also in the past, though especially today. Now, when governments and corporations and other organizations have the power to manipulate human feelings, the whole system has reached an extremely dangerous point. If the ultimate authority in the world is human feeling, but somebody has discovered how to hack and manipulate human feelings, then the whole system collapses.
The very notion of a nation is itself a fictional story. It's not an objective truth. Nations are not biological or physical entities. They are imagined realities, stories that exist only in our own minds. A mountain or a river is an objective physical entity. You can see it. You can bathe in the river. You can listen to the murmur of the waves in the Mississippi. The United States is not a physical reality. You cannot see the United States. You can see the Mississippi River, but that's not the United States. The Mississippi River was there two million years ago; the United States wasn't. The United States might disappear in 200 years or 500 years; the Mississippi River will probably still be there. So it's not a physical entity. It's a story.
Tristan Harris: I think one of the key points in your work is that it's not about telling bigger and more complex truths that unite us. As you said, it's not E equals MC squared; it's actually simple fictions, the ones that tell us we will go to monkey heaven if we... or whatever the different stories we can get ourselves to believe, that cohere us.
Yuval Noah Harari: It's not the truth. You don't need to tell the truth in order to get a lot of people to cooperate. You need a good story. The story could be completely ridiculous, but if enough people believe it, it works.