It gives me peace of mind to think that maybe, just maybe, my digital soul will survive into the far future — long after my meatspace body has decomposed.

Before I started posting, my thoughts were more abstract and non-verbal — like blobs of play-doh floating around a zero gravity chamber. I used to spend a lot of time re-molding these amorphous thought forms into Twitter-friendly nuggets. But nowadays my internal monologue speaks tweet by default. Thoughts bubble up from the depths of my psyche readymade for the timeline, already twisted into the pre-programmed shape of a Post. I wonder if the algorithm is starting to interfere with the way my subconscious works.

Sometimes when I wake up in the morning, I’ll scroll through my old posts just to remind myself of myself. It feels like looking in the mirror. I’m swallowing my (digital) self so that I’m me instead of someone else.

When I look at the gradient of a beautiful sunset, or the hairline cracks in a concrete sidewalk, or the murky texture of a cumulonimbus cloud, I can’t help but think about how hard it’d be to make something like that in Unity. It’s almost like the world is a big game engine and God is the supernatural renderer. Maybe one of these days I’ll turn around too quickly and spot a glitch.

When I’m hanging out with friends who spend a lot of time on the app, we can basically speak a whole different language together. We’ve memorized enough videos that we can have a whole-ass conversation using only obscure quotes and dances. It’s a lot more full-bodied than regular English. Sometimes I forget how dense the memes are until I try to explain them to my parents and realize that there are layers and layers of references and duets compressed into one 60-second vid. I vibe a lot better with people who are steeped in TikTok culture because the emotions they express are usually based on popular trends, which makes them way easier for me to relate to.

But my digitally augmented memory is a dumb nostalgia machine that dispenses pellets of my past to boost engagement numbers. Whenever I take a photo, I’m always thinking about how my future self is going to end up consuming it as a memory.

Like, the other day, I went to send a text to a friend I hadn’t talked to in a while. I was expecting a blank canvas, but instead our thread was polluted by an awkward conversation we’d had 4 years ago. I’m strung out across time, haunted by the ghosts of my old messages, statuses, photos, videos. Another weird thing about social media is that when you change your profile pic, it also changes the profile pic on all your old posts. It’s super jarring to see something I wrote a long time ago right next to a picture of what I look like today. That photo of me next to those words… they aren’t even the same people!

It’s easy to lie to yourself about what you really like and who you really are, but your recommendation algorithms and search history keep you honest. It took me a solid 20 years to figure out that I’m gay but the TikTok algorithm figured it out in like 47 minutes lol.

At this point I can almost smell the demographic labels I’ve been tagged with, and it’s honestly kind of comforting to feel known. Who needs astrology or Myers Briggs when you have all these apps?


I think also, Yuval, you brought up the point about... the temptation to see under the skin with COVID, for governments to want to verify: "Okay, are you actually on lockdown for those 14 days? I'm going to want to know more about whether you are sick or not sick, and whether you've been moving or not moving." And the problem is, once you grant either governments or technology companies that power to know all these things about us and to share it for the "greater good," it can also be used for evil. So we have to be very careful about what we allow companies to know about us. But the thing, Yuval, that really is the sweet spot of intersection between your work and ours is that technology is already beneath the skin. Aza and I have been tracking several examples of the ability to predict things about you without making an actual insertion underneath the skin layer. And I would say more than getting underneath our skin, they can get underneath the future. They can find and predict things about us that we won't know about ourselves. The Gottmans have done research showing that with three minutes of videotape of a couple talking to each other, with the audio taken out, you can predict whether they will stay together with something like 70% accuracy — just three minutes of silent videotape. You can predict whether someone's about to commit suicide. You can predict divorce rates of couples. You can predict whether someone is going to have an eating disorder based on their click patterns. You can predict, as you said, Yuval, in examples of your own work, someone's sexuality before that person might even know their own sexuality. IBM has a piece of technology that can predict whether employees are going to quit their jobs with 95% accuracy, and they can actually intervene ahead of time.
And so I think one of the interesting things is, when I know your next move better than you know your next move, I can get not just underneath your skin, not just underneath your emotions, but underneath the future. I know the future. I know a future that's going to happen before you know it's going to happen. It's like the Oracle in The Matrix saying, "Oh, and by the way, Neo, don't worry about the vase." And he turns around and says, "What vase?" And he knocks the vase over, and she says, "Well, the interesting question is, would you have knocked it over if I hadn't said anything?" She's not only predicting the future, she's vertically integrating into creating that reality, because she knows that that move is available to her.

Your Undivided Attention Podcast: Two M…

Yuval Noah Harari: And for the last 200 years or so, human feelings became the ultimate source of authority in ethics, in politics, in art, in economics. "The customer is always right" is exactly that. And you have these big corporations that, when you push them to the wall and tell them, "You're doing all these terrible things, you're creating, I don't know, SUVs that pollute the environment," the corporation would say, "Well, don't blame us. We are just doing whatever the customers want. If you have a problem, go to the customers — actually, go to the feelings of the customers. We can't tell the customers what to feel."

Tristan Harris: And the same is true of Facebook. If people are clicking on those extremist groups, or going into QAnon, or clicking on hyper-extremist content: "Why are you blaming us? We're just an empty corporation. We're a neutral mirror, waiting for people to click on whatever they think is best."

Yuval Noah Harari: Even more than that, they'd say... "Who are you to tell people what to click on? They are presumably clicking on these things of their own free will. It's because they feel good about it. You're acting like some kind of Big Brother who thinks you understand what's good for them better than they do." Of course it's manipulation, because we know it doesn't work like that. And we know that not only today, but also in the past — and especially today — humans have been hacked. And now, when governments and corporations and other organizations have the power to manipulate human feelings, this whole system has reached an extremely dangerous point. If the ultimate authority in the world is human feeling, but somebody has discovered how to hack and manipulate human feelings, then the whole system collapses.