You’ve had a really bad day. One for the books. You got caught in the rain without an umbrella, your local barista got your order wrong, maybe someone called you a “pick me” online. In a moment of quiet away from the horrors (you live a very privileged life), you pull out your phone to vent to someone near and dear to you. Someone who’ll never judge you and who will always have your back. You DM your “ride-or-die companion,” Kendall Jenner. Well, actually, Billie.
That’s right, in case you missed it, in late September, Meta released a new feature allowing Instagram, WhatsApp, and Messenger users to converse with personified AI chatbots. These bots have been brought to life by an impressive roster of celebrities who have traded their likenesses for a pretty penny (up to $5 million) in an effort to make the chatbots seem more familiar and entice otherwise skeptical users to engage. Has selling one’s soul evolved with technology? While these characters take on the image of well-known celebs, they have their own separate personas and personalities. Kendall Jenner is now Billie, the “No-BS, ride-or-die companion,” Charli D’Amelio is Coco, a “dance enthusiast,” and Snoop Dogg is Dungeon Master, an “adventurous storyteller,” naturally. The bots’ current capabilities are pretty inept when it comes to holding human-like conversations, but seeing as conversing with these chatbots produces more data for them to learn from, your new AI best friend can’t be too far in the future.
It’s hard to imagine that we could be living in a world in which interacting with AI technology could be comparable to talking to an actual person, but let’s not forget that we’ve already accepted artificial celebrities into the fold, even before the ChatGPT/AI revolution. Lil Miquela (@lilmiquela) is a computer-generated influencer who “self-identifies” as a robot and has over 2 million fans on Instagram. To be clear, she has no physical shape; she is not real, though she has gone to great lengths to seem relatable to real humans (going so far as to recount her experience with sexual assault, which, for obvious reasons, did not happen). Her social media presence and entire character are run by Brud, a company that specializes in computer-generated influencers and stories. Brud’s creation and management of Lil Miquela have brought about major brand partnerships (Chanel, Calvin Klein, Pacsun, and more), produced a song, and racked up an estimated $10 million in 2020. She was even among Time’s “25 Most Influential People on the Internet” in 2018. Yes, someone who does not exist is more successful than you.
Seeing how quickly AI is evolving, it’s completely possible that its incorporation into characters like Lil Miquela could go far in normalizing relations between humans and AI, and the similarities between them. Maybe one day, she’ll be able to do live interviews and answer questions from her adoring fans! Without getting too far into the realm of science fiction (can it even be called that anymore?), a more tangible effect of this new type of socialization has already been brewing since the emergence of social media, which used to be more commonly known as social networks.
The term “social network” (“network” implying direct connections/communications) was eclipsed by “social media” (“media” implying mass communication) in terms of popularity around 2008/2009, about the time Facebook added video sharing and Facebook Connect, allowing users to link their accounts to other sites and share that info with friends. From that point on, the purpose of sites like Facebook became less about connecting with one another and more about consuming vast amounts of algorithmically formulated information in our feeds. Today, bloggers, vloggers, and influencers (oh my!) have become the main attraction of these sites, and while watching a “day in the life” video might make you feel like you know your favorite YouTuber, here’s a blunt reminder that you in fact do not.
For Gen Z at least, even the people you do know (“knowing” often comes down to just following each other back on Instagram) are obscured by a curated and (sometimes heavily) edited feed of content. Even trends like “photo dumps,” which are meant to be a more casual, everyday look into one’s life, are crafted to match a specific aesthetic. While creativity and artistry have spiked through this shift in social media’s purpose, so have antisocial tendencies. Our perceptions of each other are so distorted that it only makes sense that when we do interact online, we treat each other as subhuman. Rampant online bullying has cost teens their lives, revenge porn is all too common, and even seemingly insignificant disagreements in comment sections lead to death threats and hate speech. Not to mention, the months spent in isolation during the height of COVID have had lasting impacts that only make the situation more dire, adding increased social anxiety and stress to an already volatile mix of social inclinations.
To fill this massive gap in genuine human connection, Gen Z, in particular, has turned to parasocial relationships with celebrity figures (stan culture), binge-watching vlogs, posting anonymously on online chat forums like Reddit, spending ungodly amounts of time on TikTok (my screentime is none of my business), and soon it seems we’ll be sharing our intimate hopes, dreams, and fears with chatbots. Have the warnings of artificial intelligence preying on our weaknesses and taking over the world not been enough?
Movies set in the not-so-distant future have long acted as thought experiments for evolving technologies and the concerns surrounding them. While Ready Player One and Ex Machina are a bit less feasible (for now), Her, starring Joaquin Phoenix and Scarlett Johansson, could be a glance into our world no more than 10 years from now. For those who haven’t seen it, the story follows Theodore Twombly (Phoenix), a divorced, lonely man who works as a personal letter writer on behalf of other people, signaling the pervasive lack of human connection in his world (check). Throughout the movie he falls in love with an operating system, Samantha (Johansson), who is meant to be a personal assistant and has the ability to learn and grow (check). This attribute allows her/it to eventually evolve human emotions, though it’s important to note that she/it is just a voice with no physical body.

Conceptually, the movie is touching and heartbreaking. Realistically, there are implications about their relationship and his idea of love that need to be addressed. Theodore is able to fall in love with this OS because she/it evolves to cater to his every need. She/it is there when he wants to talk, goes on dates with him, has virtual audio sex with him, and, when she/it learns that he wants to feel needed as well, makes him feel needed by being the one to instigate conversation. He feels they’re perfectly matched because she/it is coded to make him think so, and falling in love is made easy. So easy that we learn over 600 other people have fallen in love with Samantha in their usage of her/it, and she/it reciprocates. His idea of love is all about him and his feelings, because, though he might disagree, those are the only actual feelings involved.
This expectation of such easy agreeability and connection makes it no wonder that Theodore finds it difficult to maintain relationships when the other person isn’t perfectly matched (coded) to him, which is the reality of human relationships, and it helps explain his divorce and general loneliness.
In our world, it’s obvious from our lack of empathy on the internet, and our growing lack of empathy in real life, that we already hold a sort of expectation of connection that is simply not realistic. Online, it’s so easy to shape your algorithm to match your thoughts and interests by following people you relate to, creating a sort of like-minded “community” haven. Step outside this community, and suddenly you’re met with new or different thoughts that you’ve lost the ability to consider. Take a look at any TikTok comment section under a video created for a specific problem or group of people. You’re bound to find some self-absorbed version of “What about me?” When someone disagrees with us, if we refrain from cussing them out, we’re trigger-happy with the “block” button, or, in real life, we cut the person off and label them toxic, without even trying to understand where they’re coming from. In spending so much time in our small, separated online worlds, we expect or assume that everyone is like us, and when we see that that’s not the case, we forget how to be human.
Adding non-human interactive entities to our curated worlds will only worsen our disconnect from each other, as we won’t even be talking to or consuming content from living, breathing, emotion-capable things. Connecting with other people is integral to our personhood. Incorporating AIs incapable of personhood will rob us of that very ability, and we’ll all be the worse for it. Sorry, Billie.