Algorithms take the wheel: FOMO in the age of personalized content

Image source: Your Charisma


Raise your hand if your conversation with your most recent Tinder date fell short after only a few minutes because you both realized you (and everybody else, for that matter) were consuming the exact same type of content on social media, Netflix and the like. I know I have. To be honest, I didn't think much of it at first, not until one of my friends fessed up to using the 1.5x speed feature on all the content they consumed to squeeze in as many watching/listening hours as possible. It deeply echoed my own former self, who needed to be on top of every pop culture golden child of the moment that the masses had deemed worthy of the top spots on social media trending pages. And that's when it hit me: collective FOMO (Fear of Missing Out) had become a problem.

For the longest time I left it at that, but you see, me binge-watching three shows at the same time for fear of not being able to make small talk in social settings was only made possible in the first place because of algorithmic personalization and the way it is used to lull us into a false sense that our FOMO is being placated when really, it is just being enabled. Bold claims, I know, but hear me out. According to the Cambridge Dictionary's definition, algorithms are neither good nor bad per se.

 
Algorithms take the wheel

However, they can become problematic when built to achieve what is called nudge marketing, a practice of deliberately manipulating the choices presented to the consumer (think of how Google Search results are ranked and exactly why some results are pushed onto the first page over others; it's no coincidence). Because of their supposedly “neutral” nature, most algorithms don't have ethics embedded in their code, which can lead to biased or downright false information being relayed to the user, since the results are simply made to reflect whatever information is available on the internet. So... yeah, technically not bad, but definitely not good either.

These nudge marketing practices aren't exclusive to Google's search engine; they are also heavily used in the algorithms that power social media platforms and streaming services such as Netflix or Spotify. This means that the content recommended through these algorithms (seemingly fitted to our tastes) can also carry a clear bias, because remember, these platforms are run by the CEOs of big tech companies who have their own private agendas. This is how that awesome TV series you thought was recommended to you organically, thanks to your overall great taste, turns out to be whatever white dudes in suits sitting in boardrooms pitched as the content most likely to make them the most money, fuel your lunch convos with your co-workers and trap you in FOMO hell as trend cycles renew themselves ever more quickly.
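To make that mechanism concrete, here is a deliberately toy sketch of how a recommendation score could blend a user's predicted taste with a platform's own priorities. Everything in it (the names, the weights, the formula itself) is hypothetical and invented for illustration; no real platform publishes its ranking logic.

```python
# Hypothetical illustration only: real recommender systems are far more complex,
# and their internal scoring is not public. All names and weights are invented.
from dataclasses import dataclass


@dataclass
class Title:
    name: str
    predicted_affinity: float  # 0..1: how much the model thinks *you* will like it
    promotion_weight: float    # 0..1: how much the platform wants to push it


def ranking_score(title: Title, business_bias: float = 0.4) -> float:
    """Blend predicted taste with the platform's own priorities.

    business_bias = 0.0 would rank purely on your taste;
    anything above 0 nudges promoted titles up the list.
    """
    return (1 - business_bias) * title.predicted_affinity + business_bias * title.promotion_weight


catalog = [
    Title("Niche documentary you'd genuinely love", predicted_affinity=0.9, promotion_weight=0.1),
    Title("Heavily promoted new original", predicted_affinity=0.6, promotion_weight=0.95),
]

for title in sorted(catalog, key=ranking_score, reverse=True):
    print(f"{title.name}: {ranking_score(title):.2f}")
# With business_bias=0.4, the promoted original (0.74) edges out the better personal match (0.58).
```

The point isn't the exact formula: it's that a small, invisible weight on the platform's side of the equation is enough to decide which title lands at the top of your home screen.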

YouTube is no stranger to the practice either, and I'm sure you can think of a couple of recommendations made to you that fall into that category. My personal ‘ah-ha’ moment came in 2017-2018, when YouTuber Emma Chamberlain rose to fame on the platform. For some context, I'm a 26-year-old cis woman based in Paris, and I had a hard time understanding why YouTube seemed to relentlessly recommend this teen (emphasis on teen) and her vlogging adventures in San Francisco (at the time) to me. I clicked on a video, enjoyed myself for 10 minutes, then clicked off and didn't think anything of it. Until, two weeks later, her videos were still being recommended to me and had reached millions of views (and she had gained millions of dedicated subscribers). I was perplexed by this phenomenon, and it seemed I wasn't the only one, as other commentary YouTubers questioned why and how Chamberlain seemed to have been recommended to virtually everyone who possessed a YouTube account. The demographic her videos were recommended to was so broad that her success seemed like it could only have been intentional, and thus was born the term ‘YouTube plant’: the belief that the YouTube algorithm had pushed her videos onto users regardless of whether they had any interest in the genre to begin with. This year, the same phenomenon seemed to happen all over again during the summer with the “van life girls” video category that popped up out of nowhere (with that really sweet girl Jennelle Eliana, who got to 1 million subscribers with all of 3 videos in an age of saturated content and whose channel was in everybody's recommendations when none of us had any interest in the topic in the first place, hello???).

However, these repeated acts of nudging can, in the long run, contribute to a loss of individuality: when everything you're exposed to is the result of a mathematical computation of what is likable for your demographic, it stops genuinely expanding your tastes and, with them, your mind.

Algorithms are found in most of the things we consume these days, especially for entertainment, which can turn out to be detrimental as they rule over our downtime, when our defenses are low and we are most vulnerable to corporate agendas and product placement. Algorithms are shaping human behaviour, and right now the ethics surrounding them are wobbly at best and, more importantly, left unchecked.

It's also just weird to realize that all these platforms are driven by the greed of corporations that keep their own agendas at the forefront of their minds while pretending to have found a way to make our lives easier. When it comes to regulating the internet and holding the big players accountable, things can get tricky and some lines are blurry, but I'm sure we can all agree on a couple of universal principles to teach these algorithms: Nazis are bad, cat and dog videos are cool.

by Muriel Vincent
