Is There A Problem?


Social Media – A social dilemma

Recently, on a friend’s recommendation, I watched the documentary ‘The Social Dilemma’. I admit that while I have always struggled with the ethical aspects of computer programs and artificial intelligence-based models, it was this documentary that made me understand the actual gravity of the issue. When inventors tell you that their creation is flawed, or is being weaponized in ways most people cannot even conceive of, then there definitely is a problem.

Image by Wikipedia

I have always believed that social media has a huge impact on one’s mental health. Anyone would be affected by all the perfect, ‘real’ lives flaunted left, right, and center on social media. But I never quite understood how the tools – the social media platforms themselves – are part of the problem, beyond being a medium to pass on information. To anyone trying to understand this in depth, I recommend watching this Netflix documentary.

Humans being the real products

The problem is not that these tools exist; it is what their business models have shaped them to be. The goal of these models is to get your attention, so they are just doing their job. They do what they are told to do, and in the process they have made some technology companies extremely wealthy and influential. But where does all the money come from? Advertisers are the major customers of these companies. And why do they pay these huge amounts to the social media houses? Because of the product they get. And what is that product? Humans. More specifically, our attention.

Grabbing someone’s attention is a hard task. Advertisers need that attention to get people interested in their products and to get them to buy. The social media houses, with their highly precise and well-trained models, deliver exactly that attention.

If you are not paying for the product then you are the product. Our attention is the product, advertisers are the customers.

– The Social Dilemma (Netflix documentary)

Selling your attention

With all the technological advancement happening daily, grabbing someone’s attention is not as difficult a task as one would think. Consider your YouTube or Netflix recommendations: remember the last time you went down that rabbit hole? These models are designed to build rabbit holes for you and serve up the best-performing one. They are personalized to such an extent that you feel empowered. In reality, it is your behavior that these models can easily interpret.

What is even scarier? These models can also modify your behavior. It happens so subtly that you won’t even know it happened to you. This is known as persuasive technology – using technology to change human behavior. It is one of the reasons conspiracy theories and fake news spread like wildfire on social media. Based on your previous search history, the models can predict what content will make you stop and engage.

Image by mohamed Hassan from Pixabay

The amount of disinformation one consumes on such platforms is scary. The violence against the Rohingya is considered to have been fueled by a few select groups on Facebook. The suspected Russian involvement in the US election that Trump won is likewise thought to have been channeled through Facebook. There was no hacking; there was just a spread of disinformation.

These conspiracy theories, fake news, and disinformation all force you to question your identity, your understanding of things, your self-worth. While we might laugh today at people who believe in a flat Earth, it may well be that we are being fed a conspiracy of the same magnitude and are not even aware of it. The COVID-19 pandemic was one such situation: some people were convinced it was just government propaganda and not actually a pandemic, while others saw the response as a direct threat to democracy and freedom.

Who is to blame?

While creating these tools, the engineers and scientists were not aware of how their inventions would be used as weapons of instigation, challenging the human psyche at its most basic level. The machines cannot tell what is truth and what is conspiracy unless we can collectively come to an understanding of the truth. But with all the disinformation and person-specific news we consume, it is hard to arrive at that common ground. It is a vicious cycle that needs to be broken.

Image by Thomas Ulrich from Pixabay

It is an existential crisis not because you liked that photo, but because the lack of likes left you depressed. It is existential not because you read a weird conspiracy online, but because a bunch of people took it as the truth of life and went on a violent spree on the street next to yours. It is happening every minute of every day. It is an addiction, a drug, and it alters the way you think entirely.

What can be done?

There were no evil intentions, but there was a lack of regulation that let the problem grow to this extent. This failure of imagination – that the inventions could be used in such a manner – is why we need regulations: dedicated groups who think through the ethical and moral aspects of technologies that can alter a person’s most basic beliefs and views. The interests of users need to be prioritized over those of the companies. We need to design the products, the tools, to be more humane.

It is not my intention to make you feel scared. But being wary is important. Being aware is crucial. Talking about it is a necessity. When you talk about it, you educate others and you also learn from others. Admitting that there is a problem is the first step towards curing it.

  1. Samantha Goodwill

    A very well scripted film and an equally well-worded article. Makes you wonder if this is how the machines will take over. We give them the power of information; they already have better computational and decision-making skills. Is it only a matter of time before The Matrix becomes a reality?

  2. Ajay

    > The problem is not that these tools exist, it is the way their business models have asked them to be.

    No AI-based model can work without data, so data is money for all these social media platforms. The data comes from FOMO and the imposter syndrome that this FOMO creates within us. The second layer of this problem is the ignorance of users: 90%–95% of users don’t know about the privacy settings in the apps they use. This allows these apps/platforms to harvest user data and make their models smarter by learning new things about the user.

    > There were no evil intentions, but there was a lack of regulations that led to a problem of such an extent.

    Regulations or no regulations (I assume you are referring to government-created regulations here), AI will always beat those intentions, so the human species needs self-regulation and education. Only that can make AI models behave constructively. If you are not aware of these self-regulation practices, search for “Digital Detoxification”.

    Quoting from @samantha’s comment:
    > …makes you wonder if this is how the machines will take over… […] … Is it a matter of time before The Matrix becomes a reality?

    Remember my belief in the Terminator trilogy, and you trying to dissuade me from it?

    1. Anubhooti Jain
        FOMO is a huge concern, but where privacy is concerned, even with privacy settings there are certain loose ends in data collection policies. Organizations don’t necessarily collect this data with the intention of selling it, but rather to improve their models by feeding in more data. Information like your gender or the videos you have viewed alone can tell a lot about what you like and what will keep you engaged – the primary goal of these models.
        By regulations, I mean regulating the people creating these AI models. AI models are intelligent and precise in certain aspects but useless in others. A model only cares about, only knows about, what it is programmed to do. Its goal is to push content that keeps you engaged, and it does its job pretty well. So it is not the AI per se but the engineers and organizations behind these models who need to be regulated – whether through government-set laws or self-regulation. Taking responsibility is what matters.
        I still believe we are far from Terminator. We have the wrong way of visualizing AI. There aren’t any life-like robots or terminators coming; instead, AI is being used to target the way we think and consume knowledge, which is scarier, because AI is being used to pit humans against humans. Civil war is very much a possibility.