Dangers of social media algorithms
Are the dangers of social media algorithms even real? Do we still have control over our own lives, or do algorithms determine who and what we are? Are social media algorithms really a big threat to our identity?
A while ago I saw the movie “The Social Dilemma”. https://www.thesocialdilemma.com/ Did you see this movie too? If not, watch it! It is a chilling film about how the algorithms of social media take away your free will, and how you become a slave to algorithms that determine for you what you like, what you buy, how you think and much more.
Social media algorithms, artificial intelligence, and our own genetics are among the factors that influence us outside our consciousness. This raises the question: do we still have control over our own lives?
Everyone watches movies on YouTube or Netflix after clicking through from a link on social media!
You know, those messages you get about which movies or documentaries you should watch on YouTube or Netflix. Who doesn’t click on them at least once? Or who hasn’t added a friend on Facebook because Facebook introduced that person as someone you might know? Have you ever thought about how Twitter decides which tweets go to the top of your feed?
Social media is driven by profiling algorithms
Social media is driven by profiling algorithms, built from our history of likes, clicks and interests. We are shown the posts that the algorithm is confident we will act on. If we take that action, the algorithm has succeeded, and this is how these companies earn their billions. The better they profile us, the more money they make.
Every minute of every day, we are victims of the immense manipulation of dangerous social media algorithms. These social media algorithms are a danger to the development of our own identity. Google is also guilty of this, and think of the profiling that certain algorithms perform for the police, for lawsuits, for the tax authorities, you name it! Tax offices deciding, on the basis of algorithms, that people have committed fraud: it is all happening! Following these algorithms has caused thousands of people a lot of suffering and misery.
When we make decisions based on profiling, are we making them freely?
When we make decisions based on profiling and the resulting algorithms, what does that mean for our right and ability as human beings to make decisions freely? We need to realize that we are victims of profiling and indoctrination every minute of every day! We no longer consist of our own information and thoughts; instead, algorithms determine our behavior, thoughts, feelings and thus our selves! The development of our own identity is endangered by social media algorithms. Opinions are forced upon us with great force and manipulated to the extreme. We are far too little aware of this, and the thought is terrifying!
Cambridge Analytica, the company involved in the biggest Facebook data leak, claimed it was able to profile your psychology based on your likes. These profiles are worth their weight in gold, and companies are buying them up. Think of elections: anything that involves profit in any form can be a reason to use profiling algorithms.
Although this data breach had a major impact on Facebook, that impact was only temporary. Many people deleted their Facebook profiles, and ad sales decreased drastically. But Facebook undoubtedly used all kinds of proprietary algorithms to make us forget the big scandal quickly, because Facebook’s ad sales are now more profitable than ever. Facebook’s power is greater than ever before!
Cookies are small pieces of data that follow us around websites
“Cookies” are small pieces of data that follow us around websites. They are records of actions taken online (such as the links we click and the pages we visit), which are stored in the browser. Combining cookies with data from multiple sources is called “data enrichment”. It can link our personal data, such as email addresses, with other information, such as our level of education.
This data is regularly used by tech companies like Amazon, Facebook, and others to build profiles of us and predict our future behaviour.
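To make “data enrichment” concrete, here is a toy sketch in Python of how a cookie-based browsing log could be joined with personal data from another source. All field names and records here are invented for illustration; real data brokers operate at vastly larger scale with far more data sources.

```python
# Hypothetical illustration of "data enrichment": linking cookie-based
# browsing records with personal data from another source.
# All identifiers and records below are invented.

browsing_log = [  # what a tracking cookie might record
    {"cookie_id": "abc123", "page": "/shoes", "clicked_ad": True},
    {"cookie_id": "abc123", "page": "/loans", "clicked_ad": False},
]

identity_data = {  # data bought from a third party, keyed by the same id
    "abc123": {"email": "jane@example.com", "education": "university"},
}

# Enrichment: join the two sources on the shared identifier,
# so every anonymous page visit now carries a real-world identity.
enriched = [
    {**visit, **identity_data.get(visit["cookie_id"], {})}
    for visit in browsing_log
]

print(enriched[0]["email"])  # the browsing record is no longer anonymous
```

The point of the sketch is only that a single shared key (here a cookie id) is enough to fuse otherwise separate datasets into one profile.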
So we as humans are nothing more than predicted behaviour, how scary is that!
How much of your behavior is predicted by algorithms based on your data? Using data from Twitter, researchers investigated how predictable people’s tweets are, using only the data of their friends. This research showed that data from eight or nine friends was sufficient to predict someone’s tweets with well over 50% accuracy. A machine learning algorithm, again using only data from your friends, can make the same prediction with 95% accuracy. Shocking, isn’t it? Research has shown that even if you delete your social media, you are still a victim of profiling because of the social ties that remain online.
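The idea that your friends’ data alone can say something about you can be shown with a deliberately simple sketch. The actual research used sophisticated statistical models; this toy version, with entirely invented data, just guesses a person’s most likely tweet topic from the topics their friends tweet about:

```python
from collections import Counter

# Toy illustration: predict a user's likely topic using ONLY friends' data.
# All friends and topics are invented; real studies model full tweet text.

friends_tweets = {
    "friend_a": ["politics", "sports", "politics"],
    "friend_b": ["politics", "music"],
    "friend_c": ["sports", "politics"],
}

# Pool every topic the friends tweeted about...
all_topics = [topic for tweets in friends_tweets.values() for topic in tweets]

# ...and predict the user's next topic as the most common one among friends.
prediction = Counter(all_topics).most_common(1)[0][0]
print(prediction)  # → politics
```

Crucially, the user themselves contributed no data at all to this prediction, which is exactly why deleting your own account does not make you unpredictable.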
What exactly is an algorithm?
Here’s a little summary of the facts described above, so that this really sinks in! An algorithm is a kind of digital recipe: a list of rules to achieve an outcome, using a set of ingredients. With the results of such an algorithm, tech companies can make massive amounts of money. An algorithm profiles us and shows us all kinds of products and news feeds that, according to that profiling, interest us the most. Because we see this again and again, these posts condition us to buy things or to keep on scrolling, so that we see more and more advertisements.
The more ads we see, the more we unconsciously click and buy, and the richer these tech companies become. As a result, we think for ourselves less and less! So there is a danger that our identity is no longer based on our own development and determination, but on social media algorithms.
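To make the “digital recipe” idea concrete, here is a toy profiling algorithm in Python. Everything in it (the tags, the weights, the scoring rule) is invented for illustration; real platforms use far more complex, proprietary models, but the principle is the same: score content against a profile built from your past behaviour, then show you the top matches.

```python
# Toy sketch of a profiling/recommendation algorithm.
# All names and weights are invented; no real platform works this simply.

def score_post(post_tags, user_profile):
    """Score a post by how well its tags match the user's interest weights."""
    return sum(user_profile.get(tag, 0) for tag in post_tags)

def recommend(posts, user_profile, top_n=3):
    """Return the posts the user is predicted to be most likely to act on."""
    return sorted(posts,
                  key=lambda p: score_post(p["tags"], user_profile),
                  reverse=True)[:top_n]

# A profile built from past likes and clicks (tag -> interest weight).
profile = {"sneakers": 5, "fitness": 3, "politics": 1}

posts = [
    {"id": 1, "tags": ["sneakers", "fitness"]},  # score 8
    {"id": 2, "tags": ["politics"]},             # score 1
    {"id": 3, "tags": ["cooking"]},              # score 0
]

feed = recommend(posts, profile)
print([p["id"] for p in feed])  # → [1, 2, 3]
```

Every click on the resulting feed updates the profile weights, which sharpens the next round of recommendations: that feedback loop is where the conditioning described above comes from.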
The New York Times podcast Rabbit Hole investigated how algorithms can lead us to radicalism. On YouTube, the recommendations the algorithm gives viewers can drive them to increasingly extreme content, leading to online radicalisation. Algorithms influence us to a great extent, even when we are completely unaware of it.
Social media is also guilty of emotional contagion
Research has also shown that Facebook’s “News Feed” algorithm ranks content in order to keep us scrolling for hours on end. This produces a phenomenon called “emotional contagion”: seeing positive posts prompts us to write positive posts ourselves, while seeing negative posts prompts us to write negative ones. In other words, the algorithms written by these tech companies do influence our emotions. In this way, we more or less turn into slaves of a society driven by artificial intelligence. There are scholars who see AI as the downfall of human society. I see the signs already, do you?
Dark patterns designed to entice us to share and spend more than we want to
So-called “dark patterns” are also designed to trick us into sharing and spending more on websites like Amazon. These are tricks played by website developers, such as hiding the unsubscribe button, or showing how many people are buying the product you are looking at.
A few years ago, the website booking.com was fined heavily by the EU for this. The site claimed that, at the very hotel you were looking at, a number of people were viewing that exact room at that moment. This is seen as negative emotional influence: you develop a fear that the room will be taken at that good price, and you book faster.
The dating site “Meetic” is another striking negative example of dark patterns. Once you have bought a subscription, it turns out to be almost impossible to get rid of it. The many steps you have to take to do so are so extensive and confusing that people quickly give up. Contacting customer service about it often doesn’t happen, out of fear of feeling stupid. All of this is a serious form of customer deception: dark patterns unconsciously steer you toward the actions the site wants you to take.
Shadow profiles built from non-users
We also discovered the possibility of creating “shadow profiles”. These are profiles of non-users, built from the contact lists their friends upload to certain platforms. Even if you have never used Facebook but your friends do, a shadow profile may well be built up about you as a non-user.
On platforms such as Facebook and Twitter, privacy is no longer linked to the individual, but to the network as a whole. Shocking, isn’t it?
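A shadow profile can be pictured with a small invented sketch: every time a friend uploads their contact list, any mention of the same phone number is merged into one growing profile of a person who never signed up. All names, numbers, and tags below are made up for illustration:

```python
# Toy sketch of assembling a "shadow profile" for a non-user,
# purely from contact data their friends uploaded. All data is invented.

uploaded_contacts = [
    {"owner": "user_1", "name": "Alex", "phone": "555-0101", "tag": "gym"},
    {"owner": "user_2", "name": "Alex", "phone": "555-0101", "tag": "vegan"},
]

# Merge every mention of the same phone number into one profile.
shadow = {}
for entry in uploaded_contacts:
    key = entry["phone"]
    profile = shadow.setdefault(key, {"name": entry["name"], "tags": set()})
    profile["tags"].add(entry["tag"])

print(shadow["555-0101"]["tags"])  # traits of a person who never signed up
```

Two uploads from two different friends are already enough here to attach multiple traits to someone who has no account, which is why privacy ends up belonging to the network rather than the individual.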
Social media can be a threat to global democracy
As we have seen in several elections around the world, tech companies influenced social media algorithms to such an extent that preferred candidates won the elections. Social media channels are designed in such a way that, if they fall into the wrong hands, they can be used to destabilise a country or change the outcome of an election. They can incite hatred and crimes against groups, countries, faiths and so on. We saw the power of social media, for example, in the storming of the Capitol in the final days of President Trump’s administration.
However, the question now is whether it was right for Twitter and Facebook to close President Trump’s accounts because of the messages he posted. After all, it is a human right to express ourselves as we wish, and it is everyone’s own responsibility how and when to react to posts.
Read our blog on human rights education again: https://xcodexfoundation.nl/blog/why-human-rights-education-is-so-important/
X-Codex fights for freedom of speech, but with morality and respect. Right now, if you write something negative on a blog about vaccinations and post it on social media, it won’t get anywhere in the news feeds. These companies are thereby appropriating the power to determine what we may or may not express, and that is against our human rights. This can never and must never be allowed! On the other hand, there are also many abusive people who exploit the gullibility of mankind with all kinds of negative messages. So where do you draw the line? Should we or should we not be protected in this?
Should we let stinking rich tech companies dictate to us what we can and cannot write?
No, we shouldn’t. This is a form of oppression of the highest order! This is also the reason why we at X-Codex will make minimal use of mainstream social media platforms, let alone use predictive, indoctrinating algorithms ourselves.
Just Think! Think for yourself!