TikTok can diagnose users and learn their sexual orientation before they realize it themselves. This is not a matter of surveillance or identity theft: it is simply how the platform's algorithms work, and those algorithms are what made the social network successful. Here is how they work, and whether it is possible to "tame" them.
It's not likes that matter, but watch time
As The Wall Street Journal found out, the algorithm pays more attention to how long a user watches a video than to likes and comments. This is logical: if a person watched a video to the end, something in it hooked them. As a result, the user's feed fills with exactly this kind of content: not what they "liked", but what they watched all the way through. Soon, 93% of the recommendations consist of such videos; the remaining 7% are TikTok ads.
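The ranking signal described above can be illustrated with a toy scoring function. The field names and weights below are purely hypothetical assumptions for illustration; TikTok's real model is not public.

```python
# Toy illustration of completion-weighted ranking.
# Weights are hypothetical, not TikTok's actual model.
def score(video: dict) -> float:
    # watch_fraction: share of the video the user actually watched (0.0-1.0).
    # Completion dominates the score; likes and comments contribute far less.
    return (
        0.80 * video["watch_fraction"]
        + 0.15 * video["liked"]
        + 0.05 * video["commented"]
    )

fully_watched = {"watch_fraction": 1.0, "liked": 0, "commented": 0}
liked_but_skipped = {"watch_fraction": 0.2, "liked": 1, "commented": 1}

# A video watched to the end outranks one that was liked but skipped.
assert score(fully_watched) > score(liked_but_skipped)
```

Under this sketch, a silently finished video beats a video that collected reactions but was abandoned early, which matches the WSJ's observation.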
The main thing is good content
If watch time is the main criterion for TikTok, then the only way to stay in the recommendation feed for long is to create high-quality content that people want to watch.
The WSJ journalists confirmed this guess. They registered more than a hundred accounts in the app, each set up according to the requirements: place of residence, date of birth, interests.
The bots managing these accounts watched thousands of videos without leaving any reactions. Within just two hours, the TikTok algorithm had identified each bot's interests and filled its recommendation feed with matching content.
A rabbit hole of alcohol and anorexia
At first glance, this perceptiveness of the algorithm benefits everyone: users get engaging content, creators get free views. In reality, everyone "suffers".
Content creators have to work "for the algorithm". If TikTok registers that one video was watched to the end, it will promote similar videos first, and the type of content matters more than who created it.
Users, in turn, fall into the so-called rabbit hole: from lighthearted clips, people drift toward serious and sometimes even dangerous ones. A bot whose stated interest was sex sank into BDSM content, while the feed of a "depressed" bot filled with videos exclusively about mental health.
The deeper the bots plunged into the content funnel, the more dangerous their feeds became: videos about suicide, overdose, and anorexia appeared, everything that usually fails moderation.
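The narrowing funnel can be sketched as a simple feedback loop: each fully watched video boosts the chance that similar videos are shown again, so the feed's diversity collapses over time. The topics, boost factor, and update rule below are illustrative assumptions, not a description of TikTok's actual system.

```python
import random

# Hypothetical feedback-loop sketch: the feed starts diverse, but every
# video the user watches to the end boosts its own topic's weight.
random.seed(0)  # make the run reproducible
topics = ["comedy", "pets", "sadness", "sports"]
weights = {t: 1.0 for t in topics}
user_interest = "sadness"  # the only topic this user watches to completion

for _ in range(200):
    shown = random.choices(topics, weights=[weights[t] for t in topics])[0]
    if shown == user_interest:   # watched to the end
        weights[shown] *= 1.1    # that topic gets boosted
    # skipped videos get no boost, so their relative share shrinks

share = weights[user_interest] / sum(weights.values())
print(f"share of '{user_interest}' in feed weights: {share:.0%}")
```

After a few hundred iterations, the single rewarded topic dominates the sampling weights, which is exactly the funnel effect the WSJ bots experienced.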
The only way out of this rabbit hole is to change one's interests dramatically. Alas, real people, unlike bots, are usually incapable of this, and so they fall under the influence of content they never consciously chose.