TheBorderLine: the gray territory of MrBeast challenges where YouTube & co struggle

Extreme, grotesque, more or less dangerous content. Challenges whose lowest common denominator is money or other prizes, which become the central ingredient in convincing people to do (or not do) a certain thing: to compete in an action that is not necessarily dangerous but perhaps seemingly impossible or uncomfortable, or to endure a surreal or, precisely, nonsensical situation, in ways that never make clear to viewers whether and how it has all been staged. And then there is content in which the creator gets involved in person, as MrBeast has done for years: 25-year-old Jimmy Donaldson from Wichita, Kansas, the genre's inspirer and the most-followed personality on YouTube, with 160 million subscribers to his channel.

TheBorderLine's videos are still online: on YouTube, where they are monetized, and on TikTok (where the platform has not been able to tell us whether the group participates in the creator fund and therefore receives payments), while on Instagram the profile, which has almost 90,000 followers, has been made private.

YouTube's monetization tools are active: by clicking the Super Thanks button at the bottom right under each video, for example, viewers can choose to pay the creators anywhere from 2 to 500 euros. The channel membership, from 0.99 to 99.99 euros a month, which gives access to stickers and badges reserved for paying subscribers, also appears to be operational.

YouTube and TikTok explain that reviews are underway of the most controversial videos published by the collective of very young Romans involved in the crash that caused the death of a 5-year-old boy in Rome on June 14. The challenge, in that case, was to spend "50 hours in a Lamborghini" (rented for the occasion), as the group had already done in the past with, for example, a Tesla.

On the other hand, the community guidelines of both YouTube and TikTok struggle to be applied to this category of content, which always sits on the borderline between farce and reality, between improper or simulated practices and real or presumed danger.

Consider some of the most controversial video titles: "How long can you last in the ice?", "I survive 24h in the forest", "Every 24h in isolation you earn €100". And again: "I survive in the middle of the black desert (we risked)", "I survive on a deserted beach (dangerous)", "The last one to fall wins €500".

TheBorderLine are not the first and will not be the last to produce content like this, clumsily modeled on some of the most successful and most-clicked videos by MrBeast or by people like Trevor Jacob, who faces twenty years in prison for deliberately crashing an airplane.

And for some experts, questions need to be asked: "Are the tragic consequences of an extreme gesture committed for a handful of likes really only the responsibility of the person who commits it, or is it time to ask whether there is guilt by instigation on the part of those who design business models based on monetizing content by the number of likes and views per unit of time?" asks lawyer Andrea Monti here on Italian Tech. "And if the time has come to attack an unscrupulous use of the economy of likes, will it not also be time to ask, on the other hand, whether platforms should be allowed to autonomously block content or even predict dangerous user behavior? Translated, setting aside all hypocrisy: are we willing to sacrifice privacy and freedom of expression – that is, individual rights – to allow the protection of the community, accepting that private entities take care of it directly?"



YouTube policies

The problem is that even the "regulations" of private entities, as we said, struggle to intervene, especially in cases where the tenor, style and type of content are not clearly identifiable or classifiable; where videos move in that gray territory between provocation, sarcasm, jokes, staging, more or less real or simulated challenges – is whoever appears paid or not, are they always consenting, do they understand what they are getting into? – and deliberately exaggerated tones.

YouTube's rules, for example, are very complex and touch on various areas, from spam to sensitive content, regulated products, disinformation and, in particular, violent or dangerous content: "Hate speech, predatory behavior, explicit scenes of violence, malicious attacks, and content that promotes harmful or dangerous behavior are not allowed on YouTube." But to what extent can automated checks first, and human moderation later, intervene on content that dances right on the border between real danger and fiction, between the risk of emulation and blatantly parodic or comical provocation? Indeed, even where the creators themselves flag what they have done as "dangerous", the video remains plainly visible.




TikTok guidelines

TikTok's guidelines are even more complex. For the type of content on which TheBorderLine built (indeed, still build) their following, there are several relevant sections: safety and well-being of minors, mental and behavioral health, and sensitive topics suitable for a mature audience. A paragraph of the second section is dedicated to "dangerous activities and challenges", where we read that "most activities or challenges are suitable for everyone and are intended to bring people together, but some may involve the risk of serious injury. We do not allow dangerous activities and challenges to be shown or promoted. This includes challenges, games, tricks, the inappropriate use of dangerous tools, the ingestion of substances harmful to health, or similar activities that can cause significant bodily harm." By these criteria, however, a good number of the clips posted by TheBorderLine should not be visible on TikTok. In the meantime, their followers keep increasing there too: on the morning of June 16, at 10 a.m., they numbered 267,200; a couple of hours later, exactly a thousand more. The same has happened to the now-private profiles of some of the young people involved.

TikTok adds that "content is age-restricted if it shows activities that can be imitated and may lead to physical harm". To verify this, one would need to know whether some of their videos, like similar videos from others, are actually hidden from accounts formally linked to minors. Not only that: the Chinese platform adds that "content is not eligible for the For You feed if it involves activities that can lead to moderate physical harm or if it features professionals engaging in extreme sports and stunts that could endanger other people." Yet it is hard not to think that a large part of that traffic – when things go badly the videos stop at 30,000-40,000 views, when they go well they take off to several hundred thousand – is also the fruit of the push of the platform's formidable algorithm.

"From direct manipulation, the next step – and it is the one that characterizes the like economy more than others – is precisely that of building a system that, upstream, induces people to behave 'freely' in a way that maximizes the effects of what they share," Monti adds. "We certainly cannot say that platform xyz is directly responsible because user tiziocaio35 fell into an alcohol-induced coma after drinking a disproportionate quantity of spirits. However, applying with a little courage an old civil-code provision on the obligation to 'do no harm' and the criminal-code criteria on complicity in a crime, we could arrive at some evolutionary solution that also involves those who profit from a system designed to intoxicate its users."
