The AUC calls for social networks to regulate their content like conventional platforms

It will not have escaped anyone who spends even a little time on the internet. On top of the constant stream of fake news of every kind and covert advertising, there is now a new wave of 'influencers' who glorify cryptocurrencies and promise their often very young audiences a life of luxury, achieved almost without lifting a finger. The truth is that the matter is already reaching epidemic proportions. An epidemic on which the Association of Communication Users (AUC) wants to set limits, in order to protect minors from harmful and inappropriate content and also to defend the interests of consumers and users against illicit commercial communications.

Its proposal for putting an end to this anything-goes attitude that seems to pervade the internet, now that the new General Law on Audiovisual Communication is making its way through parliament, is that platforms and social networks such as YouTube, Vimeo, Twitch, Instagram, TikTok, Facebook or Twitter be subject to the same rules as linear television, which has specific regulations on commercial communications and is obliged not only to rate the content it broadcasts by age, but also to broadcast adult content only in certain time slots.

Likewise, the association asks that the figure of the content-generating user be regulated, subject to those same obligations regarding minors and advertising. "You have to bear in mind that their followers, especially among minors and young people, exceed the audiences of many television programmes," the study notes.

“The issue is difficult because two sets of rules have to be reconciled, the Information Society Services Law and the General Law on Audiovisual Communication, but I think almost everyone understands that the goal is for citizens to have the same level of protection regardless of where they choose to access a piece of content. It cannot be that I watch the same content on television and on the internet and in one case I am protected and in the other I am not. From there, the most realistic way of doing it will be found,” explained Alejandro Perales, president of the Association of Communication Users.

To reach its conclusions, the association analysed around 4,000 pieces of audiovisual content, between programmes generated and distributed by the platforms themselves and videos generated by users, in a study that focuses especially on influencers. Regarding minors' unrestricted access to inappropriate content, the report reveals that overall only 1.1% of the content analysed carries some kind of age sign or warning, and that in the case of harmful content only 5.5% carries such warnings. Those signals, the study reveals, are concentrated on video platforms but "are virtually non-existent on social networks." It further highlights that, although these platforms rarely host pornography or extreme violence, minors' access to such content on the internet remains "total."

Regarding advertising, the report finds that advertising and promotional messages were detected in a third of the content analysed, and that these commercial communications are recorded mainly among influencers: in 84.6% of cases they form part of user-generated videos. The association also complains about the advertising saturation to which viewers are subjected. In the case of programmes distributed by the platforms, 37.4% of the content featured four or more advertising breaks per 30 minutes, something that, in addition to heightening the invasive perception of advertising, "undermines the integrity of the content," Perales explained. In the case of social networks, nearly 2,000 pieces of content were analysed in five-minute sessions. On the basis of these sessions, interspersed advertising was detected in 84.6% of the videos, and in 44% of them commercial communications account for between 25% and 50% of the session's content. In terms of advertising and promotional formats, too, platforms and social networks benefit from the lack of regulation compared with the restrictions placed on television. Thus, in 73% of sponsorships there are direct messages encouraging purchase, and in product placements there are no signs or warnings in 100% of cases, again accompanied by direct messages encouraging purchase.

But there is more: it is easy to see, for example, how health products are offered without scientific evidence or authorisation, or how alcoholic drinks appear covertly or are shown being consumed by the hosts and guests of programmes, even high-proof drinks. Tobacco, self-promotion and medicines also have their place on the network of networks. It must be said, however, that since the approval of the Royal Decree implementing the Gaming Law, commercial communications for games and betting have disappeared from non-specialised platforms and social networks, although there is some occasional presence (0.2%).

The last point on which the report dwells at length is commercial communications aimed specifically at minors. Here the association found direct incitement of minors to purchase in 8.9% of advertising messages and highlights "cases of very aggressive advertising." It also looks at product endorsements by influencers "that exploit the trust and credulity of minors" by encouraging them to buy, at minors' access to beauty content that "imposes strict and exclusionary canons of beauty," and at communications for high-fat products. In both cases, television broadcasters are bound by rules that restrict minors' access.

With all this, it is clear that the parental control systems implemented at home do not work particularly well. “They have two problems. Many of them are based on terminology, and terminology is very misleading. What happens is that in some cases they go too far, blocking content that should not be blocked, and in others they allow full access. It happens with pornography: they respond to certain words by blocking, but other, more metaphorical terms pass any filter with ease,” explained Perales. “We believe that what works, in addition to double-verification systems to establish the user's identity and determine whether or not they are a minor, is the rating of content as a step prior to its storage and dissemination, because it allows for a harmonised scale with similar criteria that everyone uses and that enables parental controls to work automatically,” he concluded.