“We have never had so much access to knowledge, and never has science been so discredited.”

“There is a certain illusion people have, even during debates I participate in on this topic,” remarks the author of the book Algoritmocracia, now heading into its second edition, “of saying: no, I don’t only see politicians I like; it’s not true that I’m being encapsulated in a bubble, since I constantly see content from the other political side.”

But is this content from the other political side sensible and reasonable, or absolute nonsense? That is the question posed by the Public Law partner at Pérez-Llorca.

“Being in a bubble doesn’t mean only listening to politicians we like or from our political side. It also means hearing the more caricatured side (…), the more radical side of the opposing political spectrum, to the point where we begin forming an image of the other side as almost illegitimate, because they speak such nonsense that it’s impossible any normal person could support such an idea,” he notes.

Adolfo Mesquita Nunes explains that “people often don’t realize that the bubble isn’t about always seeing homogeneous content.”

The bubble “is about grouping clusters of people who are resentful, who are afraid or who share certain kinds of feelings about the same topics,” says the former Secretary of State.

This is where AI comes into play, “because algorithms, with the knowledge they acquire from us” (more time spent online generates more data) make “millions of correlations.”

“They make statistical correlations and say: if this person likes watching videos about healthy eating, they might also like to watch videos about the harmful effects of certain chemicals; and if they like to watch videos about those harmful effects, they might be inclined to watch videos about natural medicine and skepticism towards science and medication,” he illustrates.

If someone enjoys videos about healthy eating, “it can statistically, due to algorithmic correlations—and studies demonstrate this—cause us to start encountering content we never requested, but that begins to shape our opinion,” he elaborates.
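
As an aside that goes beyond the interview, the correlation chain he describes can be sketched as a toy co-occurrence recommender in Python; the topic names and viewing histories below are invented purely for illustration.

from collections import Counter
from itertools import combinations

# Invented viewing histories: which topics each (anonymous) user has watched.
histories = [
    {"healthy_eating", "chemical_harms"},
    {"healthy_eating", "chemical_harms", "natural_medicine"},
    {"chemical_harms", "natural_medicine", "vaccine_skepticism"},
    {"healthy_eating", "fitness"},
]

# Count how often two topics are watched by the same person.
co_occurrence = Counter()
for watched in histories:
    for a, b in combinations(sorted(watched), 2):
        co_occurrence[(a, b)] += 1

def recommend(topic, top_n=3):
    """Suggest the topics that most often co-occur with the given one."""
    scores = Counter()
    for (a, b), count in co_occurrence.items():
        if a == topic:
            scores[b] += count
        elif b == topic:
            scores[a] += count
    return [t for t, _ in scores.most_common(top_n)]

# A viewer of healthy-eating videos is nudged towards adjacent topics they
# never asked for, purely because of statistical co-viewing patterns.
print(recommend("healthy_eating"))

Nothing in this toy model understands the topics themselves; the suggestions emerge solely from who watched what alongside what.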

This is why “it is evident today that there is a certain regression in how we relate to science,” Adolfo Mesquita Nunes observes.

“We’ve never had so much access to knowledge, and today science is so easily discredited,” he laments, pointing out that “an anti-vaccine influencer has as much voice as, if not more than, a doctor who explains the side effects as well as the benefits of vaccines.”

It becomes “concerning when lies and truth circulate side by side or when the algorithmic ecosystem itself promotes falsehood,” he warns.

But this doesn’t happen because the algorithm knows it is false, as it is not human and lacks such capability.

“It simply notices that the content gets more clicks and, therefore, continues to promote it,” he explains.

Therefore, “if lies, or decontextualization, or half-truths are more enticing, those are what the algorithm will prioritize, not realizing they are false, simply because they are achieving more ‘reach’,” Adolfo Mesquita Nunes summarizes.
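
To make that mechanism concrete with a purely hypothetical sketch (not anything described by the interviewee), an engagement-driven ranker can be reduced to sorting by click-through rate; nothing in the data model or the scoring knows whether a post is true.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    impressions: int  # how many times the post was shown
    clicks: int       # how many times it was opened

    @property
    def ctr(self) -> float:
        """Click-through rate: the only signal this toy ranker uses."""
        return self.clicks / self.impressions if self.impressions else 0.0

# Invented feed items; truthfulness is never part of the data model.
feed = [
    Post("Measured, sourced explanation of vaccine side effects", 10_000, 300),
    Post("Shocking 'hidden truth' about vaccines", 10_000, 1_200),
    Post("Local weather update", 10_000, 150),
]

# The ranker simply promotes whatever earns more clicks.
for post in sorted(feed, key=lambda p: p.ctr, reverse=True):
    print(f"{post.ctr:.1%}  {post.title}")

Whichever item gets clicked most rises to the top, which is the dynamic he describes: the system optimizes for attention, not accuracy.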

As for the current regulation within the European Union, it doesn’t cover this area, although “it covers many others where it’s important to be involved, [but] not here.”
