“We have never had such access to knowledge, and never has science been so discredited.”

“There is a certain illusion among people, even in the debates I take part in on this topic,” says the author of the book Algoritmocracy, which is going into its second edition. “They claim: no, I am not shown only the politicians I like; it’s not true that I’m trapped in a bubble, because I constantly see content from the other political side.”

The author, a partner in the Public Law practice at Pérez-Llorca, questions whether this content from the opposing political side is sensible and reasonable or sheer nonsense.

“Being in a bubble doesn’t mean hearing only the politicians we favor and align with. It also means hearing the more caricatured, more radical side of the opposition, to the point that we build an image of the other side as almost illegitimate: they say so many nonsensical things that it seems impossible for a reasonable person to support such ideas,” he explains.

Adolfo Mesquita Nunes suggests that “people often do not realize that the bubble is not about seeing homogeneous content all the time.”

The bubble, the former Secretary of State says, “is about clustering groups of people who are resentful, fearful, or who share certain sentiments about the same topics.”

This is where AI comes in, “because the algorithms, with the knowledge they accumulate about us” – given that the more time spent online, the more data is collected – “create millions of correlations.”

“By making statistical correlations, they determine that if someone likes watching videos on healthy eating, they may also like videos on the dangers of certain chemicals; and if they are interested in videos on the dangers of chemicals, they might enjoy videos about natural medicines and skepticism towards science and medicine,” he exemplifies.

Thus, if someone is inclined toward videos on healthy eating, “they can statistically, due to algorithmic correlations – as shown by studies – begin to access content that they never requested but that will shape their opinion.”
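To make the mechanism concrete, here is a minimal sketch in Python of the kind of co-occurrence correlation he describes; the watch histories, topic names, and scoring below are purely illustrative assumptions, not taken from the book or the interview.

```python
# Minimal sketch of correlation-based recommendation: from users' watch
# histories, count how often topics co-occur, then suggest topics that
# frequently accompany what a user already watches.
# All histories and topic names are illustrative only.
from collections import Counter
from itertools import combinations

histories = [
    {"healthy eating", "chemical dangers"},
    {"healthy eating", "chemical dangers", "natural medicine"},
    {"chemical dangers", "natural medicine", "science skepticism"},
    {"healthy eating", "fitness"},
]

# Count pairwise co-occurrences across all histories.
co_occurrence = Counter()
for history in histories:
    for a, b in combinations(sorted(history), 2):
        co_occurrence[(a, b)] += 1
        co_occurrence[(b, a)] += 1

def recommend(watched: set[str], top_n: int = 3) -> list[str]:
    """Rank unwatched topics by how often they co-occur with watched ones."""
    scores = Counter()
    for topic in watched:
        for (a, b), count in co_occurrence.items():
            if a == topic and b not in watched:
                scores[b] += count
    return [topic for topic, _ in scores.most_common(top_n)]

# A user who only watches healthy-eating videos is nudged, step by step,
# toward content they never asked for.
print(recommend({"healthy eating"}))
# e.g. ['chemical dangers', 'natural medicine', 'fitness']
```

Nothing in the scoring asks whether a topic is sound; the chain of recommendations is driven entirely by which interests happen to cluster together in other people’s histories.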

This is why “it is evident today that there is a certain regression in the way we relate to science,” Adolfo Mesquita Nunes observes.

“Never have we had such access to knowledge, and yet science today is so easily discredited,” he laments, pointing out that “an anti-vaccine influencer has as much or more voice than a doctor explaining both the side effects and the beneficial effects of vaccines.”

It becomes “concerning when falsehoods and truths circulate side by side or when the algorithmic ecosystem itself promotes falsehoods,” he warns.

However, this does not happen because the algorithm knows the content is false: it is not a person, and it lacks that capacity.

“It simply observes that this content gets more clicks and, therefore, continues to promote it,” he explains.

Indeed, “if the falsehood, miscontextualization, or half-truth is more enticing, that is what the algorithm will favor, not because it knows it is false, but simply because it is the one with more ‘reach,’” summarizes Adolfo Mesquita Nunes.
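A minimal sketch of what such engagement-only ranking looks like in practice, again with invented posts and numbers: the score is built from clicks alone, so the question of truth never enters the formula.

```python
# Minimal sketch of engagement-only ranking: the score reflects clicks per
# impression and nothing else, so whatever is clicked more gets shown more.
# The posts and figures below are illustrative only.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    clicks: int
    impressions: int

    @property
    def click_rate(self) -> float:
        return self.clicks / self.impressions if self.impressions else 0.0

posts = [
    Post("Doctor explains vaccine side effects and benefits", clicks=120, impressions=10_000),
    Post("Influencer: the ONE thing doctors won't tell you", clicks=900, impressions=10_000),
]

# The feed is ordered by engagement alone; accuracy never enters the ranking.
feed = sorted(posts, key=lambda p: p.click_rate, reverse=True)
for post in feed:
    print(f"{post.click_rate:.1%}  {post.title}")
```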

The current regulation in the European Union does not address this aspect: “it addresses many other important areas, [but] not this one.”
