Why should we care so much about Pegasus when we are constantly being watched?

A curious phenomenon has been unfolding in recent months: the constant news about hybrid threats, whether exerted by certain governments or through tools such as the Pegasus spyware, has led many citizens to distrust technology, to wonder whether they are running risks they are unaware of, and to seek out information on how to protect their privacy in such a hostile environment.

And yet we never cease to be amazed at how these same citizens make day-to-day decisions that are anything but prudent when it comes to privacy, data protection, and the guarantee of their rights and freedoms.

This is known as the privacy paradox: a situation in which technology users are concerned about their privacy, yet their daily decisions do not reflect that concern, and often place sensitive data within reach of third parties such as banks, media companies, telecommunications operators and public administrations.

This lack of coherence was first explained by ignorance: citizens cared about their privacy and wanted to protect it, but they could not act in line with that concern because they did not understand the risks involved in certain decisions, such as installing an app, registering on a platform, or giving consent.

Over time, the media, consumer and user associations, public figures, academics and others have contributed to raising awareness through their publications and activities. More information is now available about data-driven business models and the economics of surveillance, as well as about the risks involved in certain decisions. So why hasn't the way we use technology changed?

This is where new explanations from anthropology, psychology and sociology come in. The average citizen, apparently rational, makes a calculated decision, weighing the risks they take against the benefits they receive.

Except in the rare cases of users who are extremely protective of their privacy, this calculation tends to come down in favor of the app everyone else has already installed, the platform with so many features, or the consent that unlocks that supposedly free service you were so eager to use.

All the players in the surveillance economy are aware of these contradictions and know how to exploit them, encouraging users to believe that they remain in control of their data.

What is at stake is made to seem unimportant, often hidden behind (dubious) informed consents and endless privacy policies and terms of use that nobody ever reads. And if you don't read them, it may be out of impatience or even carelessness: perhaps you have nothing to hide, and you trust the provider that will be processing your data. What illegitimate interest could they possibly have in it? Note the irony.

Against the perceived risk stands the perceived benefit: I would have to be a fool, a paranoiac or a conspiracy theorist to give up the many advantages offered by that large technology company, that newspaper, that bank or that streaming platform. What I hand over in exchange, I do not see, nor what is done with it afterwards. In some cases I may suspect it; in others I may not.

And in case we need an extra push, interfaces, pages and tools are designed to give it to us. These are the so-called dark design patterns, or dark patterns: options pre-selected by default, sign-up processes that are easy while unsubscribing is hard, or not asking for our credit card number until the very last moment, just as we are about to taste the caramel...

So do we accept surveillance? Do we surrender to it? At one point it was advocated that citizens should be able to choose whether to pay for technology with their personal data or with money, much as with traditional products. But this carries a risk: privacy could become a luxury available only to those who can afford it.

Work is currently being done on two fronts. The first is to change business models, so that companies that profit directly or indirectly from the commercialization of our data face transparency requirements and incentives to move towards ethical models that respect users' rights and freedoms.

The second is to empower citizens to make their own decisions and to remain in control of them: educating them to be digital citizens who are aware of their rights and of the potential consequences of their choices, and equipping them with the tools they need to make informed decisions. These include trust seals and graphic labels, mechanisms for automating privacy-friendly configuration, non-compliance monitoring systems, access and transparency dashboards, and consent receipts (our latest publication reflects just some of the significant advances being made in this area).

Only by making these tools widespread and accessible to all citizens will those who benefit from our data feel the need to act differently.

This article was published in "The Conversation".
