
Regulate to not be regulated

About a year ago, the European Commission published a proposal for a regulation on Artificial Intelligence.


The proposal is part of an apparent race by the European Union to regulate the digital and technological environment, driven by many factors, among them a global context of political and social upheaval that, rightly or wrongly, has been attributed, among other things, to the digitization of public discourse and social life.

This great regulatory effort has produced important proposals such as the Digital Services Act (DSA) and the Digital Markets Act (DMA), but the Artificial Intelligence Regulation is different in nature. While those other proposals essentially regulate the business or management of certain technologies and digital environments, such as social networks, the Artificial Intelligence Regulation goes further: it prohibits or restricts the very development, marketing and use of certain technologies. In other words, the proposal intends to regulate the technology itself, not only the results of its use.

Here lies the first problem with the proposal. Regulating technology is a daunting, difficult and often useless task, especially when we are talking about products that, as with Artificial Intelligence, in many cases are just software or lines of algorithms. Add to this the characteristic opacity of those algorithms, and the difficulty of clearly identifying failures in their use, or of assigning responsibility for damage caused by incorrect or improper use, and what we get is a highly regulated and bureaucratized environment, but not necessarily one that leads to effective social protection.

Furthermore, the European Union's rush to regulate vast aspects of digital life appears to be motivated by fear. Fear of technology itself, fear of being left behind in the digital economy, and fear of regulatory and political fragmentation that undermines the very essence of the common European market and of the European Union itself, which would be supplanted by the immense power and size of so-called big tech.

Having fallen behind the US and China in the technology race, Europe is not the birthplace of any of the digital giants and, to some extent, finds itself hostage to a digital culture shaped by foreign values and legal contexts which nevertheless project and exercise enormous power over both its economy and its society. Given this enormous concentration of economic and political power in the big techs, and the fact that, by the very nature of their activities, this concentration tends only to grow, the European Union has been forced to react. And it has reacted in the way it knows how: regulating de jure so as not to be, itself, regulated de facto.

But fear is not a good legislative adviser. It generally leads to the hasty creation of regulations that are counterproductive or very difficult to apply and that, by generating a false sense of security, can leave society even more vulnerable, quite the opposite of what is intended.

This does not mean that we should not seek some form of control over the use of new technologies; if there is one thing that the turbulence of recent years has taught us, it is that society needs mechanisms to deal with the huge power imbalance between technology companies and citizens. The European proposal is a first effort, but the technological changes we will face in the coming years will require a much broader and more creative spectrum of action, beyond mere generic restrictions in bureaucratic regulations.
