
Should you fear artificial intelligence?

Of all the revolutions shaping our future, the one related to artificial intelligence (AI) is probably the most far-reaching. It is also the one that generates the most uncertainty. News about progress in this branch of science seems to alternate between the encouraging and the alarming. One day we learn of its use to prevent illegal hunting or to detect emerging infections; the next, we are reminded of the risks of misusing the technology.

To these news items we can add the open letters and communiqués of dissenting public figures such as Elon Musk or Bill Gates, who have warned of the danger of employing AI in the arms industry. And there are skeptics within the discipline who believe we should be wary of visions that are either overly apocalyptic or overly enthusiastic. With such a range of views, how is society's perception of this field being shaped?

Elisabet Roselló believes that all these narratives are reaching citizens, but that a large part of society remains almost entirely unaware of advances in artificial intelligence. This consultant in strategic innovation and social-trends research thinks that "the population may recall some anecdotal story that appeared on the TV news, but most will not even know those open letters exist".

For Roselló, the disputes arising from the use of AI "resemble those that emerged a few years ago around genetic research. There were opposing movements, such as religious ones, but the research has kept moving forward, always within a regulatory framework that, in the case of AI, still needs to be defined in detail". Frameworks which, in turn, should be rewritten, because, according to the founder of Postfuturear, "they do not work as they were designed two centuries ago. Non-state actors such as Google or IBM have a great deal of power and can go beyond these regulations."

Big companies that, along with universities, are creating many jobs related to artificial intelligence. Research in this area is strongly backed by businesses and governments. According to Francisco R. Villatoro, professor in the Department of Languages and Computer Science, in the area of Computer Science and Artificial Intelligence at the University of Malaga, there are "no cuts to AI research; on the contrary, several countries are betting more heavily on these technologies every day. And so are many technology companies". If there is fear of the advance of this technology, it does not seem to be reflected in much of the public and private investment. But Google did not renew its artificial-intelligence contract with the Pentagon. Was that a move to keep its corporate image away from the weapons business?

"Ideas that seemed far-fetched and unthinkable a decade ago, such as universal basic income, are already being put on the table today"

ICRAC, the International Committee for Robot Arms Control, supported the initiative with another open letter addressed to the company's leaders. In it, 1,179 researchers and scientists declared their solidarity with the more than 3,000 Google employees who asked the company to stop developing military technology and storing personal information for military purposes. Peter Asaro, one of the spokespeople for the ICRAC letter, believes that this surge of open letters against weaponized uses of AI stems from the fact that "the population is discovering the negative effects of these technologies after many years of unquestioned optimism". According to Asaro, "tech workers are also realizing the ways in which they may be complicit and are beginning to organize to shape the ethics of their companies."

The future

On the future of AI research, Asaro is blunt: "Stop research on autonomous weapons and establish regulations on their use. There can be many futures for AI research, and autonomy and target selection are just two of them." That, according to the researcher, would be a great step toward conveying to society as a whole that there are no dangers in the evolution of AI. "We also need greater transparency and accountability in the use of AI by large companies," he concludes, "about where it will stop, the assumptions on which they build, or the uses they intend to give it."

All in all, if there is fear of AI among the bulk of society, it seems to be related more to practical, day-to-day matters, such as the future of work. In that sense, two agents are particularly relevant in how the effects of AI are communicated: the World Economic Forum and Singularity University. "They are entities that promote economic evolution," says Roselló, "but they can fall into a certain ambiguity when it comes to communicating. Sometimes they resort to a show of terror, and this fosters a widespread fear of losing one's job among the least prepared people."

A fear that series and films never cease to convey through speculative or science fiction. Stories to which we should perhaps pay as much attention as to the news that reaches us from research centres or experts in the field. "Ideas that seemed far-fetched and unthinkable a decade ago, such as universal basic income, are already being put on the table today", says Guillem López, a science-fiction novelist. "The idea of a society in which there is no work for more than half the population is starting to become palpable. Perhaps, however, what we should be considering is the end of wage labour and private property as we have known them until now. This is where fiction portrays futures that are not only plausible but probable and, also, necessary".
