
"We already trust machines more than humans themselves"

Gemma Galdon, an analyst and researcher, founded Eticas Consulting, which audits algorithms to help companies identify biases in the data they use, with the aim of preventing discrimination and unequal opportunities and improving transparency.

- What was the inspiration for Eticas Consulting?

- Almost ten years ago, after I finished my thesis, funding for my research kept coming in. That was rare: it was not common for someone working on technology issues, so young and without a permanent position at the university, to attract that kind of support. After I had raised a million euros in external funding, the university invited me to leave. I took the projects with me and created a legal entity to manage them. Once they were under way, I hired people, and we kept raising money to study the effects of technology on individuals. After several years, I decided to become more proactive and respond to the demand for research that helps us understand the impact of technology on society.

- What were your difficulties?

- I always say we have been extremely fortunate. We have attracted the interest of many public and private actors without even seeking it; it was something I had never planned for. I have been overwhelmed at times by the responsibility of running an organization with an economic dimension. It has also been hard for me to find support networks that understand what it means to be a woman entrepreneur. You are not born with an entrepreneurial spirit. And it was not always money I was asking for; sometimes I just needed help.

- Why did it take you so long to start?

- Out of responsibility. Projects are like your children. I did not have an entrepreneurial streak, but I had committed to seeing the research through. These ten years have brought many challenges. We are no longer in "do-or-die" mode, and I believe algorithmic auditing should become a standard. Bad algorithmic decisions affect people, even though the harm is often invisible. That is why doing what we love is so rewarding.

- How do you know if there are problems with the algorithm?

- Through its sociotechnical characteristics. Many people approach technology from the outside; I approach it from the inside. I could see there were problems, but I also realized that changing the way systems are coded can eliminate those social impacts. These problems are usually caused not by bad faith but by ignorance: nobody told the engineers that, if they did not take specific groups into account, the system would discriminate against them.
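
To make the idea of checking a system for group-level discrimination concrete, here is a minimal sketch in Python of one kind of test an audit might run: comparing a model's approval rates across demographic groups and flagging large gaps. The group labels, data and the 0.8 threshold (the common "four-fifths" heuristic) are illustrative assumptions, not Eticas Consulting's actual methodology.

```python
# Illustrative sketch of one check an algorithmic audit might run:
# compare a system's positive-outcome rate across demographic groups.
from collections import defaultdict

def approval_rates(decisions):
    """decisions: list of (group, approved) pairs -> approval rate per group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def flag_disparate_impact(decisions, threshold=0.8):
    """Return groups whose approval rate is below `threshold` times the
    best-treated group's rate (the 'four-fifths' heuristic)."""
    rates = approval_rates(decisions)
    best = max(rates.values())
    return {g: r for g, r in rates.items() if r < threshold * best}

# Toy usage with made-up data: group B is approved far less often.
sample = ([("A", True)] * 80 + [("A", False)] * 20
          + [("B", True)] * 50 + [("B", False)] * 50)
print(approval_rates(sample))         # {'A': 0.8, 'B': 0.5}
print(flag_disparate_impact(sample))  # {'B': 0.5} -> flagged
```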

- What are the obstacles?

- Biases are a constant, and there is much talk about them. Algorithms are designed to discriminate, to bias; the question is against whom and how. Algorithms reward whatever is most common in the data. Banking systems, for example, are built on historical data and keep rewarding men: looking at "big data" from the past, the system decides that the man is the best client and the woman the riskiest one. Men end up getting ten times more credit, even though women's records are better and they are better payers. And nobody cleans the data to prevent this statistical discrimination.
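
As a toy illustration of the point about historical data, and assuming nothing about any real bank's system: a scorer that simply reuses past approval rates reproduces past discrimination, no matter how good the new applicant's own record is. All names and numbers below are made up.

```python
# Toy illustration (not any real credit model): scoring applicants by the
# historical approval rate of their group carries the old bias forward.
historical = {
    # group: (past approvals, past applications) in the training data
    "men":   (900, 1000),
    "women": (300, 1000),
}

def historical_score(group):
    """Score an applicant by the past approval rate of their group."""
    approved, total = historical[group]
    return approved / total

applicants = [
    {"group": "women", "on_time_payments": 0.99},  # the better payer
    {"group": "men",   "on_time_payments": 0.80},
]

for applicant in applicants:
    # The better payer still gets the lower score, because the score
    # only reflects who was favoured in the past.
    print(applicant["group"], historical_score(applicant["group"]))
# women 0.3
# men 0.9
```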

- Can these biases be fixed? Is that a matter of technology?

- Technology is both the problem and the solution. It would be wonderful if more people acquired digital skills and engineers were more humanistic. Artificial intelligence and data are still very much an engineering domain, but engineers should understand the social impact of what they build, and that requires multidisciplinary teams. This is something I have seen over ten years: trust between technology and society has been lost. Technology has been a great abuser of citizens. Now we need to reclaim that space and make better technology.

- Are people aware of this?

- No, because there is so much opacity around these processes. It is important to know when we are subject to automated decisions. I believe that awareness will come.

- And the companies?

- Neither. There may be no bad faith, but there is ignorance. Technology is believed to be magical, as if it could not make mistakes. We already trust computers more than we trust humans, and we do not protect ourselves against the bad decisions made by algorithms. Some algorithms are of poor quality, badly programmed, and they make poor decisions.

Gemma Galdon has received numerous awards for her professional achievements. "They are very important, but they are not enough," she says. "More is needed for women to feel at ease. Each project is a reward for me, but we lack female ecosystems. We need more women in these positions."

Some things can change quickly; others will take longer. "We must be more visible, trust ourselves to lead and be there for other women." She also argues that women's entrepreneurship needs to be better understood. "Ethics in code is a field led by women, but in Spain it is not recognized as innovation because it is not a new device. That leaves you out of a lot of funding and aid," she says.

Galdon insists that innovation is not only about devices; it also happens in the socio-technical space. She adds that many women are improving processes and thinking outside the box. "And you have to give visibility to that," says Galdon.
