This Thursday I have the good fortune to moderate the panel on new technologies at the Annual Equality Congress of the Madrid Bar Association, our Summit of Women Jurists.
As every year, successful women have answered our call and lend their experience to the aims we pursue: detecting and reporting problems and barriers, proposing and demanding solutions, promoting and nurturing initiatives, spotting and seizing opportunities. We promise to disseminate the findings.
On this occasion we will have not only lawyers from the world of consulting and human-resources recruitment, but also STEM women (the English acronym for science, technology, engineering and mathematics). The cross-cutting vision they will offer us, drawn from their deep knowledge of science, and especially of artificial intelligence, applied to the business sector and seen from a gender perspective, will not only enrich the audience but also show how long the road ahead still is.
With the proposal for a Regulation of the European Parliament and of the Council establishing harmonised rules on artificial intelligence on the table since April of this year, governments, developers, companies, lawyers and public opinion are waiting expectantly. In its explanatory memorandum, the draft makes express reference to the possible discriminatory consequences on grounds of sex arising from technical inaccuracies in artificial intelligence systems for remote biometric identification, and in technologies applied to education, vocational training and employment, among others.
The interpretation of facts and people by a machine, one that also learns from its mistakes, is enormously complex, as complex as the human being itself, and that is why this text moves us towards an obligation to ensure and guarantee that the biases of high-risk artificial intelligence systems are "monitored, detected and corrected" through responsible data governance.
But before the great success that I am sure the congress will be, today I want to tell you about a great failure. Or rather two: mine (certainly) and the system's (presumably).
I had prepared a surprise for my fellow guests with great enthusiasm. Weeks ago I formally asked dozens of Madrid schools to send us, at the Madrid Bar Association, drawings made by their students depicting a person who works in computer science and a person who works in the legal world.
I intended to display these illustrations on the big screen during the congress: which sex the citizens of the future most associate with the technological and legal professions, and whether the bias shifts across the stages of childhood, pre-adolescence and adolescence.
I wanted to draw statistical conclusions, by age range, by sex, by differences between types of school, and present them at our equality celebration. A serious study with relevant conclusions, no less. I wanted to proudly name the schools that had helped us. That is what I told them, and there, I think, the system failed me.
No school was willing to take the litmus test: none of the public, private or state-subsidised schools invited. None even replied. Why? Here is my theory: none of them wants to sit a test that might conclude that its students do not view the professions through a lens of equality. I think that, far from seeing it as an exercise in self-evaluation, and although they have excellent professionals, they saw it as a potential risk of bad publicity. And in all honesty, hand on heart, I understand them. Equality is too hot a topic right now, too easily picked up and dropped, with too many currents and too many feminist authorities deciding which people and institutions deserve the "women friendly" badge.
New times bring new jobs and new ways of practising historic professions, such as law. Bringing young people close to new technologies, and having them see these as a way of life, needs far more impetus than adding a digital-skills subject or using tablets in geography class. It is about showing them female role models from every field. Introducing them to the business world. Letting them get to know entrepreneurship and the reality of the market while they learn the fundamentals of economics. Curricula must be reviewed and updated every year, technological innovations introduced, and the wonderful teaching staff of our country supported in keeping up to date.
What is more, statistics show that in the most extreme cases of female subordination (gender violence, discriminatory or precarious working conditions, sole responsibility for raising children), economic dependency acts as an anchor that prevents a woman from escaping the spiral she is caught in, or makes it extremely difficult for her. The lack of qualified training constitutes an insurmountable barrier to entering or returning to the labour market, placing these women in a group at risk of exclusion that keeps growing, especially in the delicate post-COVID situation that affects us all.
Let us raise independent, educated young people who can compete in the professions of the future. Support, yes, but self-criticism too. Let us keep a close eye on how the new artificial intelligence tools can help or harm the rights of our contemporaries and our descendants; let us fight tooth and nail so that recruitment and performance-management platforms do not work against them, so that those platforms make them professionally visible and value them fairly; but let us not stop, not for a single day, training them in healthy competitiveness.
As a society, to keep fighting for real equality, we should apply what in artificial intelligence is called Bayes' law of inverse probability: take all the data, analyse it, and draw conclusions about the most probable cause that generated it. We want each other alive, in the market too, because autonomy is power and empowerment takes us away from imposed roles.
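For readers curious about the rule alluded to here: the "inverse probability" idea is Bayes' theorem, which lets us reason backwards from observed data to its most probable cause. A minimal, generic statement (the symbols below are illustrative, not from the article):

```latex
% Bayes' theorem: probability of a cause C given observed data D
P(C \mid D) = \frac{P(D \mid C)\,P(C)}{P(D)}

% The most probable cause given the data (P(D) is constant over C):
\hat{C} = \arg\max_{C} \; P(D \mid C)\,P(C)
```

In the article's terms: gather all the available data, then infer which underlying cause most probably produced it, rather than arguing from the effects alone.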
Esther Montalvá, Deputy for Digital Affairs of the Governing Board of the Madrid Bar Association
Errata