Technology
AI
Christina Blacklaws considers if we are sleepwalking into a nightmare world of technological bias. Christina is the former President of the Law Society of England and Wales and a business adviser to a range of boards and businesses on transformational change, technology development and diversity and inclusion.

The future is biased?

There's fascinating research which shows that, from stab vests to the size of our mobile phones to crash test dummies, the world is basically designed for men.1 The recent debacle about PPE in hospitals being designed for the typically larger frame of men, despite a very high proportion of hospital staff being female, shows this problem hasn't gone away, despite being the subject of a detailed TUC report in 2017.2 Women are the majority of the global population but by no means the dominant force. And this rings true in the UK, in our world – the law – where, although we women make up more than half of practising lawyers, only 31% are partners and an even smaller number are business owners.

I wanted to get to grips with this when I became President of the Law Society of England and Wales. So, we conducted the largest ever global survey and completed 250 roundtables in 19 countries to discover why female lawyers were not getting to positions of seniority commensurate with our desires, our knowledge, our experience and our sheer numbers.3

And what did we find? Systemic bias. I'd assumed that the lives of women lawyers across the globe would differ wildly, but our research found that the challenges and experiences of female lawyers around the world are very similar. We face social and cultural expectations about how we should behave and what we can and should do because we are female. In every country, this gender stereotyping prejudiced working women, particularly those who had children, and prevented
capable, excellent, committed women from realising their ambitions.

How wrong is it that, today, anyone is unable to progress because of their gender, race or any other protected characteristic? Surely we must all agree that talent is equally spread and that everyone should have the same chances and opportunities as everyone else – but, in an esteemed profession like the law, we found that not to be the case for women.

This is all really concerning, but what I want to focus on is how it could get an awful lot worse! Most of us live with the hope and expectation that things will improve with the greater use of technology – that computers are objective, unbiased and fair, and that their decisions can be relied on. However, I'm worried that, as we move to an increasingly automated, data-driven world where the computer decides almost everything, all the bias that we know exists could be hard-wired into future decision-making. Bias, frankly, blights our world and makes it an unfair and unkind place for people who aren't from the dominant culture – and this includes women, people of colour and the disabled, to name but a few. So, the stakes are high, and the not-too-distant future could be really scary.

First off, let's understand how computers make decisions

The computer is programmed with a set of instructions called algorithms. The algorithms are fed information so they can learn and understand. They are then able to apply those instructions to any other information and come up with answers. Well, what's wrong with that, you may ask? It all seems perfectly fair, reasonable and objective. But there are two big problems here, as the sketch after this list illustrates:

■ how the algorithms are programmed, and by whom;
■ the information – the data – they are fed on.
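To make the point concrete, here is a minimal, purely illustrative sketch in Python. It is not taken from any real legal or HR system – the records and the toy "learner" are invented assumptions. An algorithm is fed historical promotion decisions that were themselves biased, and its answers simply reproduce that bias for two equally qualified candidates.

```python
# Hypothetical illustration only: the data and the "learner" are invented.
from collections import Counter

# Historical promotion decisions. Everyone has identical experience,
# but past outcomes favoured one group - that imbalance is the bias
# baked into the data the algorithm is fed on.
training_data = [
    {"experience": 10, "gender": "male",   "promoted": True},
    {"experience": 10, "gender": "male",   "promoted": True},
    {"experience": 10, "gender": "male",   "promoted": True},
    {"experience": 10, "gender": "female", "promoted": False},
    {"experience": 10, "gender": "female", "promoted": False},
    {"experience": 10, "gender": "female", "promoted": True},
]

def learn(records):
    # "Learning" here simply means counting past outcomes for each group.
    outcomes = {}
    for record in records:
        outcomes.setdefault(record["gender"], Counter())[record["promoted"]] += 1
    return outcomes

def predict(model, candidate):
    # Predict by repeating whatever decision was most common for that group.
    counts = model[candidate["gender"]]
    return counts[True] > counts[False]

model = learn(training_data)

# Two equally qualified candidates receive different answers, because the
# algorithm has memorised the prejudice already present in its training data.
print(predict(model, {"experience": 10, "gender": "male"}))    # True
print(predict(model, {"experience": 10, "gender": "female"}))  # False
```

The choice of a simple counting "learner" rather than a real machine-learning system is deliberate: it keeps the sketch short and makes visible that the computer's "decision" is nothing more than a reflection of how it was programmed and of the data it was given.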