Biometric bias: Why isn’t facial recognition software built with transgender and non-binary communities in mind?

By Cindy White, CMO, Mitek Systems

For those who identify as transgender or non-binary, human-computer interfaces are almost never built with them in mind. Facial recognition software is a booming industry, projected to be worth USD 8.5 billion by 2025, and it is found in everything from banking apps to airport security. Yet instead of making life easier, these systems reinforce existing biases and discriminate against these communities.

After a Pride-filled summer, it’s high time to put our collective education on equality into practice in a world where digital access for all is a daily requirement.

Why biometric equality is important

A lack of algorithmic inclusivity in the tech industry has led to digital exclusion and unintentional bias, all stemming from unfairly built biometric systems.

Typically, facial recognition software determines a user’s gender simply by scanning their face and assigning a male or female label, based on data the machine has previously analysed and learned from. Superficial features such as the amount of makeup on the face, or the shape of the jawline and cheekbones, sort gender into a binary category. As a result, these systems cannot reliably identify non-binary and trans people or match them to their official identity. This is exacerbated by the fact that one third of those who have transitioned do not have updated IDs or records. As it stands, most facial recognition tech can’t accommodate these differences.
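To make that failure mode concrete, here is a minimal, hypothetical Python sketch of a binary classifier of the kind such systems rely on. The features, data and model are invented for illustration (real commercial systems are proprietary and far more sophisticated), but the structural flaw is the same: a model trained on two labels can only ever answer with one of those two labels.

```python
# Hypothetical sketch of a binary gender classifier. All features,
# data and labels are invented for illustration; real systems are
# proprietary and far more complex.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Invented "facial features": jawline width, cheekbone prominence,
# detected-makeup score. Training labels are forced into a binary.
X_train = rng.normal(size=(200, 3))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 2] > 0).astype(int)

model = LogisticRegression().fit(X_train, y_train)

# Whatever face is scanned, the model can only ever answer with one
# of the two labels it was trained on; identities outside that binary
# simply cannot exist in its output space.
new_face = rng.normal(size=(1, 3))
print(model.predict(new_face))  # always 0 ("female") or 1 ("male")
print(model.classes_)           # [0 1] -- the only possible answers
```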

This is not an issue isolated to one brand of technology. Several researchers have shown how ineffective facial recognition technology is at recognising non-binary and transgender people, notably Os Keyes in their paper on automatic gender recognition (AGR) systems.

This isn’t just a tech issue. It’s a societal failing

The inability to identify people within these groups has consequences in the real world. It can lead to people being mistreated, denied approval for financial products and services, or misidentified in their dealings with government or police. People who aren’t represented lose the ability to be acknowledged and to fight for their freedoms and rights.

Biometric technologies are not biased; the data is biased

The key to solving this begins with recognising that biometrics are not in themselves biased; they do not make decisions based on human values. Bias and inequality in biometric technologies are caused by a lack of diverse demographic data, by bugs, and by inconsistencies in the algorithms. For example, if the training data primarily covers just one demographic, the learning models will disproportionately focus on the characteristics of that demographic.
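As a rough illustration of that point, the toy sketch below trains a classifier on synthetic data in which one demographic makes up 95% of the training set. Everything here is invented, and real biometric pipelines are far more complex, but the pattern it shows is the general one: overall accuracy can look healthy while accuracy for the under-represented group drops towards chance.

```python
# Toy illustration of training-data imbalance using invented data:
# when one demographic dominates the training set, headline accuracy
# hides a collapse in accuracy for the under-represented group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_group(n, shift):
    # Synthetic "faces" for one demographic; `shift` moves the feature
    # distribution so the two groups differ systematically.
    X = rng.normal(loc=shift, size=(n, 4))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

# 95% of the training data comes from group A, 5% from group B.
Xa, ya = make_group(950, shift=0.0)
Xb, yb = make_group(50, shift=1.5)
model = LogisticRegression().fit(np.vstack([Xa, Xb]),
                                 np.hstack([ya, yb]))

# Evaluated separately, group A scores well; group B is near chance.
for name, shift in [("group A", 0.0), ("group B", 1.5)]:
    X_test, y_test = make_group(500, shift)
    print(name, "accuracy:", round(model.score(X_test, y_test), 2))
```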

To build equitable access for all communities using biometrics, we need to think through which layers of biometrics are appropriate for proving identity, and which are most appropriate for proving that a person is real. Organisations must think beyond the current framework.

Inclusive technology starts with the design process

Though technology cannot fully solve biases against minority groups, there are opportunities to create new technologies that may help address some of this discrimination. The key is to ensure that trans and non-binary people are involved in the design process. Researchers from the Universities of Michigan and Illinois conducted design sessions with 21 members of the community and detailed four types of trans-inclusive technologies: those for changing bodies, changing appearances and/or gender expressions, safety, and finding resources. The result: putting trans people at the centre of the design process made the designs more inclusive.

Inclusivity starts with understanding minority challenges

Facial recognition tech is everywhere, but for those who identify as transgender or non-binary, human-computer interfaces are almost never built with these communities in mind. We all need digital access. Shopping online, education, healthcare, dating apps: many of us need these newly digitised services to live a full life. Today’s expectation is that technology solutions are unbiased, yet bias continues to be a widely discussed issue in this field, and one which must be urgently addressed. With more Britons than ever before identifying as LGBTQ+, it’s important that the broader ecosystem pays attention to minority communities. We could start by better understanding the challenges they face, and by working as a community to offer alternative digital solutions.

Listen, learn and act

Success starts with listening before acting. To create a fairer, more equal playing field in how artificial intelligence (AI) technologies impact our lives, we must listen, learn and act. As society strives for equality, safety and fairness for everyone, it’s crucial that those who will benefit most from these technology changes are at the heart of the design process.