Language is a powerful tool, capable of capturing the essence of cultures and the richness of human expression.

As a presenter, I have spoken at various events about Indigenous Accessibility and how systemic challenges, such as compliance rules and software limitations, create barriers to developing accessible design solutions for Indigenous audiences.

As an accessibility advocate who works with Indigenous communities, I include words from Indigenous languages in many of my presentations. Consider the Anishnaabemowin word “Maamwizing,” which roughly translates to “coming together” in English. Each element of the word is thoughtfully woven together to construct a profound meaning that goes beyond a simple translation or pronunciation in the English language. Existing screen readers, however, lack support for Indigenous languages. The result is inaccurate renderings that fail to capture the nuance and power of these words’ meanings, reducing them to English approximations that are not only incorrect but also less meaningful overall. This loss brings to light the inherent deficiency in the current absence of screen readers equipped to comprehend Indigenous languages.

However, the problem with biased software doesn’t stop there.

In fact, my experiences with automated captioning software during my presentations have revealed a troubling issue – the software consistently misinterprets and misrepresents Indigenous words, leading to a loss of meaning and cultural understanding. In this article, I will share my journey and highlight the critical importance of meaningful captions in preserving linguistic diversity and cultural heritage.

Preserving Indigenous Languages and Cultures

Indigenous languages, including Anishnaabemowin, are repositories of cultural wisdom and unique ways of knowing the world. However, these languages have a history marred by discrimination and erasure. By overlooking Indigenous languages in software development, we perpetuate their exclusion, further contributing to the loss of linguistic diversity and cultural heritage.

Elder Stan Peltier aptly noted, “Language is a guide and concept. Finding English equivalence for concepts may mean it’s lost in translation.” Preserving Indigenous languages and cultures should be a collective responsibility to foster a more inclusive and accessible society.

The Impact of Automated Captioning on Language and Meaning

Automated captioning software has transformed the way we capture and process spoken content. However, when it comes to Indigenous languages such as Anishnaabemowin, these tools face significant challenges. The complexity and intentionality of these languages often elude the software, resulting in misinterpretations that alter the fundamental meaning of words. For instance, the word “Maamwizing” has been rendered as “mom was in,” “mom was saying,” or even (my worst nightmare) “mom whizzing.” This distortion not only misrepresents the word’s significance but also disregards the intricate nuances embedded within it.
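To illustrate how brittle a workaround for these mis-transcriptions would be, here is a minimal, hypothetical Python sketch of a post-processing pass that maps the known mis-captions above back to the intended word. The mapping, names, and approach are mine for illustration, not part of any real captioning product, and this only patches words you already know will be mangled.

```python
import re

# Illustrative sketch only: a post-processing pass that corrects known
# mis-transcriptions of an Indigenous word in caption text. This is a
# stopgap for a vocabulary you know in advance, not a fix for the
# underlying bias in the captioning model itself.

# Mis-captions drawn from the examples above; this dictionary and the
# function below are hypothetical.
CORRECTIONS = {
    "mom was in": "Maamwizing",
    "mom was saying": "Maamwizing",
    "mom whizzing": "Maamwizing",
}

def correct_captions(text: str) -> str:
    """Replace each known mis-transcription, ignoring case."""
    for wrong, right in CORRECTIONS.items():
        text = re.sub(re.escape(wrong), right, text, flags=re.IGNORECASE)
    return text

print(correct_captions("The theme of the gathering was Mom Whizzing."))
# → The theme of the gathering was Maamwizing.
```

The limitation is obvious: such a pass can only repair mis-captions someone has already catalogued, while a live captioner who has been briefed on the material can handle new words and context on the fly.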

The Consequences for Audience Understanding and Engagement

The consequences of inaccurate captions are felt acutely by the audience, especially those who rely on live captions to understand the presentation. They encounter a barrier that obstructs their grasp of the true meaning and disconnects them from the cultural essence I am trying to convey. This experience underscores the importance of raising awareness about the limitations of automated captioning software and the need for more accurate and respectful handling of diverse languages and cultural expressions. Meryl Evans has also discussed the limitations of automated captioning software in depth.

Addressing Bias in Software Development

Automated captioning tools are products of their development environment. If this environment lacks inclusivity and representation, the software will inherently be biased toward those it is designed for. Indigenous Peoples and languages have often been left out of this process, leading to the inaccuracies we have observed, and continue to observe, in auto-captioning. To rectify this, we must involve Indigenous communities and linguistic experts in software development to ensure a more balanced and inclusive representation.

An alternative to automated captioning software is the use of live human captioners. Live Captioners have an edge over automated tools when it comes to interpreting context: even when something comes up on the fly in conversation during a presentation, a human is far better at recognizing that “mom whizzing” is unlikely to be the correct caption given the subject matter.

Recommendations for More Meaningful Captions

While I am not fluent in Indigenous languages, my experiences have taught me the importance of involving Live Captioners who can comprehend the nuances of the language and respect its cultural significance.

Andrew Olson of A11yTalks suggests that “By using A11yTalks non-profit funding wisely, we provide human captioning through the company AI-Media (or other reliable captioning [agencies]) and request presentation materials before events to avoid these situations. Also, A11yTalks requests speakers to sign on 30 minutes before the event to handle these types of scenarios and ensure speakers are ready to share their materials in an accessible way with the audience.”

Engaging in pre-presentation conversations with a Live Captioner provides an opportunity to explain the intention behind specific words and phrases, ensuring more meaningful and accurate captions.

Mary Birnbaum of AI-Media also gives this advice: “CART provider/live captioner can be sent presenters’ presentation documents in advance of the event to be captioned. This advance information is often so helpful for captioners and often addresses many of the questions or corrections that can come up in the 30 mins pre-conference set-up time. A captioner who has been prepped will likely arrive to the session with any questions they have, having reviewed the presentation, which allows set-up time to flow efficiently.”

“Our organization sets high expectations for the benefit of the audience,” explains Andrew Olson of A11yTalks. “A11yTalks takes this responsibility seriously and tries to be clear about it on our website.”

Preserving linguistic diversity is not just a matter of respecting languages; it is about embracing the identities and cultures they embody. Automated caption software, while convenient, poses challenges for Indigenous languages like Anishnaabemowin, distorting their meanings and erasing their significance. We must advocate for more inclusive software development processes and prioritize meaningful captions, because doing so ensures that no language or culture is left behind. By cherishing linguistic diversity, we foster a more empathetic and inclusive world, where every voice finds its place in the symphony of human expression.

Special thanks to Mary Birnbaum of AI-Media and Andrew Olson of A11yTalks for their insights on this topic!

Meggan also spoke with Mashable about this topic. You can read that here!