On Love & Data: To Be Heard Without Limitation

What—and whom—does technology recognize?
Still from the Dark Matters: Bias in Speech AI roundtable at Pioneer Works, June 13, 2021.

Editor's Note: On Love & Data: To Be Heard Without Limitation responds to Johann Diedrick's Dark Matters: Bias in Speech AI roundtable, hosted at Pioneer Works during June Second Sundays on June 13, 2021. A video of the event follows Dinkins's text.

As we
(who is this “we” shaping the future?)
create, increasingly rely on, and live with advanced algorithmic technologies,
those technologies shape us in turn.

Johann Diedrick kicked off his Dark Matters: Bias in Speech AI roundtable with an account of his accented parents being misunderstood and left unanswered by Alexa, a popular virtual talking assistant. In his account, Alexa misheard his parents’ requests and often offered demoralizing retorts of acknowledgment without understanding them. Such outcomes are common across digital assistants, which are not yet capable of deciphering much beyond middle-ground, vanilla versions of the languages they are trained on. This leaves those whose language use skews from the average feeling dejected and on the outside of what is deemed acceptable by the powers that be.

Roundtable participant Jennifer Lynn Stoever, an associate professor of English at the State University of New York at Binghamton, summed up the problem nicely when she stated, “machine listening as a whole is primarily designed by US-based, white male speakers and listeners of English who have coded their biases, desires, and preferences into AI, claiming it to be a universal technology.” We, however, understand that no technology is neutral. Its constructors, to date primarily white men, put great effort into defining and enforcing the norm—their norm.

This “norm” is used as a weapon to subvert other ways of speaking, being, and knowing. Understanding different dictions, for example, requires patience, deep listening, and the capacity to imbibe lyricisms different from one’s own individual or collectively normalized expressions. By calibrating our technologies to a corrupted, misrepresentative mean modeled on white desire and culture, we too often reinscribe a techno-supremacy that relies on and propagates messages, subtle and (in the case of Johann’s parents) not so subtle, that these tools are not intended for those who fall outside the average user as defined by white men, white institutions, and the status quo they maintain.

A question that keeps coming up in my own practice, and that seems to be a core concern of Johann’s, is: can we build care, support, and generosity into algorithmic systems that are becoming our de facto civic, governmental, and social infrastructures? This algorithmic care must extend far beyond the push toward capital, the mimetic, and the resolution of surface incompatibilities such as nonstandard accents. Natural language processing (NLP) systems must be challenged and trained to catch up to complex expressive uses of language that defy and expand upon standard usage. Black vernaculars, among many others, are a good example here.

It is unacceptable to allow the accented, the impaired, and speakers of the host of other languages that rely on subtle tonal shifts or lyricism to go misunderstood or unanswered in the sphere of algorithmic understanding. Undervaluing and dismissing the communicative forms of communities pushed to the edges of society severely limits the knowledge, cultural richness, and possible solutions available to everyone.

Our technological systems, including those that communicate verbally, must be taught to understand, value, support, and safeguard the breadth of human and nonhuman expression. This is no easy task, for many reasons. As Johann said, “machine listening technologies are predicated on data being discrete, fixed, and unmoving, whereas black speech is fluid and dynamic and elastic and morphing all the time—autopoetic. And so, it is almost antithetical to machine learning as a practice, as a discipline.” This statement can be extrapolated to the field of AI writ large.

Take, for example, Not the Only One (NTOO) (2018), the multigenerational chatbot I conceived of as a family memoir and archive. As wonky and uncommunicative as NTOO can be, its value lies in its ability to lay bare NLP’s inability to support the modes of expression, ethos, and values of particular communities.

NTOO points toward the potential for success using small, community-derived data. It also attests to the value of including in this data the practices of signifyin’, speaking in culturally rooted double or triple entendres, as well as communications of refusal: the ability to read and use the biases someone holds against you to your advantage, a tactic that is admittedly much harder to accomplish with a machine than with another human being. NTOO, however, seeks to utilize these practices as testimony to the fact that such tactics have kept black families like mine buoyant, choice-rich, and capable of making ways out of no way despite the challenging socio-political juggernaut that is America.

Artist James Alister Sprang explores this in describing his project Turning Towards A Radical Listening (2019), which includes conversations with poets who experiment with language as a technology. He said, “we talked about ways in which language isn’t necessarily coded to and equipped to represent the black experience.” He concluded his Dark Matters remarks with a reprise of his modus operandi: “listening transforms what it is we see. I try to make spaces for people to listen. And look. And become.”

Imagine all that would become possible for the bulk of us if the NLP systems propagated by big tech were formed and informed by the careful, deep listening James and his collaborators practice and advocate for. As Nigerian linguist and cultural activist Kọ́lá Túbọ̀sún said in a slide presentation exploring some of the reasons and excuses often used to explain why African languages are not more available via NLP systems, “technology is a blank canvas that should and can be made to reflect different cultures where people use it.” At the end of his talk he plainly stated, “we need local work. Big tech needs to do something. We also need to do [our part].” I concur.

If we wait on those who lack the capacity to get it right for the communities we love and want to see thrive, we may be waiting for an eternity. If we begin to build or bend technologies to our needs and age-old understandings of the way things work—even in small ways—we may finally get the care and support all communities need to thrive in partnership with the algorithmic technologies underpinning, informing, and rapidly reconfiguring the world.


As we create, increasingly rely on,

and live with advanced algorithmic technologies,

those technologies shape us in turn.

What are we going to do about that?

(Not a fair question, but what is the alternative?) ♦

Dark Matters: Bias in Speech AI