Panelists invited by Douglas Honors College speak about the flaws of artificial intelligence

Joshua Smith and Addie Adkins

What is artificial intelligence? This was the first question posed to the panelists at “Artificial Intelligence: The Future is Now.” On this point, the panelists seemed to agree: none could offer a “good definition.” Panelists Stephanie Dick, Halcyon Lawrence and Luke Stark spent several minutes explaining why they couldn’t provide one.

All three panelists are assistant professors. Dick is an assistant professor of History and Sociology at the University of Pennsylvania, Lawrence is an assistant professor of Technical Communication at Towson University and Stark is an assistant professor in the Faculty of Information and Media Studies at the University of Western Ontario.

The event, held Thursday, May 13, addressed the ethical dilemmas the tech industry faces when designing software, programs and gadgets for public consumption.

Dick, the first to present, briefly discussed the effect artificial intelligence has had on the incarceration system. She examined systems that conflate mentally unwell individuals with individuals considered likely to commit crimes. She described how people are broken down into data points that law enforcement personnel can then reference and use to flag potential criminal activity.

Dick went on to describe how these systems were presented as almost infallible because of their supposedly unbiased design and their separation from existing law enforcement structures, using the New York State Identification and Intelligence System (NYSIIS) as her primary example.

“[The NYSIIS system] would be concerned with the information that all the different branches of New York law enforcement were responsible for gathering, and they would centralize it, standardize it and facilitate and oversee communications between the different branches of law enforcement,” Dick said.

However, Dick explained that this is a false assumption. She said that because people are converted into data points, the responsibility for determining whether someone will commit a crime is handed off to artificial intelligence in the form of an “if, then” statement.

Lawrence, who specializes in speech, read from her essay “Siri Disciplines,” which was published in the book “Your Computer Is on Fire.”

In her reading, she described how voice technology limits users by forcing their speech to conform to a standard, unaccented English. Lawrence gave the example of Siri, which responds to some speakers with confusion and frustration. For Lawrence, who speaks Trinidadian English, the result is starker: “there is silence when I speak to Siri,” she said.

She went on to note that the tech industry describes its voice technology as revolutionary, but Lawrence countered, “What is the experience of accented speakers for whom speech might be the primary or singular mode of communication? This so-called revolution has left them behind.”

Lawrence also spoke of a March 2020 study by Stanford researchers on speech recognition errors. She said the researchers compared how devices such as Siri, Alexa and Google Home handled the speech of African American and white speakers, and found the devices made two to three times as many recognition errors for African American speakers.

Stark spoke about emotion and the ethical dilemma of using technology to understand and comprehend it, namely through contemporary emotion recognition systems. He said these systems face two problems in detecting emotions: “the problem of parts,” the many facets that make up an emotion, and “the problem of plenty,” how those facets come together to create the emotion.

According to Stark, these systems and technologies attempt to do something impossible; he cited the example of profiling faces and expressions in order to identify criminals by their appearance. Stark said the main question that needs to be discussed is whether it is ever ethically appropriate to develop and use systems that detect emotions and expressions.

Stark ended his presentation with, “You know the problem of parts is basically insurmountable here, right? There’s no way these systems are ever going to do more than capture a slice of our emotional expression, and not even do that very well.”

Those interested in watching the event can do so by reaching out to Tamara Caulkins, historian of science at the Douglas Honors College at CWU, or by visiting http://www.kaltura.com/tiny/mcky1.