Adherence to the declaration would prohibit researchers from working on robots that conduct search-and-rescue operations, or in the new field of "social robotics." One of Dr. Bethel's research projects is developing technology that would use small humanoid robots to interview children who have been abused, sexually assaulted, trafficked, or otherwise traumatized. In one recent study, 250 children and adolescents interviewed about bullying were often willing to confide information to a robot that they would not disclose to an adult.
Having an investigator "drive" a robot from another room can lead to less painful, more informative interviews with child survivors, said Dr. Bethel, who is trained in forensic interviewing.
Dr. Crawford is among the signatories of both the No Justice, No Robots open letter and the Black in Computing letter. "And you know, anytime something like this comes out or is realized, especially in the community where I operate, I try to make sure I support it," he said.
Dr. Jenkins declined to sign the No Justice, No Robots declaration. "I thought it was worth thinking about," he said. "But in the end, I thought the bigger problem was actually representation in the room: in the research lab, in the classroom, on the development team, on the executive board." Ethics discussions must be rooted in that first fundamental civil-rights issue, he said.
Dr. Howard did not sign either statement. She reiterated that biased algorithms are partly the result of the skewed demographic group (white, male, able-bodied) that designs and tests the software.
"If outsiders who have ethical values do not work with these law enforcement agencies, then who will?" she said. When you say no, others will say yes. It's not good if there's no one in the room to say, "Um, I don't believe a robot should kill."