Broadening the Circle

Illustrations by Luisa Jung
by John Tibbetts
Design justice calls on tech creators to promote fairness and broaden the social benefits of their inventions
When a designer creates a new technology, who might use it — or misuse it? Who could be left out, unable to use it? Whose life could it improve? And whose life could it make worse?

Professor Evan Peck, computer science, wants his students to ask those questions every time they write a new piece of software code, so it becomes a professional practice. Technology designers can unwittingly embed unfairness or other bias into their increasingly powerful and pervasive inventions. Peck uses an exercise to reveal how bias can sneak past students’ best intentions.

“In my Introduction to Computer Science class, students create a form for a website, which requires the user to provide a phone number,” says Peck. “The form must reject any ‘bad’ phone numbers. Students almost always write a program that accepts phone numbers from the United States but not from almost any other country. One of the first semesters I did this, a student from China spoke up: ‘This form wouldn’t work for me.’ ” Her phone number in China would have been rejected because it was considered bad input.

“The goal is to help students think about who may be empowered by their designs and who might be unable to participate,” Peck adds. “You need different perspectives to help you identify and overcome biases. It took having someone in the class who was not from the United States to prevent students from making a mistake.”
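The pitfall in Peck’s exercise is easy to sketch in code. The snippet below is an invented illustration, not the actual class assignment: a naive validator that accepts only 10-digit U.S.-style numbers, next to a more inclusive check loosely based on the international E.164 numbering standard (an optional “+” followed by 7 to 15 digits). The phone numbers are made up for the example.

```python
import re

def validate_us_only(phone: str) -> bool:
    """A typical first attempt: accepts only 10-digit U.S.-style numbers."""
    return re.fullmatch(r"\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}", phone) is not None

def validate_international(phone: str) -> bool:
    """A more inclusive check, loosely based on the E.164 standard:
    strip common separators, then allow an optional '+' and 7-15 digits."""
    digits = re.sub(r"[-. ()]", "", phone)
    return re.fullmatch(r"\+?\d{7,15}", digits) is not None

# A U.S.-style number passes both checks...
print(validate_us_only("570-577-2000"))       # True
print(validate_international("570-577-2000")) # True

# ...but a Chinese mobile number is rejected by the naive validator
# as "bad input," exactly the failure Peck's student pointed out.
print(validate_us_only("+86 138 0013 8000"))       # False
print(validate_international("+86 138 0013 8000")) # True
```

The bug is not in the regular expression; it is in the unexamined assumption about who the user is.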

Peck belongs to a growing vanguard of professors at Bucknell and other universities who embrace a philosophy known as “design justice.” Design justice calls on technology builders to consider bias and other social-justice implications in every stage of their creations. What are the possible trade-offs of a new tool? Could an innovation, for instance, provide some social or economic benefits but also cause discrimination or exclusionary outcomes? Does a new technology primarily advantage the privileged while limiting opportunities for the disabled, minorities, unemployed workers, immigrants or other populations?

At Bucknell, professors are embedding design justice in multiple courses and programs, fieldwork, research projects and real-world scenarios. Their longer-term goal is to broaden design education and students’ thinking to include more knowledge of the world and social and economic power structures.

Calls for design justice have become more urgent because of mounting evidence that biased computer systems are not uncommon. Take the case of Amazon’s controversial tool Rekognition, which the company began marketing to police departments in 2016. Police would upload an image of a potential suspect, and the software would scan for matches among jail mug shots, driver’s license photos or other identification images. Systems like these have helped law enforcement catch dangerous criminals. But Rekognition misidentified people of color at a much higher rate than people with lighter skin tones, allegedly leading to wrongful arrests.

In June 2020, during nationwide protests over police violence and racial discrimination, Amazon implemented a one-year moratorium on law-enforcement use of Rekognition. In May 2021, the company indefinitely extended that moratorium. Other facial-recognition tools used by police have encountered similar issues.

“We are realizing that damage from these programs can be far more insidious than we thought,” says Peck. “Most computer science and engineering students have come to believe that the products they create are completely neutral in their effects. But they are not neutral. For me, design justice is about getting students to consider how the systems we build can intersect with other communities and cultures, and how those systems either support or push against societal structures that have historically shaped those communities.”

Designers are not intentionally making biased systems, says Peck. “But as soon as you create something and send it into the real world, it can get repurposed in ways that the builders never imagined, or you’ll discover it has faults that the builders never imagined were there.”

The tech industry needs more diverse teams to reduce the chance that discriminatory algorithms and other errors get repeated.

“Teams with a wider range of lived experiences and perspectives have a lower likelihood of having an unconscious bias finding its way through to the final design,” says Patrick Mather, who served as the Richard E. Garman Dean of Engineering from 2016 until this August. “If you have a broader team, one that listens to one another and functions well, with more people at the table, you are likely to have better innovation and more robust design.”

Local Capabilities, Local Solutions
Some design-justice principles sound simple, but they can be humbling to carry out in practice.

“As engineers, our assumptions — our leanings — are always to design machines,” says Professor Charles Kim, mechanical engineering. “But a new machine might not be needed, and a better option instead could be finding a way to use what’s already available to solve a problem.”

In some Central American countries, millions of people who need eyeglasses can’t afford them. In 2010, Kim and his students began working on technology to lower eye-care price tags there.

“Many optician labs use computerized equipment to shape and fit a blank lens with a correct prescription into a frame,” says Kim. “We spent a long time trying to design a lens shaper to create a supereconomical product for people who need eyeglasses. But we eventually learned that we could not create nearly as much value with new technology as we could with existing lens shapers if they were used and maintained reliably in that local environment.”

Kim and his students, collaborating with colleagues and students in the Freeman College of Management, matched a lens manufacturer’s foundation with an existing nonprofit organization to improve access to eye care in the region. “The best thing we did was connect the foundation with the NGO [nongovernmental organization] and allow them to take it from there,” Kim says.

The project underscored the importance of designers broadening the circle to include users and other participants who bring different kinds of expertise to the table when devising solutions to problems.

“Design justice is about collaborating and learning together with stakeholders,” says Kim. “We should get to know the stakeholders and understand the system in which they operate. We don’t want to create an artificial solution for a group that does not need it and would not benefit from it.”

Design justice calls on practitioners to expand opportunities for those historically left out. Professor Amal Kabalan, electrical & computer engineering, is inspired by efforts around the world to offer low-cost solutions that allow girls and women to advance their education and skill sets. “Over the past 10 years, we’ve seen impacts from many engineering projects aimed at improving lives and safety in low-income communities simply by providing access to electric light and power,” says Kabalan.

Kabalan and her students have explored various prototypes of inexpensive, energy-storing book bags called SolarBrite Backpacks. Each bag contains a small solar panel and a detachable battery pack in its back pocket. The solar panel charges the battery, which can be plugged into an LED flashlight. “Many girls can’t study at night because it can be unsafe to go out to study under streetlights,” she says. “They can fall behind in their studies and drop out of school.”

In 2019, Kabalan and Shehryar Asif ’21 brought solar backpacks to a makeshift school for Syrian refugees in Lebanon. They offered 20 female students a two-day workshop on technology and solar energy, giving them each a solar backpack to keep.

“In this project, we explored the importance of decentralizing power sources,” Kabalan says. “We wanted to demonstrate to students that a single solar module can power vital equipment in a household. We also hoped to inspire young girls about engineering and its potential applications that impact our daily lives.”

“If accessibility is part of your development process from the start and you use the right tools, it doesn’t necessarily take a lot of money to do the job well.”
Professor Anne Spencer Ross
While Kabalan works to reduce the most recent prototype’s manufacturing costs, she is simultaneously widening her students’ perspectives. “I strive to equip our students with strong technical skills to establish their careers,” says Kabalan. “But I’d also like them to see interconnections between technology and society. Our students are empowered to put their skills toward goals of social justice, allowing people of different cultures, races, genders, economic status and sexual orientation to have equal access to technology, opportunity and education.”

Tech designers increasingly live and work in tony communities among people who share their privileges. Tech design has become the province of younger workers — well-educated, affluent and able-bodied — whose colleagues and neighbors are also active, tech-savvy people. The needs of people with disabilities can be forgotten or overlooked.

Many mobile apps are unusable for people with vision or fine-motor impairments, according to research by Professor Anne Spencer Ross, computer science. Accessibility issues also affect people with learning, cognitive and situational impairments.

“Developers and designers might not have much personal exposure to accessibility issues, so they might not think about it at the beginning of creating technology,” says Ross. “Individual developers who do try to address accessibility often have managers who aren’t supportive or undermine their efforts. If companies wait too long and then try to address accessibility later in the design process, then it can be perceived as a burden, an added overhead, expensive and time-consuming to get right. But if accessibility is part of your development process from the start and you use the right tools, it doesn’t necessarily take a lot of money to do the job well.” That’s why innovators could benefit from “co-design,” bringing product users into the creative process early and acknowledging their expertise.
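One way to build accessibility in from the start, as Ross suggests, is to make simple automated checks part of the everyday development process. The sketch below is an invented illustration of the idea, not a tool from Ross’s research: a small scanner, built on Python’s standard HTML parser, that flags images shipped without the alt text a screen reader depends on. Real projects would reach for dedicated accessibility linters, but even a check this small can run in a test suite from day one.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flags <img> tags that lack alt text -- one small accessibility
    check that can run automatically from the first day of a project."""

    def __init__(self):
        super().__init__()
        self.missing = []  # src attributes of images with no alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "<unknown>"))

checker = AltTextChecker()
checker.feed(
    '<img src="chart.png">'                      # missing alt text: flagged
    '<img src="logo.png" alt="Bucknell logo">'   # has alt text: passes
)
print(checker.missing)  # ['chart.png']
```

Catching the missing alt attribute at commit time costs almost nothing; retrofitting accessibility after release is the expensive “added overhead” Ross describes.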

Legislation with incentives for accessibility could motivate tech managers and leaders who may be laggards in this area, driving changes in company culture. “Developers and designers at the grassroots level need more incentives to support their efforts to improve accessibility,” Ross says. “And, similarly, let’s incentivize instructors and students to address accessibility features. At Bucknell, I hope to work with my colleagues to incorporate accessibility into our teaching and curriculum, trying techniques like grading and teaching accessibility as a core standard. It should be as important as teaching about security or good visual design.”

Technologies as Amplifiers
In addition to being, in some ways, a homogeneous group, technology designers tend to view their creations in glowingly hopeful terms while remaining unaware of the potential harm caused by racial or other biases in their work.

“Designers want to believe that their technologies will fix something,” says Professor Chris Dancy, computer science. “Their goal or intention for technologies may be optimistic, but the negative outcomes on certain groups may not be new at all. If you understand American history, you might recognize that a new surveillance tool like facial recognition in the hands of police could be a bad idea, especially for certain groups. If you had included the voices of marginalized people in developing that kind of program, they might have warned about some of these problems.”

Facial-recognition systems are just one example of the AI predictive algorithms that permeate modern life. Predictive algorithms are intrinsic to two products that many millions of people use every day: Google Search and Facebook’s News Feed. They determine which ads we see on the internet. Their uses have many benefits, including tracking disease outbreaks and identifying cancers in medical imagery. Predictive algorithms will also be critical in developing self-driving cars and “smart cities.”

But predictive algorithms can reproduce — and even enhance — racial or other biases inherent in the databases used to train them. In a notorious 2015 case, the Google Photos facial-recognition app inadvertently labeled images of two Black people as gorillas. A Harvard researcher studied Google and Reuters searches for Black-identifying first names — such as DeAndre and Kareem — along with white- or neutral-sounding names. The Black-identifying names were 25% more likely to bring up internet ads for websites that check a person’s criminal record.

“These programs can reflect or build on patterns and prejudices deeply embedded in history and society,” says Dancy.
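The dynamic Dancy describes can be reduced to a toy example. In the sketch below, every group name and number is invented for illustration: two groups behave identically, but the historical training data flags one of them more often. A naive model that learns from those records doesn’t just reflect the skew; by thresholding on the learned group rates, it amplifies a 30% historical disparity into flagging everyone in that group.

```python
from collections import Counter

# Invented training data: historical "flag" decisions, skewed against
# group B. Each record is (group, was_flagged). Underlying behavior is
# identical across groups; only the historical labeling differs.
training = [("A", True)] * 10 + [("A", False)] * 90 \
         + [("B", True)] * 30 + [("B", False)] * 70

def train_rates(records):
    """Learn each group's historical flag rate from the records."""
    flags, totals = Counter(), Counter()
    for group, flagged in records:
        totals[group] += 1
        flags[group] += flagged
    return {g: flags[g] / totals[g] for g in totals}

def predict(rates, group, threshold=0.2):
    """Flag anyone whose group's historical rate exceeds the threshold."""
    return rates[group] > threshold

rates = train_rates(training)
print(rates)                # {'A': 0.1, 'B': 0.3}
print(predict(rates, "A"))  # False: no one in group A is ever flagged
print(predict(rates, "B"))  # True: everyone in group B is flagged
```

Real predictive systems are far more sophisticated, but the mechanism is the same: a model trained on prejudiced history can turn a statistical pattern into a blanket policy.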

These are thorny issues to confront, and sometimes design students push back — hard — against discussing these principles in the classroom. In her Sustainable Design course for senior-level students, Professor Deborah Sills, civil & environmental engineering, introduces her students to social-justice problems associated with inequitable access to water and sanitation.

“We talk about places in the U.S. where people don’t have clean water or functioning sanitation systems, and those places are frequently in minority areas,” says Sills. “These issues should be addressed in engineering education, but traditionally they are not. Some of my students have asked, ‘Why didn’t we hear about any of this stuff before?’ But some other students say, ‘I don’t want to hear about it. It’s politics, not design. I came here for you to teach me design and calculations.’ But this is design.”

Sills uses these encounters as an opportunity to teach soft skills for engaging in difficult conversations, which are important not only for inclusive design but for engineers to succeed in their frequently collaborative, team-based fields. “I hope it will be useful in their careers to be able to talk with someone you disagree with and remain respectful,” she says.

“Convergent problems require many different disciplines working together, and one element of design justice is getting people from different backgrounds to look at a problem from all angles.”
Professor Alan Cheville
Broader Thinking
Professor Alan Cheville, electrical engineering, asserts that engineers need to broaden their training and engage with ideas that cross traditional academic disciplines.

“The defining sin of engineers is hubris,” says Cheville. “Engineers can come out of school with a lot of arrogance, and research shows that throughout their undergraduate degree program they can become less interested in the bigger issues as they become even more narrowly technically focused.”

The National Science Foundation recently awarded nearly $2 million from its Revolutionizing Engineering Departments program to Bucknell’s Department of Electrical & Computer Engineering.

“With this grant, we aim to introduce convergent problems, the big problems of the world, across the curriculum,” says Cheville, the grant’s principal investigator. “Convergent problems require many different disciplines working together, and one element of design justice is getting people from different backgrounds to look at a problem from all angles. No single discipline is going to address a convergent problem. So, we’re interested in changing grading structures to identify and develop characteristics of engineers who can tackle convergent problems in collaboration with others. We’re having discussions to better understand the mindsets of students who can do that successfully.”

What sorts of characteristics are needed? “Students must be able to communicate the technical knowledge of their discipline,” says Cheville. “But they also must be interested in things outside of their disciplines and able to talk to others. And they must recognize the boundaries and limits of their knowledge. We are asking how we can develop those mindsets in an engineering program to address issues such as design justice. How can we enable student agency to explore challenging problems in the world? How can they expand their horizons, allowing them to see alternative futures for their careers?”

Questioning Assumptions
According to Evan Peck, Bucknell technology and engineering students should develop a healthy skepticism and question their designs.

“Students should ask themselves: What are the possible unseen implications of the things they’re building?” says Peck. “We can’t give them answers. In many cases, there may not be good answers. But it’s important that they realize how design decisions might have trade-offs and unintended consequences.”

Students should consider how bias in design can spread human error further and deeper than designers ever imagined. “Technology can be a powerful amplifier,” says Peck. “When we built social networks, we amplified our ability to share our lives with people by making it easier to access each other through pictures, videos and messages. But we also amplified our ability to bully, harass and stalk each other — a burden that is not felt equally across genders. So, we need to carefully consider the trade-offs of technology and who is impacted.”

Training in design and calculations alone may not be enough to address issues of justice. “When our programs and our code intersect with other cultures or large groups of people, we need to step outside of the computer science and engineering classrooms and understand more of the world,” says Peck. “This is a benefit of Bucknell, where students take classes in the social sciences and the arts and the humanities, where they learn and understand more about how different people are advantaged or disadvantaged. If we can encourage students to pause and think about that, it’s a start.”