Building for Intersectionality in Tech: Intersectionality and Design

By Nicole Thayer & Tiffany Wong

Note: This is the second post in this blog series on intersectionality in tech. Check out the first, third, and fourth posts.

Welcome back to our ongoing series about how employees at Carbon Five are approaching conversations about intersectionality in tech. For more information on why we’re doing this and what we’re hoping to accomplish with the series, read the introductory post.

Today we’ll be talking about systemic bias in design. Product designers are often asked to create work for a target audience with needs that are different from their own. When we build software based only on our own experiences and needs, we miss huge problems and risk further marginalizing people who are already not well served by tech. How can designers create more inclusive products?

In interviews with our peers at Carbon Five, topics that came up included designing for neurodiversity and combating confirmation bias in our design work. For this post, we talked to Treyce Meredith, a former Carbon Five employee who’s currently working at OpenTable, and Suzanna Smith, Director of Design at Carbon Five.

Designing for Neurodiversity

“Anything your brain does that doesn’t have to do with your body is neurodiverse,” said Meredith.

This can include things like reading, writing, seeing color, understanding emotional communication, and depth perception. Some common diagnoses include ADHD, dyslexia, and autism.

The International Dyslexia Association estimates that approximately 6-7% of students nationwide have a learning disability, and as many as 15-20% of the population has some symptoms of dyslexia.

Though accessibility has become a buzzword for designers in recent years, Meredith said tech isn’t doing a great job at meeting the needs of neurodiverse or disabled people.

“I wanted to design for and work with people with learning disabilities because it was something I was involved in and interested in, but I didn’t see it represented in the world,” said Meredith, who has dysgraphia and dyslexia, which affect how he processes words.

Neurodiversity isn’t accommodated

Creating products that are more accessible starts with accommodating and welcoming neurodiverse employees.

“I think tech has a ton of people [like this], but since the world isn’t super good at talking about it, people don’t have the support they need,” Meredith explained.

He recalled experiences in which he was denied accommodations in his academic career and in applying for jobs.

“There seems to be a gap in inclusivity in recruiting,” he said. “Some don’t know that learning disabilities have the same protections as more visible forms of disability.”

On a product level, Meredith has personal experience with products that haven’t prioritized accessibility, particularly products that rely on correct spelling.

“On a search page where there are no results, there’s a bulleted list of suggestions that says ‘check your spelling’ and shows you an ad,” he recalled. “So that’s pretty terrible.”

Accessibility isn’t prioritized

How do we end up with products that create such bad experiences for neurodiverse people? Smith said the problem starts when we don’t spend enough time with our audience.

“Because we are consultants, we often have time constraints placed on us that are out of our control,” she said. “It’s sometimes hard to do the kind of authentic user research that would allow us to truly reach and connect with user groups that we aren’t immediately familiar with.”

Both designers mentioned that prioritizing features that accommodate a wider range of users was difficult in an agile, “just-in-time design” culture.

“I advocate for design features and try to adjust problems in a way that is more inclusive. That often leads to features having a bigger scope,” said Meredith.

He told us a story about advocating for a more typo-tolerant search field in a product with the hope of reducing errors for people with dyslexia.

“I got a lot of pushback from engineers who thought it increased the scope too much and the product didn’t need that level of definition,” Meredith recalled. “If the world just works perfectly for you, it’s harder to see — it makes me address my biases a lot in those moments.”
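To make the idea concrete, typo tolerance in a search field can be approximated with simple edit-distance matching. The sketch below is a minimal illustration, not the implementation Meredith advocated for; the function names and the "typo budget" scaling are assumptions made for this example, and a real product would more likely lean on a search engine's built-in fuzziness.

```typescript
// Classic Levenshtein edit distance between two strings.
function editDistance(a: string, b: string): number {
  const dp: number[][] = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
      dp[i][j] = Math.min(dp[i - 1][j] + 1, dp[i][j - 1] + 1, dp[i - 1][j - 1] + cost);
    }
  }
  return dp[a.length][b.length];
}

// Return items whose name is within a small edit distance of the query,
// scaled by query length so longer words tolerate more typos.
function tolerantSearch(query: string, items: string[]): string[] {
  const q = query.trim().toLowerCase();
  const maxTypos = Math.max(1, Math.floor(q.length / 4));
  return items.filter((item) => {
    const name = item.toLowerCase();
    return name.includes(q) || editDistance(q, name) <= maxTypos;
  });
}

// Example: "restarant" still finds "Restaurant" despite the misspelling.
console.log(tolerantSearch("restarant", ["Restaurant", "Cafe", "Bar"]));
```

Even a small tolerance like this changes the empty-results experience described above from “check your spelling” into something that actually helps the person searching.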

So what do we do?

“To truly avoid the risk of making wrong assumptions or falling into a trap of hubris around thinking we know what is best, we have to put ourselves in the humble position of being complete novices before a certain user group,” said Smith.

Both designers suggested being willing to have difficult conversations, advocating for candid user interviews, and running prototype tests where participants can be vulnerable and talk honestly about difficulties in using the product.

“Discussing accessibility requirements as a priority early on with product owners and stakeholders is key to ensure that time is protected for evaluating potential solutions,” said Smith.

How do you incorporate designing for neurodiversity into your own practice? An attitude of humility is important, as well as a willingness to advocate for features that prioritize accessibility over speed of development. Consider using a Figma plugin like Able to check for visual accessibility, include users with learning disabilities in your user tests, and make sure their feedback makes it into the final product.
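On the visual side, contrast is one check that is easy to automate. The sketch below applies the WCAG 2.1 relative luminance and contrast ratio formulas; it illustrates the kind of test an accessibility plugin runs for you, not Able's actual code, and the example colors are arbitrary.

```typescript
// Convert an sRGB channel (0-255) to its linearized value per WCAG 2.1.
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an RGB color.
function luminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio: (L1 + 0.05) / (L2 + 0.05), with the lighter color first.
function contrastRatio(fg: [number, number, number], bg: [number, number, number]): number {
  const [lighter, darker] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// WCAG AA asks for at least 4.5:1 for normal-size text.
const ratio = contrastRatio([119, 119, 119], [255, 255, 255]); // mid-grey text on white
console.log(ratio.toFixed(2), ratio >= 4.5 ? "passes AA" : "fails AA");
```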

Fighting Confirmation Bias in Design

Accessibility is only one of many areas in which product design fails to include people who are already at a disadvantage. It’s a gap that can be magnified for folks who also belong to other groups that face discrimination.

How do comfortable, connected tech workers design for people living in poverty? How about people living in rural areas or with bad reception? What about designing for people with ethnicities and genders different from our own?

One of the ways tech passively excludes marginalized people is by not having employees on staff who can even identify these intersections as underserved. Without a diverse range of lived experiences, we can fail huge swaths of people and never even notice.

Confirmation bias, in this context, is the tendency for a person or a team to interpret new evidence as confirmation of their existing beliefs. The first step to combating confirmation bias is to understand the limitations of our own lived experience.

“Probably the biggest risk is making incorrect and uninformed assumptions about what other audiences need,” said Smith. “Those assumptions might be well-intentioned, and probably happen often when we try to put our previous experience into practice, which is something we are prone to do as consultants when we’re under pressure to move quickly and make assertive recommendations.”

Meredith mentioned fonts designed to increase legibility for dyslexic people as an example.

“Dyslexia has nothing to do with the shape of a letter — it has to do with changing a symbol to a sound,” he said. “These people are well intentioned and tried to help, but in actuality because they didn’t understand how dyslexia works, all that resulted was they made an ugly font.”

Checking privilege as part of the process

“I know what helps me, but not what other groups tend to struggle with,” said Meredith.

Pausing to examine what you assume to be true is a great way to get implicit bias out in the open and to create both an opportunity and an expectation that the bias will be challenged.

“One thing I’ve done in the past is try to articulate and externalize a team’s assumptions about what they think they’ll see or hear in a user test or interview, before those activities take place,” said Smith. “This creates something a team can reference to check whether any ‘brilliant insights!’ are just conveniently supporting things we thought we’d see. It gives us a tool to make the concept of confirmation bias more concrete.”

Have the conversation

A sincere reckoning with systemic bias in tech may require some adjustments. After all, relying on existing assumptions about who our users are and what they need feels efficient and can be done quickly, whereas a deeper understanding takes more time. Meeting a broader range of needs may result in pushing back milestones or having difficult conversations with the team.

Meredith mentioned a past project that entailed creating a booking system for a two-sided marketplace product (think ride sharing). As a side effect, the system put women and nonbinary people in situations where they felt unsafe.

“We learned that women using the platform felt more comfortable using the service when their host was a woman or had good reviews from a woman,” Meredith said. “One thing that we proposed was to call out a set of statistics — how many women had reviewed [a user] — and adding a set of filters that helped you find the information that you wanted.”
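As a rough illustration of what such a feature might look like in data terms, here is a hypothetical TypeScript sketch. The Host and Review shapes, the 4-star threshold, and the function names are invented for this example; they are not the team's actual data model or implementation.

```typescript
interface Review {
  reviewerGender: "woman" | "man" | "nonbinary" | "undisclosed";
  rating: number; // 1-5
}

interface Host {
  name: string;
  reviews: Review[];
}

// Count positive reviews (4 stars or more) left by women for a given host.
function positiveReviewsFromWomen(host: Host): number {
  return host.reviews.filter((r) => r.reviewerGender === "woman" && r.rating >= 4).length;
}

// Keep only hosts with at least `minCount` positive reviews from women.
function filterByWomenReviews(hosts: Host[], minCount: number): Host[] {
  return hosts.filter((h) => positiveReviewsFromWomen(h) >= minCount);
}

const hosts: Host[] = [
  { name: "A", reviews: [{ reviewerGender: "woman", rating: 5 }, { reviewerGender: "man", rating: 4 }] },
  { name: "B", reviews: [{ reviewerGender: "man", rating: 5 }] },
];

console.log(filterByWomenReviews(hosts, 1).map((h) => h.name)); // ["A"]
```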

In order to reduce the risk of violence against marginalized communities, the group tested extensively with people across a broad range of gender identities and gender presentations.

“We also learned that LGBT people felt more comfortable if a man had good reviews from women,” he said.

However, the team found that not all groups responded positively to this feature.

“The majority of men who tested the prototype were against the concept,” he said. “The team tested the feature and found that they could potentially lose a lot of customers, but the PM advocated for building it anyway.”

Representing the voice of marginalized groups during the design process isn’t always easy.

“A lot of companies would not have wanted to do that. There was no quantifiable improvement from this feature,” said Meredith. “I thought it was really cool.”

Co-Ownership and Co-Creation

The most effective ways to combat bias in tech are simple to recommend and difficult to implement: build more diverse teams, involve marginalized users in the act of creation, and create conditions to receive honest feedback. That’s why prototyping and user interviews are key to the process of challenging assumptions and building for a wider range of users.

“Involving [users] in multiple rounds of feedback, on designs that evolve to reflect their input at every step, goes a long way towards building trust,” said Smith.

Ultimately, we have a lot of power in the design process — power we can use to better represent people with less privilege than us. Smith recommends “humility, time and patience, open-ended questions, lots of listening and paying extra attention to empathy.”

Keep listening, and good luck!

Additional resources

Equity-Centered Community Design (ECCD)

Co-design: A Powerful Force for Creativity and Collaboration

What’s next in the series?

Up next, we’ll learn about how software engineers at Carbon Five consider intersectionality in their work.

Click here for more information on Carbon Five’s internal efforts around diversity, equity, inclusion, and belonging (DEIB). If you’re in the tech industry and interested in exploring DEIB issues, we’d love to work with you! We’re always taking on new projects and hiring folks who are interested in making the tech industry a better place to work.
