S1:E5
Todd Libby: [00:00:00] Welcome to the Front End Nerdier—Nerdery Podcast, a podcast about front end development and design. I'll be talking with people in tech about a number of different topics focused on front end development, design, and other issues as well. I'm your host, Todd Libby. Today's show is another break from the front end discussion that I usually have.
And it's something that I talked about briefly in the last episode with my guest, and that is bias. Well, I'm here today with someone who has written a book about bias. And so I have with me, author, speaker, and break dance contest winner, my guest today, David Dylan Thomas. David, how are you today?
David Dylan Thomas: [00:00:46] Good. How are you?
Todd Libby: [00:00:47] I'm doing well. Thank you. So why don't you tell the listeners a little bit about yourself?
David Dylan Thomas: [00:00:53] Well, first and foremost I won that contest when I was 22, because I knew one break-dancing move and the rest of the contestants knew zero breakdancing moves. So, I'm not an expert, but I do have that one move.
So now that that's out of the way. So yeah, I am an author, a workshop giver, a talk giver. I have just recently incorporated and become David Dylan Thomas, LLC. So, my full-time gig now is giving talks and doing workshops that basically get people excited about, and give people tools for, inclusive design.
A lot of that is based on the work in my book, which is—called Design For Cognitive Bias and as the title implies, it's trying to intersect those two worlds of UX and front end design and what I know about that from my day job doing UX and content strategy for 20 years, and then sort of my aggressive hobby of studying cognitive bias to the point of doing a hundred episodes of a podcast about it over the past four or five years. But yeah, that’s me.
Todd Libby: [00:01:51] Awesome. Awesome. Yeah, and I'll hold up the book for the people that watch it on YouTube. There it is right there.
[Holds up the Design for Cognitive Bias book]
David Dylan Thomas: [00:02:00] I hope I'm pointing in the right direction.
Todd Libby: [00:02:02] Yeah.
David Dylan Thomas: [00:02:02] I have no idea how it will show up on YouTube.
Todd Libby: [00:02:07] Yeah, I can't recommend the book enough. I was actually surprised at how many biases were in the book and I'll get to that in a second. But first I like to ask my guests how did you get started in your web development/design journey?
David Dylan Thomas: [00:02:14] Sure. So, I think my first touch of the web was in 2000. I was coming out of college with a writing seminars degree and really my background is in filmmaking. So—I have the, sort of the content side of content strategy down from that.
And you know, writing stories, doing movies, and for about four years I kind of bounced around, like working in a record store, trying to make that writing degree work. Eventually though, in 2000, I found a career in online education. It was sort of the dawn of online education and we were getting a lot of courses.
There was the Center for Talented Youth, a program out of Johns Hopkins University that offered college-level writing courses to kids who tested really well for writing coming out of, like, junior high and high school. So that was delivered on a CD-ROM, and then they would, you know, turn in their assignments to, really, an online forum.
And that format has weirdly not changed much at all in 20, 30 years. But yeah, that was basically it: they would workshop their work in the forum. And so that was really my first taste of the web and how, you know, the web could allow people from all over the world who would otherwise never have met each other and may never meet each other to interact and learn from each other.
So, I was like, okay, more of that, please. So, that really got me interested in how, you know, content strategy and UX design can kind of affect those relationships. And then over the next, you know, 20 or so years, I had different jobs that didn't have the title of content strategy or UX, but were kind of that anyway.
So, I was like the online editor in chief of, like, four magazines that were print but had websites. So, there was a lot of content strategy involved there. There was a lot of—I was in a nonprofit working as, I think, director of, like, not communications, but some highfalutin title that had to do with content.
But all of that exposed me to these different ways that content and UX could affect how people receive information and communicate with each other. So, all of that eventually led to me having a job where my business card actually read content strategist at a company called EPAM. So that put me on the agency side.
So, I did a lot of that, and that got me a lot of consulting experience. And really that's when I got deeper into UX as well as content strategy. And then from there I moved on to Think Company, and from Think Company I've moved on to a solo practice.
Todd Libby: [00:04:26] Nice. Nice. Yeah, content strategy is something that I've been interested in more and more as I read books like yours and, you know, hear people talking about it at conferences.
So, it, regarding the book, you know, as I said, it's terrific. I really enjoyed it. Can you tell the folks about the book and what drove you to write the book?
David Dylan Thomas: [00:04:46] Sure. So, the book is really about how, you know, as designers and developers, our job is in large part to help people make decisions, right?
That's what our products and services do in large part. And the more we understand how people actually make decisions, the better we'll be at our jobs. And bias, which is to say people making decisions so quickly they don't even realize they've made them, is a large part of that, something like 95% of how people make decisions.
So, there's a bit of understanding how our users are making decisions and how our design choices can influence those decisions. There's a bit of working with our stakeholders and understanding how their biases, right, are something we have to learn how to navigate, whether it's our bosses or our clients.
And then finally, in the last chapter, I really try to dig in on our biases as designers and developers and how those are kind of the most dangerous ones, because those are the ones we don't even realize we have. And it's easy for us to bake those biases into our products and services.
So, there's a lot of time spent there thinking about what processes we can institute in our organizations to sort of QA our choices, right? And be more inclusive in our approach. And it's kind of a Trojan horse for talking about design ethics at the end of the day.
That's really what the, what the book builds to. As far as how I got into that I once saw this talk by Iris Bohnet called "Gender Equality By Design".
And she was really getting into this idea that a lot of implicit racial and gender bias really comes back to pattern recognition. So, if you're hiring a web developer, you might—when I say those words, 'web developer', the image that might just pop into your head might be 'skinny white guy'.
And it's not because you explicitly believe that men are better at programming than women, but the pattern that you may have seen growing up in movies and television, or even some offices that you might've worked in, might establish that.
And so, if you see my name at the top of a resume that doesn't fit that pattern, all of a sudden you're giving that resume the side-eye, and we've seen this experimentally to be true. So, when I saw that something as terrible as racial or gender bias could sometimes come down to something as basic and, dare I say, human as pattern recognition,
I'm like, okay, I need to learn everything I can about cognitive bias. And that's basically what I did. I went on a mission to learn everything I could. Eventually that turned into the podcast, and eventually that podcast got mixed with what I was doing in my day job in UX and content strategy to say, "Oh, these two things intersect in a lot of places." And I started giving talks about that, and that then eventually led to the book.
Todd Libby: [00:07:09] Yeah. And the podcast—I listened to the podcast from beginning to end and it was great. I learned a lot. I didn't know there were so many different biases.
David Dylan Thomas: [00:07:18] Oh, and we're still discovering new ones. By the way, if you like that, I gotta recommend David McRaney's "You Are Not So Smart" podcast, 'cause he's, like, on the forefront finding even newer ones.
Todd Libby: [00:07:33] Alright. So, here's the question that I have. You know, I know biases, and I'm amazed at how many were in the book and everything, from what I've heard you talk about when you've spoken.
And I, again, know bias as something like, "Oh, I prefer the Celtics over the Lakers," or, you know, "This election was rigged because Fox News said so."
Can you tell listeners, you know, getting down to basics, what bias is and a couple of examples of what we see? I don't know if I can say prominently on the web or, you know, that are, you know, out there.
David Dylan Thomas: [00:08:14] Sure. So bias is really just a fancy word for a shortcut gone wrong. So, your mind is taking short cuts all the time, because we have to make something like a trillion decisions every day.
Even right now, I'm making assumptions about how fast to talk or what to do with my hands or where to look. And if I thought carefully about every single one of those decisions, I'd never get anything done.
So, it's actually a good thing that a lot of our lives is just on autopilot. Like I said, something like 95% of your decisions are happening below the threshold of conscious thought. It's like you're just going, right? Sometimes though the autopilot gets it wrong and we call those errors, cognitive biases.
So, like, a fairly innocent one I like to bring up is illusion of control, and it comes up if you have to roll a die. If you're playing a game and you have to roll a die, and you want a high number, you tend to roll the die really hard.
And if you need a lower number, you tend to roll it really gently. And if you think about it for two seconds, that doesn't make any sense, but in situations where we don't have control, we like to feel like we have control.
And so, we invite that feeling by how we roll the die. That's a fairly harmless one, but the ones, you know, in the book that I really try to focus on are the ones that could cause harm.
And so, you know, confirmation bias, like what you said about the rigged election, right? That comes from this place of, I have an idea in my head and I'm only going to look for evidence that confirms the idea.
And if I see evidence that doesn't confirm the idea, I'm like, "Oh, fake news," and move on. And I think, you know, the election is unfortunately a perfect example, right? You have people who are committed to that idea.
And any evidence to the contrary is really only going to convince them more that their side is right, and I can guarantee you, like 14, 15 years from now, you're still going to have lots and lots of people who believe that Trump won that election.
Todd Libby: [00:09:43] Yep. Yep. Now you mentioned, you know, harmful biases like confirmation bias. And in the book you mentioned the most dangerous one being the framing effect, but that also has a positive side, which—also, you know, maybe we can look at the non-harmful ones we see. What are some of those non-harmful ones? You just mentioned one of them.
David Dylan Thomas: [00:10:08] Sure. So, I think that it's not even necessarily, like, grouping into harmful and non-harmful so much as, once you know about the bias, how are you using it?
So for example, there is a bias around this thing called cognitive fluency, which has to do with how things appear to us, like the visual design of them, how easy they are to process and understand—so clear language, you know, stuff like that.
Even whether something rhymes can affect how believable that thing is. So, the easier it is to process, whether that means it rhymes or uses plain language or it's visually very clear, the more believable it is to us. So, if you know that, now you're responsible for what you make visually clear, what you make rhyme—like, it better be true.
'Cause I'm going to think it's more true. So, to me, it's almost more a matter of saying, "Okay, once you know what that human weakness is, what that human tendency is, is the way that you're designing pointing that in a good direction or a more destructive direction?"
Framing effect is a great example because the framing effect is basically, you know, an example would be, if I—go into a store and see a sign that says like 'beef 95% lean' and next to it there's a sign that says 'beef 5% fat'.
You know, I can guess which one people are going to line up for, but it's the same thing. Like, I'm saying the exact same thing, but I've framed one of those decisions in a way that makes it more appealing.
And, you know, how you use framing—like, the example I kind of go to is sort of saying, should we go to war in April or should we go to war in May? Right? Like, that's a framing that skips over the whole "should we really go to war" part. Right?
And that can be, you know, exceedingly dangerous, and it's used all the time in politics, it gets used all the time in design, to sort of frame things in a way where you'll make a decision that might be against your best interest.
So, at the same time though, I can frame things in a more positive way to, you know, go from something that causes conflict to something that causes collaboration.
So, one of my favorite examples of this is, if you can imagine, a photo of a senior citizen behind the wheel of a car, if I showed that photo to one audience and ask them, "Should that person drive that car?" I'm going to get a conflict, right?
I'm going to get a policy discussion where some people are going to be yelling about how old people are bad at everything, and don't let them drive.
And other people are going to be yelling, "That's ageist, let people do what they want." And all I'm going to know by the end of that conversation is who's on what side. If I show that exact same photo to another audience and ask, "How might this person drive this car?"
What I end up with is a design discussion, right? Now we're collaborating and saying, "Well, what if we changed the shape of the steering wheel? What if we moved the dashboard?" Right?
And I’ve only changed two words, but I've totally changed whether or not we're fighting or whether or not we're collaborating. And that's a positive use of the framing effect.
So, long story short, most of these biases aren't inherently evil. It's more, once you are aware that this is a shortcut the mind of your user is likely to take, how are you taking that into account when you make the thing?
Todd Libby: [00:12:49] Yeah. Yeah, and you know, when you mentioned resumes, that popped to mind—you know, throwing out resumes left and right. And I'm thinking about, well, you know, that's ageism, or, you know, especially with, like you said, the skinny white guy thing.
I think, you know, that's a prominent part of why tech is the way it is—you know, the side-eye that you mentioned when somebody hiring sees the name of this person as compared to that person, if they're not a skinny white guy or, you know, even an older white guy like me, for instance.
So yeah, the book talks about—and I've got to open it up here—one thing towards the end that really stuck with me: the part about leaving good design on the table, when you came back to confirmation bias. Can you go into what you meant by leaving good design on the table?
David Dylan Thomas: [00:14:05] Sure. So, you know, as designers, we love it when we come up with a good idea, and we love it even more when we come up with a good idea that the client likes, and if that happens, we're done, right?
That's going to be what ships, barring any, you know, barring any accidents. But the only problem with that is it's entirely possible you've left a better solution on the table because you didn't deliberately ask, "Is there a better solution?" Right?
And so the example I like to give is if you kind of picture a computer game where you see the numbers, two, four, six, question mark, right?
That's on the screen. And the computer says, "Okay, put whatever number you want where that question mark is. I'll tell you if it fits the pattern. Put in as many numbers as you like, and when you're ready, tell me what you think the pattern is." So, the obvious choice, right, is to go two, four, six—well, put eight where the question mark is, right?
Two, four, six, eight. Okay. And the computer's like, "Good choice, that fits the pattern. Would you like to try another number?" And most people stop at that point. I would, right? I've actually taken this test. I stopped at that point, I'm like, "Nope, I got it. The answer is even numbers. That's the pattern."
And the computer is like, "Nope." And then what I realized is I was wrong, because I never tried putting in two, four, six, seven. That also fits the pattern, because the pattern isn't even numbers. The pattern is every number is higher—each one is just higher than the number that came before it—which is a much more elegant solution.
Probably easier to code, probably cheaper to build, but I never got there cause I was like, "No, that's my even numbers idea. That's a great idea. Let's just stick with that."
So, if we want to get to the best, or at least a better, design, we have to actively question. It's the opposite of confirmation bias. You have to say, "No, wait, this thing I really, really believe to be true—
I have to actively question it now," and I have to say, "Well, if I'm wrong, what else might be true? Let me go and try and prove that." Right?
And that's where I could have gone with two, four, six, eight. I could have said, "Okay, I think it's even numbers, but if I'm wrong, what else would be true? Well, an odd number would work. Let me try an odd number." Right?
And then I would have pretty quickly gotten to, "Oh, they're just higher numbers." Right?
Todd Libby: [00:16:03] Yeah. I was looking at that and I did the same thing. I stopped right at eight and I said, Oh, you know—
David Dylan Thomas: [00:16:09] And it makes sense too, because we live in a world that's driven by budgets and deadlines.
Right. So, we're rewarded for getting to a good answer quickly, right?
Or a good enough answer quickly. You know and in some design that's fine but, in other design where it's like, you know, people's paychecks, design that's going to affect whether or not someone gets, you know, housing or food, whether it's like, there's all sorts of like major decisions that depends on good design.
And if we just stop at good enough design, people could get hurt.
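[Editor's aside: a minimal sketch of the two-four-six exercise David describes, written here in TypeScript since this is a front end show. The hidden rule and the fitsPattern helper are illustrative assumptions, not code from the book or from the original experiment.

// The hidden rule is only "each number is higher than the one before it",
// not "even numbers" — the obvious-but-wrong theory the exercise invites.
const fitsPattern = (seq: number[]): boolean =>
  seq.every((n, i) => i === 0 || n > seq[i - 1]);

// Confirming test: feels like proof of the even-numbers theory, but isn't.
console.log(fitsPattern([2, 4, 6, 8])); // true

// Disconfirming test: an odd number also fits, so the even-numbers theory is wrong.
console.log(fitsPattern([2, 4, 6, 7])); // true

One disconfirming guess, like the two, four, six, seven David mentions, exposes the wrong theory faster than any number of confirming ones.]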
Todd Libby: [00:16:38] Right. Yeah, yeah. And—the, the other thing towards the end of the book that really intrigued me was the Black Mirror rule. Can you go into that a little bit?
David Dylan Thomas: [00:16:48] Yeah—so, there's this thing called speculative design, and it's good at helping us think through unhappy paths.
Like again, it has to do with the way that we sell design, right? We don't, nobody sells design based on, “Hey, here's all the flaws that we're going to run into with the design.”
No, it's "Hey, I'm going to redo your website. People are going to find the products faster. They're going to hit the buy button more often. Like, here's all these cool—it's going to be beautiful and, like, shiny and all like your competitors." And we sell on the good features, on the happy path.
The truth is, no matter how happy the path, there are always going to be more bad things that can happen than good things. Right?
And so, you should be spending at least as much time designing for those unhappy paths, right? All the error states, all the things that can go wrong. Right? And as humans we tend to not like to do that, right?
Todd Libby: [00:17:36] Yeah.
David Dylan Thomas: [00:17:36] And so, a way to kind of ease us into that is this thing called speculative design and I call it the Black Mirror exercise where—I mean, lots of people call it that I don't wanna take credit.
But so Black Mirror, if you haven't seen it is a TV show where it's kind of like Twilight Zone for tech, like you take some near future technology and you tell a story about what would happen if real human beings got their hands on it.
And it's always awful. Right? And—you know, I like to say that anybody working on a new technology by law should have to write a Black Mirror episode about it. But—but, it's a great way to sort of take what may seem like an innocuous, harmless technology idea, and then almost—it's almost like a fun way to tell a story and like, say, "Well, what could go wrong?"
Right? What would happen if a white supremacist got their hands on this? What would happen if a misogynist got their hands on this? How might someone try to make money off of this in a way that I haven't anticipated, right? And you end up with these narratives that are like, "Oh, oh yeah, that would actually be super easy. I don't know why I didn't see that." Right?
Like, I think a lot about—I was a techno optimist back in like the early 2000's, right? And when Facebook was coming up and Twitter, I'm like, "Oh, this is going to connect everybody and all these underprivileged people are going to have a voice now and creativity."
I was a filmmaker. I was like, "I can get my films out there now because YouTube is there." Like, all this empowerment is going to happen. And I was thinking about the happy path. And if, for two seconds—like, maybe if Black Mirror had existed then—if for two seconds I had said, "Well, let me tell a story about how this technology might go wrong,"
I wouldn't be too many chess moves away from Cambridge Analytica. Right? That would not have been that hard to imagine, but I just didn't point my imagination in that direction.
And it's so sad because I know people who back in the day, tried to get funding, to study the potential impact of like knowledge tribes on social media and how that could possibly not be a great thing.
David Dylan Thomas: [00:19:18] And they, they couldn't get their funding.
Todd Libby: [00:19:20] Oh, wow.
David Dylan Thomas: [00:19:20] And it's like, wow. Like if they waited two more years, they would've gotten their funding.
Todd Libby: [00:19:24] Yeah. Yeah. Yeah. So how can designers and even developers—I've been thinking about this a lot lately—use bias to work for them, and to be non-harmful if at all possible, to benefit what we do?
David Dylan Thomas: [00:19:49] I mean, I think that developers in a way have a little bit of a head start, because developers are used to the idea of QA, and that's really all I'm advocating for when I kind of talk about things like Black Mirror or red team, blue team: rather than just assume that you've landed on the correct solution once it's usability tested, right?
Go through and see, are there ethical bugs in this thing that we didn't anticipate? And in the world of web development, there's no ego or pride that's so big as to say, my code has zero bugs, I defy you to find, you know, bugs in my code. It's like, no, we hire people, we pay them very well, to go through and check, because of course there's bugs.
Like, that's the nature of coding: to try something, iteratively say, "Oh, that's not working. Let me go see if I can figure out what's not working. Okay, try it again. Try it again. Try it again." Because of the nature of it, we're used to the idea that, no, it's not going to be perfect on day one, we're going to have to work at it.
I am advocating for the same approach in design and content, to say, you're not going to get this right out of the gate. And it's not just a matter of, you know, functionally or usability-wise you're not going to get it right out of the gate. It's, no, ethically you're not going to get it right out of the gate, because you're not an expert in ethics.
Right? Or in terms of the impact it has on people who don't look and act like you. You're not, you're not an expert in that. So, you're going to have to bring some of those people in to critique it, right? And find those bugs. So that to me is really, I think the best advantage. Or the best attitude to kind of take with the stuff.
'Cause the other piece of it is, you don't want it to be about "you designed something that's not inclusive, you're a bad person," so much as the more likely scenario—because I don't think you woke up in the morning and said, "I want to hurt Black people."
I think the more likely scenario is: you woke up in the morning knowing nothing about the Black experience, and you designed something for your experience, and it happens to be harmful to, you know, this person that you don't understand, you've never met, you have no connection to.
So it's about making those connections, about saying let's bring people in. I talked in the book about this thing called red team, blue team, which basically is: the blue team goes through and does the work of any design team.
They do the research, maybe you get as far as the wireframe, but before you really commit a red team comes in for one day and goes to work with the blue team and looks for those holes.
And it's their job to, right? There's no judgment. It's like, I came here to QA your thing. And ideally that red team has some people who might be affected by this design, but for whom you have no real context. So I think those are kind of the best approaches to deal with that.
Todd Libby: [00:22:21] Right.
So, on the topic of inclusive design, because it's tied in with a little bit of what I do and what I've done, which is accessibility and being inclusive in that space: how important is inclusive design, and how important is it to get inclusive design practices into a budget for projects?
David Dylan Thomas: [00:22:47] Oh, I mean, I, I like to say that I'm out to win hearts and minds and budgets, and if it doesn't make it into the budget, it's probably not going to happen.
I think most of us work in an environment where, if it's not in the budget, it doesn't exist. Right? 'Cause if it's not in the budget, it's not going to make it to the project plan. If it doesn't make it to the project plan, it's not going to make it into the sprint, and so on and so on. And so, I think it's critical that inclusive design practices like red team, blue team or assumption audits become not just part of the budget, but part of the budget template.
So, where I used to work, if we were going to start a project, the very first thing we did was whip out the estimating Excel spreadsheet that already had a bunch of different line items in it, because typically those are expected to happen on a project.
And each one of those would have, like, a little formula on the second sheet that was like, okay, here's how much we charge for a content audit, here's how much we charge for a—here's how many hours it takes to do, like, you know, a responsive website, or blah, blah, blah, right?
Todd Libby: [00:23:41] Right.
David Dylan Thomas: [00:23:41] And I want that same thing—the thing is, when you have it in the template, nobody questions it. You open it up and, basically, it's innocent until proven guilty. Like, you have to prove that it doesn't belong there. Right? Once it's in there, it's probably going to stay.
And so, if an assumption audit, which is this exercise where you get a bunch of people in the room who are going to work on the project and you have them kind of question their own biases and talk about who's missing and how you might, you know, advocate for getting more of those people in the process, that's like a two hour meeting.
So, if I go in and say, "Hey, I just want to add this two-hour meeting to this next project," if it's a large enough project, it will just sort of disappear into the hundred, 200, 300 hours of the project—I could probably get that in. And if I can do that a few more times, maybe that becomes part of the budget template.
And then I can go back and say, "Well, you know what? We already do assumption audits every time. Can I get a red team?" And that just takes a day. For this, you know, 90-day project, can I just get one day for a red team? Right?
And I'm slowly escalating there, and that's taking advantage of this thing called consistency bias, where if I come in and I say, "Hey, I'm going to put a big, giant sign on your lawn," you're going to say no, but if I say, "Hey, can I put this tiny little sign up? You're not even going to notice it," you're more likely to say yes.
If I come back two weeks later and I say, "Hey, can I put a giant sign on your lawn?" And you're actually more likely to say yes, because you're sign people now.
Right? So, I'm starting with a small thing, this little assumption audit—it's like two hours, who cares? Then I'm like, "Can I get a day?" Right? And before you know it, we have all these things in the budget, standard issue, 'cause hey, we're inclusive design people now.
Todd Libby: [00:25:11] Yeah. Yeah, that's it. That that reminds me of, you know, fitting accessibility in from the get-go and advocating at—
David Dylan Thomas: [00:25:19] Yeah.
Todd Libby: [00:25:20] the very beginning and, and making sure that hopefully that's included in the budget as well. And then—
David Dylan Thomas: [00:25:27] Yeah, and that, I got to say, I'm very proud of where I used to work. I was at Think Company before this, and in our SOWs—I forget what the level was, but it was not the bottom—there was a level of accessibility standard that was just in the standard SOW.
If you wanted less, tough: we're giving you this particular standard, no questions asked. Like, that's just how it is. And that to me is what I want to see inclusive design get to. Like, we don't literally have standards for it yet, but I want it to be this sort of, "No, of course the website's gotta be accessible."
Like, and part of that, honestly, is regulations. Like, do you want to get sued? No, the website is going to be accessible. Right? But that wasn't always that way. And I think inclusive design now is where accessibility was maybe twenty years ago.
Todd Libby: [00:26:08] Right, yeah. So, you know, what if a company comes back—okay, let's say, like I've seen many times with accessibility and stuff, "we need to fit accessibility in"—whether they're being sued or not, most of the time it's because they're being sued.
David Dylan Thomas: [00:26:29] Yeah.
Todd Libby: [00:26:30] Have you had companies come back and, and say, "Okay, this, you know, we'd like to get them inclus—some sort of inclusive design." Or some sort of—
David Dylan Thomas: [00:26:42] Sprinkle some inclusivity dust on this thing.
Todd Libby: [00:26:44] Yeah.
David Dylan Thomas: [00:26:45] Yeah. I mean, and that's the way with any emerging—excuse me, any emerging practice, right? It starts out as this thing where it's like, okay, I've heard—I mean, I went through this journey with content strategy. Right?
I started working really deeply in content strategy around 2010, before, really, people were sort of starting to call it something, and you had to struggle. Like, I remember at that time I was trying to find a job in content strategy and the only content strategists were freelancers.
There was no agency that had it as a position, there was no company. Nowadays, you know, any experience design firm worth its salt, they've got at least one, if not a whole slew of, content strategists, right? And most companies that are large enough have something called content design or content strategy or something like that.
It's like standard issue now, right? But that took ten years, right?
David Dylan Thomas: [00:27:29] So, like, in those early days it was, okay, can we throw some content strategy in in the last sprint, you know, or the last week of this? And, like, every content strategist who's listening is giving this collective sigh, like, "Oh my God."
You know, when in fact it only really works well if you start at the beginning. And the same thing with accessibility. I mean, there was this point where it was like, "Oh yeah, we'll just throw some accessibility in. We'll make it part of the QA." Right? And it's like, oh my God, they need to be there at the beginning. Like, that's the thing, it's like, when does X need to be involved in a project? At the beginning.
It's always the beginning. Just—it's always at the beginning. So yeah, I mean, I think that's very much how we're dealing with inclusive design now.
Like, I've had clients who were like, "Okay, we're almost at the end of this project, but this inclusive design thing sounds interesting. Can we throw some of that together?" And if you read the book, like, almost everything I'm advocating for, you do before you even begin the project.
Todd Libby: Yeah.
David Dylan Thomas: So, it's like, "Okay, we'll do our best with what you've got." But by then, you've already narrowed your options, right?
Like with any, again, any kind of practice that you want to include: by the time you get to that final, you know, month, you're out of chess moves. Like, you're already locked in to so many things and it's so expensive to go back and change those things. So you try to bolt on accessibility, bolt on content strategy, bolt on inclusivity.
It's like, "Eh!" and inevitably a year later it's like, "Okay, let's go back and redo the whole thing."
Todd Libby: [00:28:55] Yeah. So again, I kind of tend to go back to the accessibility theme. With accessibility, it's got to go beyond the finish of the project. Is inclusive design also something that goes beyond the finish of the project?
David Dylan Thomas: [00:29:15] Yeah, I mean, inclusivity is a state of mind just like accessibility, right? Like, one of the other things I really like about how accessibility has evolved is it's both an expertise and a commonality, right?
Everybody on the team is expected to have accessibility in mind. And maybe even know a thing or two about how it plays out in their role, but you also have someone on the team who really deeply understands accessibility and can be really, you know, involved in guiding that part of the project, same thing with inclusivity.
I want to get to the point where everyone kind of deeply—you know, everyone has, like, an awareness of it and maybe knows which biases to look out for in their particular swim lane. But there's also people—I mean, I honestly think of it more as expanding what we think of as a typical design team.
So yes, front end and back end dev might be there. You're going to have content strategists, a UX designer, a UX researcher, a visual designer, a project manager, like all these, you know, roles we're used to seeing, but maybe there's also a historian. Maybe there's also an ethicist, maybe there's someone who understands, you know, behavioral design and, you know, behavioral economics and stuff like that.
Like those pieces of the puzzle, I think are critical as well, because most of the stuff we're building, you know, it doesn't just have to work on a device. Right? Like that to me is like the lowest standard of something working because that device exists in a society. Right? And when you scale how that thing plays out, you get to things like Cambridge Analytica, the 2016 elections, the 2020 elections, right?
Like, all of these things happen because that technology exists in a society. And I think: does it work in that society? Or take environmental impact now, right? It's sort of like, okay, but what does that technology do to the environment? That's another stakeholder to consider, right?
Todd Libby: [00:30:55] Right.
David Dylan Thomas: [00:30:56] I think it's that bigger lens where, "Oh, well, if that's the lens, we need more than just people who understand code."
I mean, it's great that you understand code. I don't understand code, so good on ya, but that's not all it does. It isn't just about the code at that point. It's about how is that technology operating in a larger sphere? I mean, a perfect example in another discipline is biology, right? Bioethics.
Right. If you're working in CRISPR, right, which is this gene editing technology, like, from day one they had a bioethicist on the team, right? Because we know, like, it's obvious to us that that is a very dangerous technology. We want an ethicist on that team, right?
Hopefully by now we're becoming aware that technology—like digital technology, apps and platforms and services—is also dangerous, can also be extremely dangerous. And so hopefully we're getting to the point where it's like, well, of course we want an ethicist on that team.
Todd Libby: [00:31:54] Right. So getting away from what we've been talking about. Today is the start of An Event Apart, I know you're talking on Wednesday, is it?
David Dylan Thomas: [00:32:08] Yep.
Todd Libby: [00:32:09] Yeah. So, by the time this episode comes out that'll be around the first week of May.
David Dylan Thomas: [00:32:16] I hope it went well.
Todd Libby: [00:32:20] I'm sure, I'm sure it will because you know, I've heard you speak a couple of, yeah, a few times actually. You've done one AEA that I know of that I've been at.
You did BarCamp Philly, where I listened to you talk. I also did a talk there as well. And you know, you had a couple of, well, were they for your podcast? You had a couple of Zoom chats as well with—
David Dylan Thomas: [00:32:43] Yeah, my—the very last season of the Cognitive Bias Podcast is a series of interviews with folks like Mike Monteiro and Erika Hall, sort of talking about ethics and bias in design, sort of to help, like, launch the book. But yeah, you might have caught me on that too.
Todd Libby: [00:32:57] Yeah. So, by the time this comes out, could you tell listeners a bit about your talk and any future talks coming up?
David Dylan Thomas: [00:33:08] Oh, sure. So, I am actually rolling out a relatively new talk. I think I've only given it once or twice before at An Event Apart. And it is called "That's Great But How Do I Convince My Boss?" And it's, it's based very much on the, I think it's the third chapter of my book about stakeholder bias.
And I'm defining stakeholder here as anybody who basically decides how you spend your time. So, it could be a boss, could be a client. But the idea is that they too are making 95% of their decisions below the threshold of conscious thought. And if you don't know what's going on in that 95%, you're gonna have a hard time kind of navigating their biases to kind of, like, advocate for change.
So, when you see companies just doing like unethical stuff, like, you know Wells Fargo creating 2 million fake users, you know?
Todd Libby: [00:33:50] Yep. Yup.
David Dylan Thomas: [00:33:51] Just because, right? You know? You sort of say, "Huh, how did that happen?" So, a lot of the talk honestly really gets into this idea of moral hazard gameplay, which is basically a fancy way of saying you have one goal in mind, but the incentives you create actually incentivize something opposite.
So, if I say, "Hey, I'm going to incentivize police officers by rewarding them for number of arrests," well, your goal might be to stop crime, but with that incentive structure, if crime ever goes away, they lose their incentive. They can't make arrests if there's no crime. So, it's now in their best interest for there to be more crime.
Right? So, it's stuff like that, honestly, that drives a lot of this organizational, like, idiocy, I guess, is the word. But then the rest of the talk is like, here are all these biases we know about that, you know, you can possibly use to your advantage to kind of, like, guide towards better decisions.
So, I talked a lot about loss aversion, which is the fact that it hurts more to lose something than it feels good to gain something. And larger companies have more to lose, and they tend to be more risk averse. So we know that if you're trying to guide someone through a risky decision, if you frame it in a way where you're highlighting the downside of not taking the risk,
it's actually more powerful than if you frame it as, like, "Oh, this is how good it will be if you take the risk." So most people I know are working in a company that has some kind of legacy software they hate. And if I walk in with this shiny new software and say, "Hey—"
Oh okay, there we go. I walk in and say, "Hey, here's the shiny new software. Don't you want to get it? It's like, look at all these features and we're going to make all this money if you get it."
That's, weirdly, not as convincing as if I come in and say, "Hey, this legacy software? Here's how many people are going to quit if we stay on it, and here's how long it's gonna take to find the replacements, and here's how long it's going to take to train the replacements, and here's how much productivity is going to go down while all that is happening." By the time I'm done painting that apocalyptic picture, all of a sudden getting new software sounds really, really good.
So, there's a handful of like tricks that sort of play into the biases your stakeholder might have to sort of guide them towards, you know, perhaps a better decision. And that's, that's really what the talk is about.
Todd Libby: [00:35:57] Yeah, that reminded me of a company I worked for that used a very old, archaic system, and when I approached stakeholders about, you know, "This isn't accessible, we need to make it accessible,"
I was met with, "Well, we don't have disabled users." And I said, "Well, that's absolutely wrong," but, you know, not going into the specifics of, you know, situational impairments and things like that.
So, the stakeholder one—the chapter that you wrote about stakeholder biases—brought back a lot of painful memories.
David Dylan Thomas: [00:36:37] Sure, sure. No, and to be honest, that's most of the job. Like, let's get real, it does not take six months to build a website. It takes six months to build a website with human beings. The human being part is, like, five of those months.
Todd Libby: [00:36:55] Yeah, yeah. So you mentioned, at the beginning, your new company. I, you know, I grabbed your newsletter, and I kind of—I wanted to get to this at the end, but being very quizzical, is there a new podcast that you have?
David Dylan Thomas: [00:37:14] Oh yeah, so I retired the Cognitive Bias Podcast last year. We did our last episode, I think, in December of 2020, and I am starting a new podcast sometime in 2021. But I love talking to people, and I noticed that to have them on the Cognitive Bias Podcast, I had to have some kind of cognitive bias angle, which in some cases was easy, and in some cases I kinda had to shoehorn it in there.
And so, I wanted a podcast, so I could just have people on to talk about literally whatever was on their mind. And I have this, I have this event I sometimes do where people come in and they just have a sticker that says, "Lately, I've been thinking about blank." and then fill in the blank and just start chatting.
It's this great, like, idea exchange. And so I wanted to do a version of that in podcast form, and so the new podcast is going to be called "Lately I've Been Thinking About," and I'm going to have people on that I've had wonderful conversations with in the past. I know that all I need to do is ask them what they've been thinking about, and we're going to have a great conversation. And, like, the beauty of it is it's not necessarily going to be, like, hawking their new book.
I mean, if that's what they've been thinking about, great, but, like, you know, if I get Barack Obama on, I'm not going to talk to him about the presidency. I'm going to talk to him—the question I'm gonna ask him isn't, "Hey, what was it like being president?"
The question I'm gonna ask him is, what have you been thinking about lately? And whatever—if he's been thinking about knitting, we're talking about knitting for the next hour.
Like, that's how it's gonna go, and I'm really excited by that approach and all of the conversations we're going to have.
Todd Libby: [00:38:36] Nice. Very nice. Oh, I'm looking forward to that now that you mention it. Okay, so we're coming up on—oh, we've got about 15 minutes left, but here it is. What I like to do towards the end is, there's three questions that I have for all my guests, and we just go through these.
What about the web these days excites you and keeps you excited in what you do?
David Dylan Thomas: [00:39:02] There are—and this is actually, spoiler alert, for what I guess will be two weeks ago at this point, but this is actually how I close out this new talk. What excites me about the web right now is the fact that we are not alone in our fight for inclusive design.
Every time I give this talk, I find out about another organization, another group that's doing amazing work in this area. You've already got the Design Justice Network, the Center for Humane Technology, the Algorithmic Justice League, the Tech Workers Coalition.
Like there's all these different groups that are like from their corner of the world, like pulling together and saying, "We're going to band together to fight this thing. We're going to band together to make tech better." And I mean, it's wonderful to see, and it's good to see because like, it's very easy to get overwhelmed. Like the forces that are either deliberately causing harm or thoughtlessly causing harm are really well-funded. Right? You know, because capitalism.
Todd Libby: [00:39:54] Yup.
David Dylan Thomas: [00:39:54] So it's easy to get discouraged, but, like, literally every time I give this talk someone's bringing up this or bringing up that, or, "Hey, have you seen this?", "Hey, have you seen that?" And I'm finding out about all these wonderful individuals and wonderful groups that are doing amazing work in this area.
So that's what gets me excited right now: seeing all of these different people coming together to, like, take on different aspects of this beast and sort of come up with a vision. That's another really exciting thing to me. It isn't just, what are we fighting against, but what are we fighting for?
Like, what is the vision of a just and ethical web? And people are really starting to articulate it. A great starting point is if you go to the Design Justice Network's website, they have these ten principles. And if we can get something even close to what those ten principles are talking about, that would be a very just world and a very just view of design. So, I think that's really exciting.
Todd Libby: [00:40:45] Yeah. And we'll have links for everything in the show notes afterwards as well. And so, the next question I usually ask, and this is usually a loaded question, is: if there were one thing that you could change about the web that we know today, what would that be?
David Dylan Thomas: [00:41:05] Oh, man.
So, the first thing that comes to mind—'cause there's way too many, so I'm not going to say this is the best one, but I would love to see what happened if we did it—is salary transparency. If every single company in the world overnight started doing salary transparency for a year,
I would be fascinated to see the outcome of that, because one of the few things that we've seen be relatively sure-fire in terms of closing the gap between how much women make and how much men make is salary transparency.
Companies with salary transparency tend to have smaller gaps there, right? So, I'd be curious to just make that one little change and let it play out for a year or two and just see what else has changed as a result. Theoretically, what happens then is that women get more earning power and people of color get more earning power, right?
And when groups that haven't had earning power suddenly get earning power, stuff happens. Right? It changes the way people vote. It changes—there's so many, like, knock-on effects. So I'd be curious to make that one tweak, that one, to be perfectly honest, not-that-hard tweak, right?
Todd Libby: [00:42:14] Right.
David Dylan Thomas: [00:42:14] It's like, I'm not even asking you to fire your board and replace them with people of color, just have salary transparency. Right?
Todd Libby: [00:42:21] Yup.
David Dylan Thomas: [00:42:20] I'd be really curious to make that one little change and see, like, what are the knock-on effects of that? Like, do we suddenly start to see more inclusive design?
Because people are being paid in a way that allows them to breathe. Right?
Todd Libby: [00:42:35] Yeah, yeah.
David Dylan Thomas: [00:42:35] And, and think more clearly about this stuff and care more clearly about that stuff. And again, I don't know, but I'll be very curious to see what happens.
Todd Libby: [00:42:44] Yeah, I would be too. I know a lot of people would be very curious. Yeah, I've seen on Twitter, you know, people posting their salaries and I'm like, "Oh, I don't know."
And there's always a contentious debate about that as it goes along that ends up on my timeline, and I just kinda stay away from that one. But—so the last question I usually ask my guests is: your favorite part of front end development or design, or whatever you really like the most, that you nerd out over.
And I have a funny feeling it could be something to do with the book, but I'm not sure.
[Holds up the Design for Cognitive Bias book]
David Dylan Thomas: [00:43:22] Well, it's, it's funny, like when I think about stuff like that these days, and I'll give you an analogy. Cause like I said, I'm, I'm a filmmaker at heart. And what I nerd out about in film these days is what happens when people who haven't had the camera, get the camera.
So, when Ryan Coogler gets to tell a Marvel Cinematic Universe story, we get Black Panther, which is astounding and fundamentally not that different, story-wise, from every superhero movie that's come before. Like, all the tropes are there, but because Ryan Coogler is telling that story from his point of view, we're getting the African diaspora, we're getting Afro-futurism, we're getting all these things and all these topics and all these concepts, these really super subversive contexts like Africa saving the rest of the world, right?
Like, there's really subversive, like, notions that we don't get out of the standard-issue comic book thing. And I feel the same way with design, right? When design and front end development get in the hands of folks who are coming from a different perspective. So, there's a guy—I feel bad, I'm blanking on his name—but he's a Black guy who works at Google and he is responsible for rolling out the Black-owned business layer on Google Maps.
Right? And this is something that he was inspired to do after seeing some protests in Oakland and, you know, over the weekend, he decides, this is the thing that must happen. And he starts, you know, working on it. And you know, long story short, it is now a feature on Google Maps. If you want to look for black owned businesses, that's a layer you can add.
And that is something that I doubt would have come from someone who wasn't a black man in Oakland, or it's less likely, I'll say this much. It's less likely that would have come from, you know, the theoretical skinny, white guy who we were talking about before, right?
Todd Libby: [00:45:00] Right.
David Dylan Thomas: [00:45:00] But when someone—so it's not that these are new tools. Layers on Google are not some crazy, no-way new thing—"Wow, I didn't know." No, no, no. We've had layers on Google forever, but it's coming from this person's perspective, a perspective that we don't have a lot of in tech. And not at Google, frankly. Right?
We've seen their numbers, but when folks—when people of color and people who have been disenfranchised from tech—get enfranchised, get a hold of the tools, really exciting things happen.
So that's what I geek out over, it's like, what happens when we give the old tools to new people.
Todd Libby: [00:45:32] Yeah. Yeah—and I'd like to see more of that, personally. Yeah. I mean, I am one that likes change. I am one that doesn't like to sit in his comfort zone and, you know, talk to what pretty much amounts to me in the mirror
all the time. Having different perspectives, for me anyways, educates me and, you know, expands my outlook on things. And, you know, I've met a lot of terrific people getting out of my comfort zone and saying, you know, "Hey, I got this question," or whatever. So yeah. Yeah, there's a lot of good stuff there.
So, as we close out, I always let my guests talk about—and we've talked about a few things that you've been doing—but, you know, what you have currently going on, where people can find you, and, you know, anything that you're doing in the future that you can talk about. So, the floor is yours.
David Dylan Thomas: [00:46:41] Sure. Yeah. I'll, I'll, I'll be blunt.
I've got my own business now. So, if you need someone to give a talk that's going to get people excited about any aspect of inclusive design, or how to Jedi mind trick your boss, right? If you need someone to come in and give a workshop that'll get you excited about and give you some real hands-on experience with inclusive design,
I'm good for that. Or if you just want a book that'll teach you more about it. So, all of those things can be found at daviddylanthomas.com, and that's a great place to either get my book or just get in touch with me to talk about what you're up to and how I can help. But that's what I'm into these days.
And the other thing I will say, I always like to encourage people is shop at black owned businesses.
Todd Libby: [00:47:14] Yep.
David Dylan Thomas: [00:47:14] Mine included—I actually am a Black-owned business now—but broader than that, there's a website called rebuildblackbusiness.com, which can give you literally 14,000 options to do that.
Todd Libby: [00:47:25] Yup.
David Dylan Thomas: [00:47:25] But I think that's a great way—like I was talking before about economic empowerment—that's a great way to show you're an ally, is to spend money in the community.
Todd Libby: [00:47:34] rebuildblackbusiness.com. Okay. We can put that in the show notes as well. Again, I want to thank you for coming on. It's a pleasure.
The book, again, I can't say it enough, it's a great book and I'll hold it up again: Design for Cognitive Bias by David Dylan Thomas from A Book Apart. The whole A Book Apart series is great.
[Holds up the Design for Cognitive Bias book]
David Dylan Thomas: [00:47:55] Yeah. They've been doing some great work, and they're a pleasure to work with. If you're considering, like, who to work with for your—if you're thinking about writing a book, they are amazing.
Todd Libby: [00:48:04] Yeah. Yeah.
Nothing but good things that I've heard. And the customer service is great as well. That said, you know, your personal site—we'll have that, and again, all the links will be in the show notes and everything. And I want to thank you again for coming on and spending part of your time with me this afternoon and, you know, talking about bias and the web and everything. So—
David Dylan Thomas: [00:48:28] Thank you for having me. It's been a pleasure.
Todd Libby: [00:48:31] So, as I close out, thank you, listeners, for tuning into the Front End Nerdery Podcast. I'll be back next month with a new guest, new topics, and new conversation about front end design, development, and more. If you would, please rate this podcast on your podcast device of choice; like, subscribe, and watch on the Front End Nerdery YouTube channel. Links, transcripts, and show notes will be there on the YouTube channel as well. I'm Todd Libby and this has been the Front End Nerdery Podcast. Thank you, and see you next time.