TECH TAKES

The ‘E’ Word: Understanding the Ethical Implications of New Technology

Louis Savard / Mark Abbott / Wayne Collins Season 1 Episode 1

As we navigate our rapidly changing world, where ethical considerations in innovation have become more critical to mitigating risk and upholding our values, organizations will benefit from adopting stewardship and ethics to ensure technology is used to make the world a better place. 

Host Louis Savard, C.Tech., speaks with Mark Abbott, director of the Engineering Change Lab, and Wayne Collins, a faculty member of the Computer Science and Information Technology department at Mohawk College, about the ethical implications of new technology, whether ethics and technology can co-exist, and technology's impact on ethics.


Have a topic you’d like to discuss or comments about the episode? Reach us at techtakes@oacett.org.

[Start of recorded material 00:00:00]

Louis Savard:    
Hi, I’m Louis Savard, and welcome to the OACETT Tech Takes podcast. We are thrilled to air our first episode today and have you join us for our conversation on the ‘E’ word: understanding the ethical implications of new technology. 

As we navigate through our rapidly changing world where ethical considerations in innovation have become more critical to mitigating risk, organizations will benefit today from adopting stewardship and ethics to ensure technology is used to make the world a better place. 

[00:00:38] I am joined today by Mark Abbott, Director of the Engineering Change Lab, and Wayne Collins, faculty member of the Computer Science and Information Technology department at Mohawk College, to discuss this topic. During this episode, we will discuss the implications of ethics and technology, whether ethics and technology can co-exist, and look at the world of technology and its impact on ethics in technology. 

[00:01:04] That’s a lot of ethics. Mark, Wayne, welcome. Great to have you both here today for our first podcast. Now, let’s get right into it, let’s do this. Society has forever been dealing with technological advancements; however small or large, it’s always happened, and particularly today with Industry 4.0. Mark, let’s kick it off with you and let’s hear your thoughts on today’s rapid pace of technology and why it’s important to care about it.

Mark Abbott:   
 [00:01:38] Thanks, Louis. And yeah, I like how you framed that as, you know, looking back at kind of the history of technology and society and what’s kind of changing today with the rapid pace of – you know, one could argue that the most defining kind of attribute of our species is our use of technology. 

[00:02:01] And some social scientists will argue that we’ve been kind of co-evolving with our technology since the beginning of humanity. So, our physical biology has even been kind of intertwined with the evolution of our technologies. And I think what’s happening today is the power and the pace of technology is such that we’re seeing the impacts a lot more clearly because they’re happening within generations, even within years, whereas in the past the changes were much more subtle because they were rolling out over hundreds of years.

[00:02:37] Whereas, this acceleration right now is actually telling us things that have always been true about the nature of our technology, but have been less apparent or kind of obvious to us. 

Louis Savard:    
Yeah, that’s right where I was going: however small or large these advancements are, there’s always been impact felt over time, and now something happens today and you feel it tomorrow, right? So, just to go back quickly, why do you think it’s important that we care about these impacts?

Mark Abbott:   
[00:03:07] Yeah, I mean, I think if you look at like some of the big technology questions today, like obviously AI and data is one of the big kind of themes. Climate change is another question I think of, of technological stewardship. It’s interesting, like when we think about sort of our stories of the future, when we go to the movies, most of the movies about the future are directly or indirectly cautionary tales of getting our relationship with technology wrong. 

[00:03:32] You know, Terminator, Matrix, Wall-E, Black Mirror – on Netflix, there’s a whole anthology of kind of failure modes of us getting our relationship with technology wrong as a species. And yet, day-to-day, it’s kind of hard for us to connect to. Even if we’re technologists or technicians or working kind of directly in the creation and application of technology, I think we struggle to kind of connect the dots from these kind of Hollywood stories of how things could go wrong and translate that into, “OK, wait a minute, my day-to-day work might seem removed from that, but, actually, it’s all of these decisions that we’re all making kind of in the trenches of the creation and application of technology that will kind of add up to how the story unfolds kind of at the macro level for us as a species.” 

Louis Savard:   
[00:04:16] Right. Wayne, same question, why is it important to care about how fast technology is evolving?

Wayne Collins: 
Yeah, your question makes me think about regulation a bit. I think it’s important to note that regulation always lags technology. Like, technology always advances and changes much faster than we can control in any structured way. There was a time back in the 60s when LSD was legal, you know; it was a drug that was developed and it was regulated quite quickly because it was obvious who was being harmed by it. 

[00:05:02] The evidence was very physical and real. But with something like social media, how do we regulate this? Should we regulate it? Who’s being harmed? It’s a lot harder sometimes with emerging technologies to see who’s being harmed. And I think that makes it very difficult to develop regulation to protect people. So that’s kind of where your question took me: thinking about regulation. 

Louis Savard:   
[00:05:33] Thank you. Seeing how you’re in the hot seat already, Wayne, let’s keep going with you here. Recently, you’ve been researching moral reasoning with OACETT, and, more specifically, you’re looking at correlations between preparing – and I’m going to highlight the word ‘preparing’ – for the ethics exam and an increased moral reasoning ability. Now, this nuance is really interesting to me, because you’re saying preparing for the exam, rather than passing it or not, has an impact on increased moral reasoning. 

[00:06:08] Could you tell us a little bit more about that research and why do you feel it’s so important?

Wayne Collins: 
I started this about three years ago. I was at Central Michigan University and I knew we had a problem, in that where I worked, at Mohawk College, we weren’t doing a good job of teaching ethics. 

[00:06:30] I mean, I know it, and I can see my own environment well. I can’t see every college program in Ontario, but I can comment on my own programs. And I don’t like saying things like that unless I can measure them. I like to quantify stuff, right? I’m a techie. I want to measure it. So, I didn’t have an ethics course yet. As it turns out, we are creating one, but three years ago I didn’t have an ethics course. 

So, I thought I would have two groups of people: OACETT members and non-OACETT members who are out there working – technicians and technologists out there working, a very large supply, right? There’s tens of thousands of you. And the preparation for the exam is more important than the actual exam itself, because the preparation is where you explore ethical topics and learn about ethics. 

So, I was hoping to use those two groups of people. Unfortunately, it’s very difficult to find volunteers to engage in challenging little research projects, so I don’t have enough data to share with you any kind of conclusions.

[00:07:40] You know, I think what I’m getting at with my college and my work is that technicians can cause harm. Technicians have the ability to harm people with their work. So, ethics education is required. It’s mandatory. It’s not an optional thing. 

Louis Savard:    
[00:08:34] OK. Super interesting. I get a sense of where the answer to the next question is going to go, but I’ll keep you in the hot seat for one more, Wayne. What do you think the value of an ethics curriculum at the college level would be?

Wayne Collins:
[00:08:59] I really want to make two points. There’s two things that I can say to answer this. One is we’re teaching life skills. These are business skills. These are skills students need to survive in industry. I’m talking about listening, civil discourse, showing true, genuine empathy for another person’s point of view. Not only solving a complex problem, but being able to articulate and explain your position to somebody else because your co-workers may not agree with you.

[00:09:32] So, just solving the problem is not enough, you’ve got to explain it. Like these – I hate calling these things soft skills, that’s the worst name for them. These things are really, really hard; they should never be called soft skills. But that’s one thing that we’re doing, these are critical life skills our graduates need.

[00:09:51] The second point that I would make is – I can’t emphasize enough the importance of practice and rehearsal. You cannot take a technician who’s graduated from school, put him or her in an ethically challenging position in industry, and expect them to solve this ethical problem well while they’re under stress doing it for the first time. This is crazy. They have to practice this at school in a safe environment. They have classmates they can discuss it with. They have time to reflect on it. We have lots of case studies they can practice with.

[00:10:39] When you’re learning a complex skill, rehearsal is critical. You cannot ask graduates to do this for the first time under the pressure of a job situation. It must be practiced at school. And there’s just no other way around this: we have to have an ethics course as part of our curriculum. Like, we’ve got to give students a chance to practice this while they’re in school, in a nice, safe environment, and then when they’re out there in the working world and it really happens, they’ll be better prepared to handle it properly. I’m swinging for the fences, but that’s what I want. 

Louis Savard:    
Well, I really appreciate you saying that, Wayne, because I think I’m in agreement with you 100 percent on that one. It starts at school, right? You can’t expect somebody to get into the workforce and, “Here’s your 20-minute video you get to watch and you’ll get a printed certificate at the end and you’re good to go.” 

[00:11:39] In this case, it’s bigger than this, right? So, I appreciate your point of view.

Now, Mark, I’m going to throw you a little bit of a curve ball here and I’m going to ask you, from an industry perspective, where do you see the value of college graduates undergoing these ethics curriculums before they graduate? 

Mark Abbott:   
[00:12:01] Yeah, I think what’s becoming clearer and clearer with the acceleration we’re talking about with Industry 4.0 is that tech is critical to the future of every organization – every company, government agency, non-profit, professional association, right? And so, that’s almost a given. Everyone kind of just accepts that. And yet, when you actually look at some of the statistics around, sort of, the success of internal tech change initiatives, or even, sort of, the impacts of tech that’s being kind of offered by various organizations to customers or to the world. 

[00:12:35] The performance is actually not always all that great. And that’s almost becoming a little bit of a given as well, too. And what people are more and more pointing to is that it’s not the hard, technical skills that are often the barrier, it’s the socio-ethical considerations around the technology and how we’re navigating those things. And the challenge with the current system is that on one side of campuses we’re training technologists, technicians, engineers, computer scientists to create technology, to create the actual artifacts, and on the other side of campuses we’re training a different group of students to understand the nature of technology and its impacts. And those groups almost never meet. 

[00:13:13] And so, if you kind of look at the current situation, we’re training people to create something that they don’t actually fully understand the nature of or have the ability to steward responsibly. Which brings to mind, actually, the classic science fiction novel of Dr. Frankenstein. Dr. Frankenstein was the doctor, not the monster. He had the ability to create something that he didn’t fully appreciate the nature of or have the ability to steward responsibly. 

[00:13:37] So, if we were taking kind of a critical lens from a societal perspective, you could argue that we shouldn’t even begin to impart the ability or to support people to have the – to gain the ability to create and apply technology until we’re first sure that they have what we call a tech stewardship practice or the foundations of a tech stewardship practice in place. 

Louis Savard:   
[00:14:03] Wow, that’s great. The, I guess, bottom line is Dr. Frankenstein should’ve had an ethics course before practicing and the movie would’ve been way different.

Mark Abbott:    
And something we – in the years of kind of coming to this – so, I’ve had the privilege of helping support a few kind of multi-stakeholder initiatives over the last eight years, working with leaders from industry and government and non-profits and professional associations to kind of look at this bigger question of: given the role technology is playing in society, what’s it going to take to actually make sure technology is beneficial for all? 

[00:14:39] And, of course, we need policy, we need general public awareness, but there’s the specific question of, wait a minute, what’s the responsibility or opportunity of the people who are creating and applying technology to bend the arc of technology towards good? And so, we came to the term ‘technological stewardship’ as a way of talking about this. Partially because, with some of the kind of limited paradigms or ways of thinking about technology right now, the word ‘ethics’ has been hollowed out in some cases in practical use.

[00:15:10] And it’s kind of almost sort of been – many people take it in a very narrow sense. Tech stewardship and tech ethics could be synonyms if people were seeing, kind of, ethics in the, sort of, larger view, where it’s about much more than just not breaking laws and kind of a minimum bar; it’s about how do we proactively navigate the value tensions, all the messy gray zones, that are associated with our decisions as we’re creating and applying technology?

Louis Savard:   
[00:15:37] Right. So, I’ll bring you back to the word ‘stewardship’ that you just mentioned in your – as you talk about tech stewardship, but from my seat, every time I think about stewardship the first thing that comes to mind is a meadow with a pond in the middle, you know, a duck floating around and let’s protect the pond and let’s save the ducks. Very rarely do I see anybody saying, “Let’s protect the endangered Nokia cell phone,” right? 

Mark Abbott:     [Laughter]

Louis Savard:   
[00:16:05] So, I’m wondering if you can maybe enlighten us a little bit more on what tech stewardship actually is. 

Mark Abbott:    
Yeah, we spent a lot of time in this ongoing, kind of, multi-stakeholder initiative trying to come up with the right language to kind of get at what we were talking about. And there is no perfect term, but the reason we landed on stewardship was this sense of a proactive, shared responsibility. And so, we believe that if you’re involved in the creation and application of technology you, by default, sort of owe it to society to be proactively navigating these socio-ethical, kind of, questions that are around the work.

[00:16:41] And so, stewardship, obviously, a lot of people associate with environmental stewardship, which, of course, has that same kind of connotation of proactively trying to steward the environment. A criticism of environmental stewardship by some – in particular, a lot of Indigenous leaders – is that it’s a very anthropocentric or, sort of, human self-obsessed and almost arrogant view to believe that we could steward the natural environment. 

[00:17:06] However, if you think about our technologies, technologies are human creations. By definition, it’s humans that are creating the technology that’s having all these impacts on the natural environment, on our social systems, and on environments kind of in the broader sense. And so, really, we have a responsibility to steward, kind of, the collective evolution of our technology so that we are being responsible in our relationship to the environment, to our communities, to our culture. 

[00:17:33] So, stewardship, we found, really kind of captures more of the essence: this proactive, shared responsibility over something inherently human that we’re creating. 

Louis Savard:    
So, let’s keep in that same vein, Mark, and I’ll keep you in the hot seat for just one more quick minute here. Looking at the challenges of the technological advances we’re seeing today, why is tech stewardship a good fit for the technology sector?

Mark Abbott:   
[00:18:07] Yeah, and it comes back to something Wayne said. So, when we were exploring this question of, “How do we ensure technology is beneficial for all?”, you know, our belief is, as much as we need general public awareness and policy, that’s kind of – the horse has already left the barn, right? We need to actually get upstream to the people who are creating and applying technology and start to influence the day-to-day behaviours. 

[00:18:28] And, as Wayne says, that needs to be a practice. That needs to be an ongoing habit. It’s not a black and white checklist or set of rules or a course you do once; it’s actually building an ongoing practice of, sort of, socio-ethical reflection around your technical work, around your innovation work, as you’re creating and applying technology. And so, what we’re doing with tech stewardship is actually, kind of, bringing in a framework for that thinking, but then supporting people day-to-day in their own context, you know, in the trenches of their company or of their government agency, as they’re encountering these tensions day-to-day in their work, to be able to notice them, name them, and navigate them. 

[00:19:07] So, to take a few examples of technology-related tensions, the one everyone is really kind of familiar with is cell phones, right? We all kind of, I think, have tug-of-wars in our head around the cell phone. Like, I value my convenience, but I also value my privacy. And when I download an app on my cell phone and I flip through all the accept things, I feel a little conflicted. It’s like, “OK, you know, I don’t have time to read this privacy policy and I don’t really want to give up my privacy, but, you know, man, I’m in a rush.”

[00:19:33] And so, I have these little tensions in my own kind of values, conflicting values around technology. And, of course, you know, that tension doesn’t exist in a vacuum. There’s companies that have made decisions around how that technology is shaped, and regulation regimes, and societal norms. So, at the moment, to take the cell phone example, things are kind of stacked in one direction. Like, a lot of those existing systems are set up to get me to give up my privacy in the name of convenience. 

[00:20:00] But we don’t have to settle for those trade-offs if we actually do a better job of saying, “Actually, how can we get the best of privacy and convenience when it comes to cell phones?” And it’s not just new tech like that, kind of personal consumer tech; you can think about a power distribution system and whether, let’s say, you know, a large power company is deciding about centralizing power generation and distribution versus having a more distributed grid. There’s benefits to both centralization and de-centralization when it comes to a power grid, and those are actually value tensions that have a big impact on the communities that those power systems will change.

[00:20:34] So, how do we actually start having conversations about technology that are values-based? And that recognize that different people have different perspectives and different, sort of, ways they value things, and, therefore, there isn’t one rational right answer to most problems; it’s more how we find the both-and solutions that get the best of both sides of some of these value tensions.

Louis Savard:   
[00:21:01] So, knowing what you both know, right, as experts in your own fields and the challenges within the technology sectors, how do you feel ethical tech stewardship training would benefit us as a society? And we’ll start with Mark.

Mark Abbott:    
Yeah, so, I mean, this is one of the reasons I’m so excited about the direction OACETT’s taking in really prioritizing ethics and weighing in on this question. You know, for a long time there was, sort of, an ethics exam and kind of like the base requirements. But I think OACETT is really starting to get into this: “Wait a minute, the big ethical questions of the future aren’t black and white. They’re how we navigate these value tensions.” 

[00:21:45] And so, there’s OACETT’s contribution to helping create the concept of tech stewardship, and now some of the conversations we’re having about, “How do we support the OACETT membership to develop this ongoing socio-ethical reflective practice, to support each other, to grapple with these real-life value tensions?” You know, like, hey, I work for a company and I care about the climate, I care about equity and diversity, and we have to make a profit and serve our customers. 

[00:22:09] Like, rather than ignore the inherent tensions between those things, if we actually create the space to support each other, we can get much better outcomes that actually are, sort of, win, win, win, where our own values are aligned with the needs of the organization and the society, and we’re finding, kind of, ways to navigate these tensions with much more skill and unlock the potential within them. 

Louis Savard:    
And Wayne, your thoughts?

Wayne Collins:
[00:22:36] Yeah, from the perspective of working at a college, I guess the short answer is: what challenges in technology? Well, everything, I guess. That’s the short answer. But what I want my students to do is to open their minds and consider all dimensions of a problem and all layers of a problem. I don’t like using expressions like ‘computer ethics’ or ‘accounting ethics.’ 

[00:23:07] I don’t like saying things like that, because when you say ‘computer ethics’ you’re putting on the blinders and you’re calling it a computer problem. It’s not, it’s a problem about people. It’s a problem about human suffering. So, when you say ‘engineering ethics,’ engineers want to solve it like an engineering problem and use math and solve this problem the way they solve every problem and it doesn’t work. 

[00:23:33] You can’t solve an ethics problem using engineering principles. This is a different type of skill. So, like, for example, picking up on what Mark was talking about with cell phones – a great example of the tension that you experience between privacy and convenience. But I want my students to go further with that and say, when we manufacture cell phones, are we harming the environment? Are we doing rare earth mining to make these fancy screens and touch screens and this technology and these batteries – are we hurting the planet? Are we harming future generations with our obsession with, whatever, gasoline, cars, or gadgets, or whatever? Are future generations people that we can harm? Like, I want students to unpack it.

[00:24:26] And it comes down to competing values, because we want industry, we want manufacturing, we want those jobs, we want the economic engine to run, we want the benefits. But if we hurt the future, like – are those people people, too? And do they count in the conversation? So, I want my students to explore it all. And so, I guess the answer to your question is, yeah, it affects everything. I hope that’s not too broad. 

Louis Savard:   
[00:24:53] No, I don’t think it’s too broad, I actually think it’s eye-opening, right? If people feel immediate pain, they react to it, they want to fix it, right? But if they don’t feel pain now, it’s, you know, “I’m going to go to the doctor at my next check-up next year,” right? So, this is kind of the same concept, where you have to be aware of what future pains you might be creating now with what you want to push out because this is the next thing, right? It’s much more than just putting the next iPod or iPad or iPhone or Android – I’m not going to leave those guys out – you know, on the shelf. There’s a lot more to consider. 

[00:25:32] Mark, I’m going to come back to you on this question here, because, as you may be aware, OACETT does have an emerging markets committee, which includes looking at emerging technologies. This is a term that we’re often familiar with, emerging technologies – what’s coming down the pipeline. But there are also discussions around ethics within this committee, and this is part of why we are here today.

[00:25:57] One term, though, that I don’t think is discussed enough, or that is even brought up sometimes, is technologies of concern, right? And I’m wondering if you can give us, maybe, an example of a technology of concern, and how this program, you know, our tech stewardship style ethics program, could help OACETT members with the process of evaluating the ethics of it all? 

Mark Abbott:   
[00:26:20] Yeah, I mean, when you talk about emerging technologies, we tend to be talking about things that are rapidly developing these days and having huge and far-reaching effects. And it goes back to what Wayne just said – I mean, the good news is I truly believe we have everything as a society we need to be able to actually navigate the inherent tensions. Because even things that on the surface seem really great, you know, have these kinds of complex impacts that are mixed, right? 

[00:26:45] In fact, if you stop and think about it, a lot of today’s challenges, you could argue, are the result of yesterday’s tech solutions. Like, we didn’t set out to alter the climate, we set out to power society, right? We didn’t set out to create screen zombies. We didn’t set out to create cities for cars and not people. We were solving yesterday’s problems but, actually, sowing the seeds of the next wave of even bigger problems. 

[00:27:10] So, you know, Einstein famously said, “We shouldn’t try to solve today’s problems with the same level of thinking that created them.” I think the good news, building on what Wayne said, is we have everything we need as a society, but it’s siloed, right? We have the technologists and the technicians creating and applying technology over here, we have the policy makers over there, we have the artists over here.

[00:27:28] I think what we need is to break down those silos and say, actually, if we’re going to rise to this challenge of, as a society, shaping our relationship with technology to make sure we don’t wind up in a Terminator future, it’s going to take us all, kind of, lending our expertise and our perspectives and kind of rising to that challenge together. 

[00:27:48] And I think the trap we often fall into now is when technologists and technicians try to engage outside the silo, it gets pulled towards the technical aspects of the technological artifacts themselves. Someone who is not a technologist or a technician might say, “Oh, that’s too complex. I don’t understand all of that,” but that’s not where the conversation needs to lie. 

[00:28:06] If we instead talk about these emerging technologies from a values perspective – like, what values is the technology really helping to kind of strengthen in society, and what kind of counter-values might, sort of, suffer as a result? So, if you look at, say, for example, Facebook – a classic one that everyone kind of talks about around new technologies, right? Facebook succeeded remarkably in terms of its stated vision around, kind of, connecting people in community, right? 

[00:28:36] But a lot of that, over time, as we’ve seen, has come at a cost of privacy and a bunch of other kinds of issues around it, right? So, whenever you bring out one of these really disruptive technologies, in particular, you want to look beyond it and almost foresee: what are the values that are being championed, and what might be some of those, sort of, inter-related, kind of, other values that might suffer as the result? And get better at, kind of, actually being proactive, so we don’t have to go way down the road and really be hit over the head with the negative consequences before we start to correct things. 

Louis Savard:   
[00:29:09] Yeah, you know, through our discussion so far, I think both of you are starting to change my way of looking at this whole question, or situation, if you will, and to really look at it from a values angle. Like, when you look at it from that perspective it becomes a lot clearer, is what I’m getting so far from this brief interaction. So, thank you both for that. 

Mark Abbott:   
[00:29:39] And just to say, we started out kind of philosophical, which I think is great, but this does really connect to the ground day-to-day. So, the tech stewardship concept that OACETT’s been involved in creating, and the practice program that we’re now talking about introducing to the OACETT community, is about how all of this philosophical talk – understanding the nature of technology, understanding these value tensions – translates down into the day job, right? 

[00:30:06] You know, you’re [inter media? 00:30:06] kind of in a company, or a student in school – what are the opportunities, day-to-day, to exhibit behaviours, to bend the arc? And so, what we do in the tech stewardship practice is we have people share those stories of, like, “Hey, I was feeling this tension in my work and here’s something that I tried that kind of worked or that didn’t work.” 

And by sharing those stories with each other, of those practical little nudges day-to-day of, “How do you bend the arc?” – I think that’s how the philosophy gets connected to the ground in a really practical way, where OACETT members can support each other and connect beyond technologists and technicians to these other kinds of groups to, actually, practically move the needle day-to-day.

Louis Savard:   
[00:30:48] Yeah, bring the philosophical to the tangible, right? If you can do that then we’re doing well. I’ve got an interesting question here for both of you. The answer is probably very obvious, but I think it’s quite a hot topic and probably one we could do a whole other podcast on. But the question is: why should technology benefit everyone? And if it did, is there an easy way to ensure that there’s equality in the outcome, or that it’s even equal for everyone? Now, let’s start with you, Wayne.

Wayne Collins:
[00:31:27] I think the short answer to this is simply no. It’s never going to be equal for everyone. Whenever you have winners, you always have losers. It may not be obvious who the loser is, but you’ve got to go look a little bit, you’ve got to dig in here. What kind of example can I give you? There are communities in Northern Ontario that have not had clean drinking water for 10 years. They don’t have a decent high school. They’re probably not listening to this podcast because they don’t have broadband internet access. 

[00:32:03] I’m not talking about third-world countries overseas, I’m talking about Ontario. Do you want to go up there and ask them how equal technology is? That’s not a pleasant thing to do. They don’t have clean drinking water. You know, there’s no way that the technical benefits of modern society can be equal for everybody. The wealthy are privileged; it’s as simple as that. The wealthy are privileged.

Louis Savard:   
[00:32:34] Mark, thoughts?

Mark Abbott:    
Yeah, the way I see it is, with tech stewardship we use the language of how do we ensure technology is beneficial for all, but I agree with Wayne – ‘beneficial for all’ is more of a North Star than a destination we’ll ever reach. It’s something to strive towards, to pull towards, to try to find the both-and solutions that don’t just kind of accept the fact that it’s an either/or, but say, actually, when we engage with these tensions in a productive way, there are often ways that we can meet the needs and serve the perspectives and values of different groups. 

[00:33:07] And that all starts, actually, with self-awareness, because what happens now is the values of the technologist and technician community, and the others that kind of create and apply technology, tend to get privileged in society right now. 

So, an example: a while back I was running a workshop and there were some folks from the big energy provider in that province, and at an early break someone came up and said, “You know, we have to get better at kind of explaining things to the general public, because there’s debate about three routes that the power lines could take from up north down to the major city. And there’s such an obvious answer about what the right solution is, and we just need to get better at explaining that to society.”

[00:33:50] So, you know, patiently and kind of curiously, I was like, “OK, what do you mean by the obvious right answer?” And it turned out that what this person saw as the obvious right answer was the most economically efficient and kind of straightforward route from a design and construction point of view. That’s what he was valuing. Those were the dominant values of kind of his community and of that company. And so, therefore, that was the obvious route. 

So, when I asked about the other routes, well, there was another one that went through traditional Indigenous lands, and another one that actually had lower environmental impacts. In this person’s mind, based on their own values and the dominant values of their group, there was already an obvious right answer. 

[00:34:30] And if we keep operating in that way, the values of a subset of society will continue to be privileged. I think, as the people involved in the creation and application of technology, if we’re going to wield that power of creating this physical, digital, biological infrastructure for the world to interact with, we actually have a responsibility to not just default to our own values, but to understand the perspectives and values of different groups and try to find solutions that actually serve the multitude of perspectives and values in a better way.

[00:35:01] To me, that’s what moving towards the North Star of beneficial for all practically looks like.

Louis Savard:    
All right. Well, talking about North Stars, let’s see if we can make our PPE, or professional practice exam – wow, two Ps is tough to say – a little bit better. And, Mark, let’s ask you this question here: do you think that including tech stewardship as part of, or as a new pillar to, our professional practice exam, which includes law, practice, and ethics, would augment our current requirements? 

[00:35:40] And by ‘augment’ I don’t mean make it harder, I mean make it better rounded to meet today’s challenges. 

Mark Abbott:    
I’d say yes, but we can’t stop there. Because I think the way it’s treated right now is, you know, “Did you pass the test?” It’s, you know, a one-time sort of thing: “Are you ethical? Yes or no?” That’s not what being ethical actually should mean in real life. Being ethical is an ongoing, kind of, grappling with and navigating of these tensions. 

[00:36:14] So, what I’d love to see is that in that initial kind of training and certification we’re actually launching that ongoing practice of socio-ethical reflection, or tech stewardship, or whatever you’d want to call it. And that’s some of the exciting conversations that are happening with some of the OACETT leadership now: what if, at some point in the future, what having an OACETT certificate signals to employers and others is the presence of not just having passed an exam sometime way back when, but actually maintaining an ongoing socio-ethical reflective practice? 

[00:36:44] That is telling the employer, and clients, and other people, that this person, because they have an OACETT certificate, is constantly grappling with these grey zone tensions and finding ways forward, not putting their head in the sand or failing to even recognize the tensions as they arise.

Louis Savard:   
[00:39:23] Great. Well, Mark, Wayne, I just want to thank you both for a very stimulating conversation. I mean, I don’t know about you guys, but I could keep going here. It’s a topic that I don’t think is spoken about enough. I also want to thank you for indulging us and being a part of our foray into the world of podcasts. And I think we just set a bar. I really do. Any final thoughts for our listeners, Mark? 

Mark Abbott:   
[00:39:56] Just to say, you know, I’ve really enjoyed getting to interact and work with OACETT leadership over the last several years in helping co-create tech stewardship and this kind of path forward. And I’m just super excited about the potential for OACETT to continue to be a leader in tech stewardship, in this kind of new, enhanced version of ethics. Because I think that’s the future of certifications and licensures and whatnot in the sphere of technology. The biggest questions ahead, I think, are around the socio-ethical kind of questions as opposed to the technical skills. The technical skills are always going to be important, but I think the socio-ethical questions are actually becoming more and more important. 

[00:40:37] And it’s just really inspiring to see OACETT position itself to continue and to lead in that space and to hear amazing examples, like the work Wayne’s doing at Mohawk. It just gives me a lot of hope. 

Louis Savard:    
Well, I mean, with, I guess, an outro like that, Wayne, I’m going to have to give you the last word. 

Wayne Collins:
[00:40:59] Thanks. I think I’d just like to point out that I’ve been working very hard on this topic for about three years now. I’ve been talking to everybody I can find at work about this, and I’d just like to share with you that I am receiving a ton of support from the college management team and executive. I’m receiving a ton of support from my co-workers. I’ve got a lot of students involved in extracurricular activities – believe it or not, I’ve turned ethics into an extracurricular activity. I have not had a single person give me negative feedback or say this isn’t important or, “Ah, it’s a waste of time,” or anything like that at all.

[00:41:41] So, I just want to point out that places like Mohawk College are extremely supportive of this and are making it a priority. So, it’s real. 

Louis Savard:    
Oh, it –

Wayne Collins: 
Thank you.

Louis Savard:      
– definitely is real. So, thank you again, Mark. Thank you again, Wayne, for spending the last 40 minutes with me here. This was fantastic. I hope we get to meet in the near future and chat some more. 

Wayne Collins:   
Thanks for having me.

Mark Abbott:      
Yes, thank you, Louis. 

Louis Savard:     
[00:42:14] And for anyone interested in learning more about today’s topic, or if you have a topic you would like us to feature in a podcast, please email us at techtakes@oacett.org. That’s Tech Takes, T-E-C-H-T-A-K-E-S at O-A-C-E-T-T dot org. I hope you’ll join us again for our next Tech Takes on three Ps to three Cs: a look at the transition from public/private partnerships to colleges, corporations, and certifications working together to improve the overall outcomes of partnerships as we work to make Ontario better and stronger. And with that, until next time, bye for now. 

[End of recorded material 00:42:58]