Where we draw the line

There’s a lot of excitement surrounding machine learning right now.

The hype makes sense — robots can offer collaborative approaches for delicate medical procedures, text can be pulled from pictures and translated into different languages before being put back on the image, and your entire online experience can be curated to fit you perfectly. Hell, when I wrote this piece, I used a service built with artificial intelligence (AI) to transcribe some of the interviews I conducted.

But for every example where machine learning (ML) is going to revolutionize the way we live, you can find another example in which the potential consequences don’t seem to be worth the risk. At best, neural networks trained on vast amounts of data may produce correlations that are useless, but at worst, their findings may magnify the biases present in humans and become incredibly harmful. Over the past year, even students have been dealing with concerns surrounding privacy and the collection of sensitive data throughout UBC’s Proctorio saga.

Anouk Ruhaak is a current Mozilla Fellow who researches data governance models. They’re also the founder of Radical Engineers, an initiative that connects organizations challenging the status quo and working toward justice with volunteer software developers and designers, giving those organizations the technical resources they need to effect positive change.

Ruhaak has experienced the fervour surrounding machine learning’s potential firsthand. They think the excitement is sometimes justified, but more often than not, they find it isn’t.

“People tend to think that the machine becomes increasingly magical, because we don’t fully understand how we arrived at the answer,” they said. “But we’re still thinking it has some magical power, and it can tell us something that we don’t know.

“In those contexts, what I often find is that there’s very, very little awareness of all the different biases that creates.”

“When you talk to [the engineers] about it, they’re willing to gain the awareness. It’s just that no one’s ever actually brought it up.”

— Anouk Ruhaak, Mozilla Fellow and founder of Radical Engineers

This is not to say that work isn’t being done to discover and catalogue the ways applications of machine learning — and artificial intelligence more generally — can go wrong. In fact, it’s quite the opposite. The AI ethics field is enormous, with countless academic papers already written on how simply correlating data — letting the patterns in one data set drive decisions about another — can perpetuate biases already present in that data (a classic example is Amazon’s resume screener, which taught itself to downgrade applications from women).
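To make that mechanism concrete, here is a minimal, hypothetical sketch in Python. It is not Amazon’s system and the data is invented; it only shows how a screener that “learns” from a biased historical record reproduces that record’s bias rather than correcting it:

```python
# Hypothetical toy "resume screener" trained on biased past hiring decisions.
# All data and keywords are invented for illustration.
from collections import Counter

# Historical record: (keyword found on a resume, whether the person was hired).
# The record is skewed against one keyword.
history = [
    ("mens_chess_club", True), ("mens_chess_club", True),
    ("mens_chess_club", True), ("mens_chess_club", False),
    ("womens_chess_club", False), ("womens_chess_club", False),
    ("womens_chess_club", False), ("womens_chess_club", True),
]

# "Training" here is just estimating P(hired | keyword) from the biased record.
hired = Counter(kw for kw, outcome in history if outcome)
seen = Counter(kw for kw, _ in history)
scores = {kw: hired[kw] / seen[kw] for kw in seen}

# The model now recommends against equally qualified candidates whose
# resumes contain the "wrong" keyword: the bias is learned, not invented.
for kw, score in sorted(scores.items()):
    print(f"{kw}: hire score {score:.2f}")
```

Trivial as it is, this is the same shape as the real failure: the model faithfully optimizes against a history that was itself discriminatory.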

But according to Ruhaak, just because the work is being done, that doesn’t mean the engineers and scientists working on AI are aware of it. The idea that the people developing and managing this technology might not consider the potential impacts of what they’re doing was something Ruhaak found deeply concerning.

“If you talk to the engineers who are doing the actual statistics … they’re often not that aware of it,” they said. “When you talk to them about it, they’re willing to gain the awareness. It’s just that no one’s ever actually brought it up.”

Artificial intelligence and machine learning aren’t the only aspects of computer science that raise ethical quandaries — though they certainly are prominent. The products created by software engineers, developers, statisticians and data scientists can be found in just about every aspect of society. As such, the concerns surrounding the products of computer science span issues such as personal data collection, intellectual property and even individual freedom.

Last month, Slack, a business communication platform, quickly rolled back a feature that would have allowed users to send direct messages to anyone else on the platform, after widespread criticism that it would open the doors for harassment and abuse. While some praised Slack for quickly pulling the feature, others questioned why an update with such blatant potential for misuse made it to the roll-out phase in the first place.

Ruhaak said that Slack employees took to Twitter to explain they had raised the implications of the feature before launch, but that “no one was listening for it,” including product managers.

In late 2020, and again in early 2021, Google made headlines for firing two of its top researchers in AI ethics. One of the researchers fired, Dr. Timnit Gebru, was a co-author of the pioneering Gender Shades project, which found that some of the industry’s leading gender classification services were worse at accurately identifying women with dark skin than people with lighter skin.

According to Gebru, she was fired after refusing to retract or remove employee names from a then-unpublished paper on the harmful biases in the AI systems that Google’s search engine is built upon. After defending Gebru amid the controversy, the creator of Google’s ethical AI team, Dr. Margaret Mitchell, was fired as well.

The COVID-19 pandemic seemed to provide a “shock doctrine”-esque opportunity for Big Tech companies to push for the digitization of state functions. While tech corporations were selling their products as solutions to COVID-19-induced chaos, employees at companies like Amazon were being forced to work in what The Guardian called “unsafe, grueling conditions.”

“A lot of computer science and programming has sort of become its own engineering discipline. It’s strange to not see those same ethical standards being applied to us.”

— Patrick Lee, third-year cognitive systems computer science student

In May 2020, Amazon posted a minimalist tweet over a black background to show solidarity with the Black Lives Matter movement. At the same time, it was selling its facial recognition technology — which has been found to falsely match portraits of Black Americans with mugshots — to police departments that already disproportionately incarcerate Black people.

In another example, Google employees collectively decided in 2018 that they would no longer work on AI for drones, which led the company to pull out of a Pentagon contract. Ruhaak published a blog post detailing the event, in which they argued that while employee activism isn’t the final solution to combating the negative impacts of tech, a software developer often wields greater power within their company than a warehouse worker does, and their voice complements other tools like “pressuring politicians to regulate, raising consumer awareness and engaging in protests.”

They also acknowledged that bringing up the consequences of your work is not an easy thing to do.

“[The people speaking up are] not heard because no one actually wants to hear it, because you’re gonna have this problem where you’re the one criticizing the thing everyone else is super excited about,” said Ruhaak in our conversation.

However, there can be more malicious reasons why certain viewpoints aren’t given the time of day in tech.

“Occasionally, if you’re unheard just because of who you are, like because of your gender or your ethnicity or whatever other thing, that is just frustrating and infuriating in its own right,” said Ruhaak.

Currently, the computer science (CS) department at UBC has no requirement to include ethical reasoning and the implications of tech in its curriculum, but that doesn’t mean that sort of education isn’t happening at an elective or informal level.

The department offers an elective course, CPSC 430, which dives into how computational technology interacts with society. Some other CS courses include modules on ethics, and faculty often highlight examples within their classes where the technical skills being covered have been misused.

At an individual level, CS students can seek out courses in other departments that cover ethical reasoning and communication skills. Outside of the classroom, students form clubs and organize events that prioritize values of inclusivity, diversity and accessibility. The remaining question isn’t whether these sorts of skills fit into a computer science education, but whether what’s being done to develop them is enough, and whether there is a responsibility to do more.

As part of this story, Ubyssey Science ran an ethics and tech survey to garner a better understanding of how computer science students at UBC thought about their work, and the department’s responsibilities in the face of the potential impacts of computational technology.

One survey respondent wrote that computer science students “are incredibly bad at this stuff and the current curriculum is both woefully insufficient and also a better one may not even help, you can’t change people’s politics unfortunately.

“I saw student proposals for an AI project in a second year CPEN class and one person wanted to build a thing to detect criminals based on surveillance footage (uh because that went so well on the news last time), others wanted to rebuild the attention sensing pieces of Proctorio. If people consider it fine to build these things as a toy, what will they do at work?”

Dr. David Silver is an associate professor at the Sauder School of Business and director of the W. Maurice Young Centre for Applied Ethics. He’s also the person responsible for conducting internal-facing ethics audits on faculties at UBC. These audits interrogate what a successful graduate would look like to each department, in both values and technical skills, and then examine systematic ways the faculty can ensure it’s accomplishing those goals.

Silver has conducted audits for Sauder and the faculty of forestry, and is currently conducting audits on the faculty of land and food systems and parts of the engineering faculty. Silver said he recalled discussions around conducting an audit of the CS department, but they were halted by the pandemic.

Ruhaak believed that for university-level teaching to instill ethical reasoning skills that address tech industry problems, the lessons need to be accompanied by the historical context behind the technologies and their real-world consequences.

“Looking at the history of these technologies and how they’ve historically created inequalities or inequities … Who do they empower and who they do not empower?”

They also stressed the importance of “[having] some historical understanding of what has come before you.”

According to Silver, the university has a level of responsibility in educating students about the potential impacts of their labour.

“A profession, as I see it, is two things,” he said. “It’s a mastery of a certain knowledge or skill and it’s a knowledge of how to ethically use that skill. That’s what makes a profession.

“… Universities over the last several hundred years, they’ve become more or less cognizant of that part of their role: Are we just about technical knowledge, or are we also about empowering our students to go out and use that responsibly?”

When you’re a computer science co-op student, you’re told to apply for every job you’re even remotely qualified for. Getting a co-op in tech is competitive, especially when it’s your first placement, and coordinators push you to create as many opportunities to get hired as possible.

Jacob Hotz, a fourth-year computer science student, was going through this exact process before his first co-op term.

“I was literally filling out applications for everywhere, because I didn’t have any experience and like, who’s gonna hire me?” he said.

He recalled stopping midway through one of the many applications he was steaming through. This particular company was one he felt was having a negative impact on the environment, enough to make him pause and reflect.

“I was like, do I really want to work there?”

Computer science students at UBC are front and centre in the discussion surrounding the integration of ethics into the CS curriculum. They’re the ones taking the courses, earning the degrees and becoming the future employees of the tech space. But just as varied as the types of people who major in CS are their attitudes towards how their department should be approaching the ethical and social impacts of their work throughout their education.

Patrick Lee, a third-year cognitive systems computer science student, cited the rapid development of technology and its consolidation by a few tech magnates as reasons why he thought every CS student should be thinking about the potential impacts of their work. Hotz gave the example of Facebook’s recent, now-reversed decision to ban news pages in Australia to show how many people decisions in tech can affect.

“In anything I’m doing, I’m interested [in] the effects,” said Hotz. “Just because I don’t want to do something that could potentially harm someone or hurt someone.”

In Ubyssey Science’s anonymous ethics and tech survey, one respondent thought that because CS students are “future tech leaders, educators, innovators and entrepreneurs, understanding the ethical and social impacts of CS is crucial towards creating fair and unbiased technology for future generations.”

But not all CS students at UBC share this same attitude.

Just under a quarter of survey respondents (23.2 per cent) somewhat or strongly disagreed that CS students at UBC were responsible for considering the ethical and societal impacts of their work. One respondent thought that it was the job of individual students who care about advancing these causes to take them into account.

“The university should not be biased towards anything,” they wrote. “... I want to learn CS, not ethics. Leave the business ethics to the business school and ethics in general to the liberal arts.”

Another student thought that the current approach in the CS curriculum was sufficient, because courses on machine learning and artificial intelligence include discussions of bias and discrimination in their models. Another pointed out that CPSC 310 — the intro to software engineering course required in the CS degree — includes a module on ethics.

When asked about his experience with this module, Hotz recalled that a large portion of the class didn’t attend the lectures.

However, most of the respondents to the survey believed there is nuance around ethics and occupations, particularly when considering an individual’s background. For a co-op student, for example, the ability to be picky with the types of jobs you apply for is a luxury.

“In our modern economy, especially considering the effects of COVID-19, getting a stable job alone is already a blessing,” said one survey respondent. “There’s no room for ethical consideration unless people actually have stable jobs and alternatives, but, until then, why not address the ethics of the employer instead?”

CS students can also often feel immense social and cultural pressure to land positions at certain companies and achieve subjective markers of success in their degrees. Being able to avoid those pressures can be a privilege.

“Student[s] should take more responsibility in the future for their work and its impact on others …” said another survey respondent. “However, ultimately a coder is just a coder: we need money to sleep and eat, and only really do what the boss tells us, lest we be fired. The ultimate responsibility is on society and allowing these companies to exploit others.”

Yet classes at UBC aren’t the only opportunities for CS students to engage more with the impacts of their work and the ethics of the industry. nwPlus is a student-run club that organizes hackathons and other events for the local tech community, including nwHacks, Western Canada’s largest 24-hour hackathon.

Their HackCamp — previously known as UBC Local Hack Day — is the largest local hack day event in North America and focuses on creating hacks centred around inclusivity, diversity and accessibility. Another of their events, cmd-f, is British Columbia’s largest hackathon for women and non-binary people.

Allison Chiang, a third-year computer science student and co-president of nwPlus, said that their events are an opportunity to instill in beginners that the technology they create and contribute to isn’t just built for themselves, but for everyone.

“We felt that there was a disconnect between our education and the driving forces of the industry going forward,” said Chiang. “And we felt that it was a good opportunity for us, as innovators, as hackers, to really reflect on how our tech impacts society in general, even if it’s a little hackathon project that you do over a weekend.”

More than just presenting hackers with the opportunity to reflect on the implications of the work they’re doing, Chiang hoped that events like cmd-f help address industry shortcomings arising from a lack of diversity by encouraging women to try their hand at working in tech.

Yet even in their experience as a club, they’ve had to face some of the dilemmas computer scientists might face in industry. According to Chiang, nwPlus attempts to be selective with corporate event sponsors, trying to ensure they aren’t having a negative impact. However, the title sponsor for HackCamp 2020 was mining and energy giant Teck Resources, which sponsored a prize for the hack that best addressed the problem of climate change.

In the ethics and tech survey, 58.2 per cent of respondents somewhat or strongly agreed that UBC is responsible for teaching students the potential social impacts of computer science, and the ethical reasoning and communication skills needed to consider this impact. Respondents proposed a range of ways to incorporate these concepts into the curriculum, beyond simply including a mandatory module on ethical theories.

Chiang said that the current CS curriculum is fairly full, but she would love to see themes of inclusivity, diversity and accessibility integrated into the courses. Lee said that beyond the classroom, he would like to see certain values incorporated into CS culture in the way engineering students adopt a Code of Ethics as their own, with values such as being socially, environmentally and economically responsible for their actions.

“At this point, a lot of computer science and programming has sort of become its own engineering discipline. It’s sort of strange to not see those same ethical standards being applied to us,” said Lee.

Hotz said that when he was a TA for CPSC 110, an introductory programming course, the way that course staff set the standards for the learning environment was very effective. While he said some of his experiences TAing for other courses felt like an “old boys club,” the expectations for inclusivity in 110 facilitated an environment where, he felt, students would be met wherever they were in their educational journeys.

“For a lot of people, [CPSC 110 is] a lot of people’s first exposure to the program. So yeah, I think that that’s really important in terms of inclusion,” said Hotz.

In March 2019, a robot designed at UBC made headlines for its ability to effectively manage acute pain for premature babies by simulating human skin-to-skin contact. The device, named Calmer, could be programmed to mimic a parent’s heartbeat and breathing rate. In January of this year, research out of UBC found that Calmer was also helpful for keeping preterm babies’ brain oxygen levels stable during medical procedures.

The benefits of parental skin-to-skin contact after birth are numerous, but when that contact is unavailable, Calmer could potentially assist with this type of care. Promising as that assistance is, the project, like any form of human–robot interaction, has given rise to a number of ethical considerations.

For example, after giving birth, the birthing person might be distressed to see their newborn comforted by a robot. In an attempt to alleviate this, Calmer mimics the heartbeat and breathing rate of the parent. In this way, the parent can feel like their baby is being comforted by an extension of themselves, rather than by a disconnected piece of machinery.

While all aspects of computer science research raise questions surrounding potential social impact, human–computer and human–robot interaction have a certain proximity to ethical issues. Oftentimes, this is simply because human–computer interaction (HCI) researchers need ethics approval for their behavioural research proposals in order to conduct their work. According to Dr. Karon MacLean, a professor in the department of computer science and co-developer of Calmer, incorporating ethics into HCI research topics is only becoming more prevalent.

Over the course of our conversation, MacLean and I covered issues of inclusivity and diversity in the participants of HCI research — who, at UBC, are often solely undergrads and might not be representative of the world at large; of the collection and protection of vast amounts of personal data in the name of usability; of the inability to control the applications of your research after it’s been published; of potentially replacing human jobs with technological innovation; and of displacing real human interactions with robotic ones.

We barely scratched the surface.

MacLean has been a computer science researcher at UBC for 21 years, and worked in the field for another 10 years prior. But over the past five years, she’s seen the approach to ethical considerations in research change substantially.

“People have been having more and more comprehensive, sophisticated, nuanced ideas of what doing ethical research actually means, in the frame of interaction with people,” said MacLean.

The ability to have close and personal access to incredibly sophisticated technology is only a very recent development for humanity (I still remember having to sit for what seemed like forever in my dad’s office as I waited for the image of four waving stick figures emerging from their triangle, signalling to me that AOL had jacked me in to the world wide web).

But over the 20 years in which technology has become embedded in our societies, users have become more attuned to issues that can arise from the internet and tech, MacLean observed, such as identity theft, cyberbullying and data privacy.

“I think the role of computer science and society is, including the academics, we need to spend more time educating politicians, going out and being public and talking about what we do and what the risks are, so that the public is more educated about it,” she said.


Dr. Cinda Heeren is a professor in the department of computer science, a computer science education researcher and a judge and mentor at various local hackathons, including nwPlus’s all-women hackathon, cmd-f.

In her own courses — which often include CPSC 221, an algorithms and data structures course mandatory for CS students — Heeren discusses issues of gender and racial bias in big data whenever she gets the opportunity. She also believes in a broad, liberal education where students can engage with material from various disciplines.

Even though the idea of an education that pumps out ethically rigorous students sounds great on paper, Heeren acknowledged the difficulties that arise when figuring out exactly how these topics and skills should be embedded into a CS curriculum.

If anything, said Heeren, computer science professors are already largely talking about ethics in their own courses informally — but mandating that this sort of content has to be covered might be difficult for faculty who are simply not trained in topics of ethics. There are also limits to what a course focused on the ethical implications of computer science work could achieve.

“My previous institution had a required ethics course,” said Heeren. “... It was taught by earnest professors, and probably made a few of the kids think a lot, but it wasn’t really taken very seriously by computer science majors.

“It probably truly changed the life of some of the students. But I don’t know if [it achieved] the end that they intended.”

Even though effectively incorporating communication and ethical reasoning skills into computer science courses is complex, Heeren noted this isn’t a reason not to try.

“It is true that our curriculum is packed. But we’re not above reform,” she said. “So, the next go ’round, I can imagine … an open discussion about exactly what our responsibility is to educating. That’s a perfectly reasonable conversation to have.”

In MacLean’s lab, the culture surrounding ethics is constantly revisited and reinvented. According to her, facilitating a culture surrounding ethics hinges on access to, and discussions with, people engaged in HCI from a variety of backgrounds and disciplines.

“One of the really interesting things [is learning how] other research cultures might be thinking about this,” said MacLean. “For example, in the Designing for People group … there’s people from the other side of campus, not the STEM side.

“They might be thinking about these issues in a very different way … and can bring some quite provocative viewpoints.”

Because of frequent challenges to perspectives and ideas, MacLean said many of the people she’s worked with have gradually changed how they think about their work.

“And of course, the students are getting that, too. So, we challenge each other. And I think the interdisciplinary thing is very helpful for that.”

UBC’s department of computer science is one of the best in the world.

According to the QS World University Rankings, UBC is ranked 25th worldwide for computer science and information systems programs. The department’s internal picture of a world-class graduate includes, among other things, ethical competency.

According to Dr. Ian Mitchell, professor and head of the computer science department, the department views a successful CS graduate as someone with “strong technical, collaborative and ethical foundations.” Rather than simply teaching students how to program, the department aims to teach students how computing can be used effectively and productively for society and for their own careers.

“Our role in the department of computer science and the University of British Columbia more generally is to ensure that our graduates have the intellectual tools to tackle [issues surrounding impacts of computing] as they move out into the workplace,” said Mitchell.

However, because the computer science department has no requirements for covering ethics within their curriculum, a student could graduate without having taken a single module on ethics.

That said, leaving this sort of education out of the CS curriculum was intentional. According to Mitchell, one reason is that the field of computer science changes rapidly, and loose requirements give faculty the flexibility to adapt their courses to those changes as necessary. As such, most CS courses don’t have strongly fixed curricula.

Another concern raised by some of those surveyed by Ubyssey Science was the drawbacks of forcing CS undergrads to engage with course content they weren’t interested in. In keeping computer science requirements flexible, the department hoped to minimize the number of students taking courses they would simply view as a box to tick off on the path to achieving their bachelor’s.

There are a number of opportunities for students to engage with ethics and social implications that aren’t required by the department. So many that Mitchell said he would be “surprised” if a CS student didn’t engage with ethics at least once throughout their degree.

CPSC 430, Computers and Society, is an upper-year elective that explores the interaction between society and information technology. Unlike a typical CS course, where student performance is measured through programming or proofs, the bulk of the work in CPSC 430 is done through reading, writing and group discussion.

Dr. Kevin Leyton-Brown, a professor in the department of computer science, has been teaching CPSC 430 for almost ten years and is largely responsible for the current state of its curriculum. He described the course as a “stealth social science course.”

While CPSC 430 briefly covers various approaches to ethical reasoning, Leyton-Brown said the main aims of the course were to allow students to appreciate points of view they disagreed with, understand the complexity of “thorny issues” and help them develop the skills needed to engage with ethical dilemmas in their future careers.

“My goal, very overtly, is not to tell students what to think about these questions,” said Leyton-Brown. “My goal is to pick problems that reasonable people can disagree with — and I personally think there are sensible things to say on both sides of — and to help students to appreciate the complexity of these issues.

“You get them arguing both sides with each other, so that they can see that these things are not easy and obvious, because many of them come into this class not really having thought about it. … If they see why these things are hard, and they see what it looks like to engage with them, I think that’s about as much as they can get from one course.”

Even though he viewed the content CPSC 430 covers as knowledge that computer science students should engage with, he had concerns surrounding making the course required for the degree.

“It’s pretty satisfying to say that this course should be required. You’re saying ethics is important. And you’re making anybody who disagrees with you look like an idiot, because they’re saying ethics isn’t important,” he said.

However, Leyton-Brown said that CPSC 430 is a tricky course to teach effectively for instructors who have never taught ethics before. And for a course where group discussion is such an important aspect, the increased class sizes that could come with making it a requirement might make it less effective.

But Leyton-Brown did think offering CPSC 430 twice a year instead of once would allow more students to engage with topics surrounding social and ethical impact.

As for the students who take the course, Leyton-Brown said that while some were uninterested in a more social-scientific approach to thinking, most students had a very positive experience. Some had even expressed to him that the course changed the way they think about their discipline.

In his time teaching CPSC 430, Leyton-Brown has seen enrollment for the course increase from 50 to 150 students. This term, he also sponsored a student-directed seminar for students who were interested in engaging with the content on a deeper level.

The teaching staff of some required courses have also elected to include ethics modules or examples in their courses. For example, CPSC 310, Introduction to Software Engineering, contains a module on ethics. One of the topics covered in this module is the Association for Computing Machinery’s code of ethics and professional conduct.

The module also has students reason through ethical case studies and scenarios. This term’s offering included examples pertaining to intellectual property, user data collection and code quality.

“As software becomes more and more (and more!) prevalent in society we, as developers, have to consider not just what is legal, but what is moral,” read one of the slides in the module.

In Ubyssey Science’s ethics and tech survey, some respondents cited this module as the reason for their satisfaction with the current level of instruction on potential social and ethical impacts. In contrast, another respondent wrote that when they went through the module in fall 2020, they thought it was done “super badly.”

The department is currently working on a new data science minor in partnership with the statistics department. As part of the minor, there will be a new, optional upper-year course on ethics in data science, adding to the department’s ethical electives.

When asked if it is fair to put ethical responsibility on individual computer science students in the first place, Mitchell said that the department should be instilling a sense of “knowledge and agency.”

“Students need to understand these implications, so that they can spot the situations where perhaps things are being done in a way that is not ethical, or could lead to significant negative impacts on society,” said Mitchell.

“… I want to make sure that students understand what the basis of ethical reasoning and social good are. And when they go out into industry … if they encounter a situation like that, they can make an informed choice.”

UBC isn’t alone in grappling with instilling ethics in computer science, but it can learn from other successful programs. One prominent example is the Embedded EthiCS program at Harvard University.

The program started when a group of undergraduate students approached Harvard professors Dr. Barbara Grosz, known for her groundbreaking contributions to natural language processing and the advancement of women in science, and Dr. Alison Simmons, who specializes in philosophy of mind, about involving ethics in the CS curriculum. The two then put their heads together to develop a framework that included ethical issues in the curricula without placing an undue burden on CS faculty.

From there, the Embedded EthiCS program was born — an initiative that pairs philosophy graduate students with computer science faculty members to develop modules on ethical issues that could emerge naturally from the technical content covered in class.

“So that in the best-case scenario, it doesn’t feel like something that’s being added on in some ad hoc fashion, but it’s something that is actually really relevant to one’s course of study in computer science,” said Dr. Jeff Behrends, lecturer of philosophy at Harvard and co-director of Embedded EthiCS.

The developed course modules are all open source. Some topics include ethical tradeoffs in system design for an operating systems course, tackling censorship by compromising privacy in an advanced computer networks course and the ethics of natural language representation in a systems programming course.

According to Behrends, both student and faculty response to the initiative has been “tremendously positive.” Adopting Embedded EthiCS modules into a course is entirely up to the discretion of the course’s teaching staff and as of this academic year, the initiative is operating in roughly half of the undergraduate computer science offerings in any term.

“So [there’s] definitely tremendous enthusiasm from the computer science faculty, who are inviting philosophers into their classroom. There’s just no way we could do this without their enthusiasm,” he said.

Because the program arose from student awareness of the issues covered, and because the reasoning and communication skills are so intertwined with the technical aspects of the class, Behrends said students are finding the modules relevant to their educational experience as future computer scientists.

“It is true that our curriculum is packed. But we’re not above reform.”

— Dr. Cinda Heeren, professor in the department of computer science

With any pedagogy — but especially within a field that evolves so rapidly — there is the concern that what students learn within the modules may not be timely or applicable once they finish their degrees. To combat this, Embedded EthiCS has been developed so that the modules’ objectives are skills oriented. Instead of simply teaching students how an algorithm should be designed, the program focuses on empowering students to recognize a large range of ethical issues and how to evaluate them in different contexts.

“Those kinds of skills, especially the reasoning skills, aren’t topic specific,” said Behrends.

“Our hope is that for someone who’s gone through the computer science curriculum … [and] has encountered many Embedded EthiCS modules and has been given many opportunities to practice developing new skills, they’ll just be able to do some things in professional settings, or in political settings or in whatever social settings they find themselves in which testing methods are relevant.”

Furthermore, once a module is implemented in a course, the course instructor and a philosophy graduate fellow update it at the beginning of each term it’s offered to reflect changes in technology and current events. The end result is a “custom-built ethics module” every term, said Behrends.

It’s important to note that the Embedded EthiCS pedagogy isn’t necessarily a silver bullet for intertwining ethics with a CS education at every institution. Harvard and UBC are different schools with different populations and different approaches to education. Despite this, Behrends noted that the Harvard team is always interested in working with other institutions to “get the goods that we’re aiming for” in whatever form works best.

Before addressing how to properly include ethics in the computer science curriculum, there’s another question worth answering: Why was this missing, to begin with?

Dr. Sharon Stein, an assistant professor in the department of educational studies whose research focuses on ongoing colonial patterns of higher education, explained that if greater coverage of ethics in CS curricula doesn’t also explain why that coverage wasn’t there before, students won’t learn from those historical mistakes.

“There’s often this move to say, ‘Okay, we’ve excluded X knowledge, let’s bring it in,’” said Stein. “… When we do that without understanding this history and context of why and how those things have been excluded, generally, when we include them we do it in very conditional, instrumental ways that [don’t] really interrupt the continuity of business as usual.”

Certain schools of thought have been devalued in computer science for reasons often rooted in Anglo- and Euro-centrism, racism, sexism and just about every other -ism you can think of. Without reckoning with the field’s past problems, an initiative to introduce ethics to the curricula may be built on top of what allowed these inequities to occur in the first place.

“I want to make sure that students understand what the basis of ethical reasoning and social good are. And when they go out into industry … if they encounter a situation like that, they can make an informed choice.”

— Dr. Ian Mitchell, head of the computer science department

Stein highlighted a few ways to help ensure incorporated equity and ethics in curricula aren’t just surface level.

First is to situate computer science at UBC within a broader ecology of knowledge, so it’s understood that there are many different ways of relating to the world. This teaches that while knowledge systems are indispensable, each has its own limitations.

Second is to expand our idea of intellectual rigour so that it doesn’t just include technical knowledge, but also an understanding of the impacts on, and accountabilities to, the different communities affected by the knowledge a discipline produces.

And third is to facilitate a culture of humility so that when new approaches to knowledge are presented, they’re not met with hostility. According to Stein, this humility ensures that those who feel a “commitment to understanding the systemic and historical ways that our institutions, our disciplines and us personally have been complicit in social and ecological harm … can do that without … falling into a spiral of shame or guilt that doesn’t necessarily go anywhere.”

Ruhaak, whose Mozilla fellowship focuses on data trusts, added that there would be potential benefit in providing more opportunities within a CS education for students to interact with people in different disciplines and from diverse backgrounds.

“I think, if you look at the tech sector right now, a lot of the front-end developers don’t come from CS backgrounds, but from majors in history or some other social sciences,” they said. “But that’s not true on the very deep, infrastructural levels [of software]. But that would be interesting to explore.”

However, this is much more easily said than done.

“We have sort of been conditioned to think that our worth is premised on what we know,” said Stein. “… Especially [for] people whose positions have been universalized, it can be extremely uncomfortable, disorienting to have that [universal application] questioned.”

Beyond the difficulties that can arise when students might be forced to confront topics they disagree with, other concerns arise regarding the limitations of even including ethical reasoning and communication skills in computer science curricula.

For one, even if an individual were to receive the greatest education on ethics in the universe, there is only so much that they can control about the outcomes of their work.

“I don’t think there’s anything that anyone can do to ensure that they’re not doing something that won’t be misused by others,” said MacLean. “A lot of us in the university environment are working on stuff that’s quite fundamental and is very far from being applied. It’s going to branch and go so many different directions … so it’s just impossible to really predict that.”

Heeren also pointed out that technology developed through ethically sound mechanisms could just as easily turn sour in the wrong hands.

“You can think of it as the same technology underlies the atom bomb as nuclear energy. I don’t even know if that’s technically true, but it’s in the wielding of it, not the technology itself, necessarily,” she said.

(Nuclear reactors and atom bombs both utilize nuclear fission, but they differ in how the fission is controlled and how much their fuel is enriched with fissile material.)

There are also those who think that university is too late to be asking people to begin thinking critically about how their work and actions impact other people. In response to The Ubyssey’s ethics and tech survey post, one Reddit user wrote that “Social studies in high school [has] already covered the part of how to be a decent citizen.”

Just because difficulties arise in including ethical reasoning in the computer science curriculum doesn’t mean nothing should be done about it.

Ideally, high school graduates would arrive at post-secondary with infallible ethical reasoning skills, but until then, Stein said that the only place educators can meet people on these topics is where they already are.

“It would be great if we start from birth to learn to be more sober, mature, discerning, accountable people,” she said. “… But we start where we’re at, and that’s all we can do, right?”

It’s also important to keep in mind that while computer science graduates start at the bottom of a corporate ladder, over time they often work their way up into leadership positions. It’s in these positions that the value of instilling ethical considerations shows itself.

“When I look at the leadership of the tech sector now, there’s a wide variation of some that are actually showing some leadership and some, you’re just like, ‘I can’t believe how clueless they are,’ in terms of what values are at stake and how to manage them,” said Sauder’s Dr. David Silver.

Because of the influence of these roles, Silver thinks that resistance to integrating ethical understanding and reasoning skills into the curriculum should be faced head-on.

“I think it’s resistance that can and should be overcome.”


Today, almost every facet of society uses and benefits from computing technology, whether it be governance or communication or education. The invention of the world wide web means you can connect with someone just about anywhere on the planet. At the same time, “a handful of Big Tech corporations now wield more power than most national governments.”

In this period where Big Tech is aiming to secure both monopoly and monopsony positions, there is not a single aspect of society that won’t be impacted by the decisions of computer scientists.

As for who might ensure that the weight of these issues is expressed throughout a computer science degree — well, it’s really up to everyone.

“It’s the job of the youth to push institutions, to challenge them, to not be caught up in the momentum of the way things have been done,” said Silver. “… Students have a responsibility because they’re the ones experiencing their education, they can see when it’s not working.

“… And it’s our job [as faculty] sometimes to say, ‘That’s a little nuts.’ Or to respond like, ‘Oh, okay, you got it.’”

“We need to be inculcating our students with the idea of being a global citizen, a part of society,” said MacLean. “We shouldn’t be training you just to be a technologist and thinking about great research topics, or how to be a really good computer scientist and code really well. We also have to be teaching you to be alert to ethics issues, so that when you go out and are practicing your trade … that you’re watching out for this stuff as it develops.

“Because it’s our students who are the ones who are going to be developing applications, putting this into technology that is directly affecting people’s lives. … You’re on the frontlines. You’re the ones who can say ‘No, this is not right. I shouldn’t be doing this, this will have big societal consequences which I do not want to be part of.’

“And that’s where the line needs to get drawn.”
