Tuesday, May 1, 2018

What Ethical standards SHOULD be in place - EthicalCS Day 3

After the weekend, I wanted to put a bow on everything we had talked about.  Going back to this tweet, and more specifically, the whole thread...


... I asked students to develop their own code of ethics for computer scientists.

First, we warmed up with this 60 Minutes video on The Coming Swarm, which raises issues around AI in the military.  It is very different from the data story we were talking about before, but it still (explicitly) raises some ethics questions around computer science.

I asked students to consider "Is automation a good thing or a bad thing for wars?"  For the most part, students saw the challenge in this question - it is a "good thing" if we are the ones with the automation, but as more and more people get the technology... it gets iffy.  Through the video and the tweet, I talked about how the atomic bomb was a "good thing" until we saw how horrible it was.  The people who helped create the atomic bomb had some ethical obligation to society - once again, we didn't recognize the monster we were creating.  Could that be happening right now?

From there we transitioned into talking about ethical oaths.  I gave students this handout, which we read together to connect ethical oaths to something students are already somewhat familiar with: the Hippocratic oath.



Students were then tasked with writing their own set of ethical laws for computer scientists.  I walked the room while they had this discussion.  It was FASCINATING to watch privilege pop up in these discussions.  One student was adamant that this could be solved if people instead paid Facebook to keep their data private.  This started a good conversation about who all has your data - are you willing to pay all of them?  What does that mean for people who cannot afford to pay to keep their data private?  Is privacy a privilege and not a right?

I could have had 38 different conversations about what students were thinking, but in the end, here were some of their more interesting "oaths for computer scientists":



Here were some common themes:

  • As we shared them out, it was clear that a lot of the "oaths" were at odds with core business ideas.  They weren't bad rules to live by, but they were also unsustainable for tech companies.
  • It was interesting that a lot of students put down something about making terms and conditions easier to read.  This seems doable, and there are precedents for this sort of thing.  Nutritional information was standardized to be easier to read and understand in the US - looking at the label on a cereal box gives you the big picture of how "healthy" the food is.  There is also a movement to do this for medication that you get from the pharmacy: labels that are easier to read and that use symbols to highlight the most important information.  There is some movement in this direction with Creative Commons license labeling.
  • Many students recognized the need to diversify the industry.  It was encouraging to see students of all backgrounds (including the white male students) recognize this.  I realize actually DOING something about it is different than recognizing it, but I think it is a solid first step.
  • Students seemed to hold three different stakeholders responsible: corporations, individuals, and the government.  It strikes me that this is pretty similar to the financial industry.  It was a bit surprising that some students placed more responsibility on the individual/consumer.  I wonder if that is a bit of an easy way out - to blame the individual rather than question the system that put that individual in that situation.

We shared out some of these ideas and poked at them a bit together.  I had them turn in their sheets so I could read them a bit more.

Finally, after hearing from the Twittersphere, I asked students to complete the following sentence on their sheet of paper:

Ethics in computer science is...

Here were some of their responses:

Next Time Around
If I were to do this again, I think I need to slow down a bit, tell a story, and consider a final product.  I thought of the ethics rules as a final product, but students weren't as motivated as I had hoped.  I am wondering if consolidating these into one code of ethics would have helped students fight a bit harder for or against some of these regulations.  I wonder if I could even have them send a letter to a representative, a consumer protection bureau, a researcher, or a company to try to convince them to adopt an ethics rule that they came up with.  Perhaps that would make the audience more authentic.

I also think I could slow down portions of this mini detour.  Having students read the medical ethics and then go straight into developing their own code was a bit abrupt.  I was hoping to prime them a bit for the activity, but I think I should have slowed down and asked students "why do you think this rule was put into place?"  There is a great amount of history behind the rules, and many of them are in reaction to a societal issue; perhaps I could have teased that out of them a bit more.  Then they might be ready to consider what societal issues technology (or the industry) might cause.  I gave them a bit of a list of "things to consider," but some students relied solely on that list.

Another detour of the detour could have been looking at the ethics of the scientists who built the atomic bomb.  Students seem to get the need for ethics in CS, but I think this story might make them see it as imperative rather than a "nice to have" type of thing.

I am curious as to what others do out there!  How do you make ethics come alive in a high school computer science class?  What is fair to expect of students? 

There were certainly times when students expressed frustration that they would rather be programming.  At the same time, I know this is good for students (and perhaps society at large) to consider as well.