---
layout: home
exclude: true
---

Ethical Reflection Modules for CS 1

[Top logo: image by Balu Ertl]

| Activity Quick Link | Programming Topic |
| --- | --- |
| Developers as Decision-Makers | Conditionals |
| Developers as Gatekeepers | Functions & Data types |
| Developers as Future Makers | For Loops & Lists |
| Developers as Image Manipulators | Nested Loops & 2D Lists |
| Developers as Prioritizers | OOP / APIs |

In Fall 2019, I redesigned our CS 1 course to integrate practice-based (coding!) reflection directly with technical concepts. This is a space to share those activities. Their goals are to:

  1. Introduce a deeper level of reflection in CS 1 courses. I want students to see that their actions either directly or indirectly impact people, communities, and cultures, and that this impact is often not felt equally by different groups of people (along lines of gender, race, class, geography, etc.).
  2. Develop reflection habits alongside coding habits - all modules involve programming! I believe that habits are formed early in CS and must be tightly coupled with technical concepts in order for them to stick.
  3. Pair directly with existing CS 1 curriculum - CS 1 is already a busy course. You don't need to set aside a month of new material. I believe that reflection and responsible computing pair directly with technical concepts already taught (conditionals, for loops, etc.).

What these activities are not:

  • They are not a replacement for teaching students about issues of cultural competency and identity. While computer scientists can (and should) point to those issues in class, most of us are not the experts. Students should be taking courses that speak directly to the structures of power into which they will be introducing systems (including gender / race / ethnicity / class / geography / etc.).
  • They do not teach students what the correct design is. They prompt students to reflect on the human consequences of their decisions. Sometimes, students answer, "I'm not sure I can design this well enough to prevent harm." That's a great answer too. Choosing not to build something is okay.

Note: If you are looking for the old homepage of this site, click this link


Programming + Reflection Activities

housing algorithms What are the consequences when we turn people into numeric scores for algorithms? Who benefits and who is disadvantaged by our decisions?

This assignment appeared as part of ACM SIGCSE's Nifty Assignments track. You can cite that work with:

Nick Parlante, Julie Zelenski, John DeNero, Christopher Allsman, Tiffany Perumpail, Rahul Arya, Kavi Gupta, Catherine Cang, Paul Bitutsky, Ryan Moughan, David J. Malan, Brian Yu, Evan M. Peck, Carl Albing, Kevin Wayne, and Keith Schwarz. 2020. Nifty Assignments. In Proceedings of the 51st ACM Technical Symposium on Computer Science Education (SIGCSE '20). Association for Computing Machinery, New York, NY, USA, 1270–1271. DOI:https://doi.org/10.1145/3328778.3372574
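To make the housing-scoring idea above concrete, here is a minimal, hypothetical Python sketch of the kind of conditional-based scoring students confront. The function name, inputs, and thresholds are invented for illustration; they are not taken from the actual assignment.

```python
# Hypothetical sketch (not the assignment's actual code): a conditional-based
# scoring function that reduces a housing applicant to a single number.
# Every threshold below is a value judgment about whose circumstances "count."

def score_applicant(income, years_employed, has_prior_eviction):
    """Return a numeric score for a housing applicant (higher = 'better')."""
    score = 0
    if income >= 40000:          # why 40000? who does this cut off?
        score += 3
    if years_employed >= 2:      # penalizes gig and seasonal workers
        score += 2
    if has_prior_eviction:       # no room for *why* the eviction happened
        score -= 5
    return score

# Two applicants, one number each -- the nuance of their lives is gone.
print(score_applicant(38000, 5, False))   # 2
print(score_applicant(45000, 1, True))    # -2
```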


input validation What assumptions do we make about the people using our technology? What are the consequences of those assumptions, and who might we exclude? How do we capture diversity through design?
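As a hedged illustration of how "reasonable-looking" validation code encodes assumptions, here is a small, hypothetical Python sketch. It is not the module's starter code; the rules it enforces are invented to show how easily a validator excludes real people.

```python
# Hypothetical sketch (not from the actual module): a name validator whose
# hidden assumptions exclude real people.

def is_valid_name(name):
    """Accept a name only if it matches narrow, ASCII, two-word expectations."""
    parts = name.split()
    if len(parts) != 2:                 # excludes single names and multi-part names
        return False
    for part in parts:
        if not part.isalpha():          # excludes hyphens and apostrophes (O'Brien)
            return False
        if not part.isascii():          # excludes names like "José" or "明美"
            return False
    return True

print(is_valid_name("Ada Lovelace"))    # True
print(is_valid_name("José García"))     # False -- excluded by the ASCII check
print(is_valid_name("Madonna"))         # False -- excluded by the two-word rule
```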


ethical hiring What does it mean to design a fair algorithm? What is the human cost of efficiency? What systemic advantages/disadvantages are your algorithms likely to amplify?

This assignment appeared as part of ACM SIGCSE's Assignments that Blend Ethics and Technology special session. You can cite that work with:

Stacy A. Doore, Casey Fiesler, Michael S. Kirkpatrick, Evan Peck, and Mehran Sahami. 2020. Assignments that Blend Ethics and Technology. In Proceedings of the 51st ACM Technical Symposium on Computer Science Education (SIGCSE '20). Association for Computing Machinery, New York, NY, USA, 475–476. DOI:https://doi.org/10.1145/3328778.3366994
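Below is a rough, hypothetical sketch of the list-and-loop ranking logic the hiring activity above interrogates. The data fields and the single ranking criterion (GPA) are illustrative inventions, not the assignment's actual specification.

```python
# Hypothetical sketch: choosing the "top" candidates by one metric.
# The loop is efficient; the question is what the one metric leaves out.

def top_candidates(candidates, k):
    """Return the k candidates with the highest GPA."""
    ranked = sorted(candidates, key=lambda c: c["gpa"], reverse=True)
    return ranked[:k]

applicants = [
    {"name": "A", "gpa": 3.9},   # attended a well-resourced school
    {"name": "B", "gpa": 3.4},   # worked 30 hours/week through college
    {"name": "C", "gpa": 3.7},
]
print(top_candidates(applicants, 2))   # GPA alone decides; context never enters
```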


averaging faces How does representation in a dataset impact an algorithm's outcome? Is it possible to create a representation that treats all people fairly? What are the possible implications of facial recognition software when it is used on historically marginalized groups?
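For readers unfamiliar with the mechanics, here is a minimal sketch of pixel-wise face averaging using plain 2D lists and nested loops. It illustrates the general technique rather than the activity's actual code; the tiny 2x2 "images" are invented for the example.

```python
# Hypothetical sketch: averaging grayscale images stored as 2D lists.
# The "average face" can only reflect whoever made it into the dataset.

def average_images(images):
    """Return the pixel-wise average of same-sized 2D grayscale images."""
    rows, cols = len(images[0]), len(images[0][0])
    avg = [[0] * cols for _ in range(rows)]
    for img in images:
        for r in range(rows):
            for c in range(cols):
                avg[r][c] += img[r][c]
    for r in range(rows):
        for c in range(cols):
            avg[r][c] /= len(images)
    return avg

# Two tiny 2x2 "images" -- the average sits between them, and matches no one.
print(average_images([[[0, 10], [20, 30]], [[100, 110], [120, 130]]]))
# [[50.0, 60.0], [70.0, 80.0]]
```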


rescue What is 'moral' behavior in the context of a computer? How do we write code that is forced to assign value to people? What are the implications of our representation decisions?

Although it is not peer-reviewed, people have pointed to my reflection on Medium when looking to cite this work:

Evan Peck. 2017. The Ethical Engine: Integrating Ethical Design into Intro Computer Science. https://medium.com/bucknell-hci/the-ethical-engine-integrating-ethical-design-into-intro-to-computer-science-4f9874e756af
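Here is a small, hypothetical Python sketch of the core tension in the rescue activity above: a class determines which attributes of a person the program represents at all, and the decision function is forced to rank people. The class fields and the "more people wins" rule are invented for illustration and are not the Ethical Engine's actual design.

```python
# Hypothetical sketch: the representation (the class) and the rule (the
# function) both quietly assign value to people.

class Person:
    def __init__(self, age, profession):
        self.age = age                 # choosing these fields is itself a value judgment
        self.profession = profession

def choose_to_rescue(group_a, group_b):
    """Return the group the program rescues -- the other group is not rescued."""
    # Any rule that compares groups of people assigns them comparable value.
    # Here the rule is simply "more people wins" -- itself an ethical choice.
    return group_a if len(group_a) >= len(group_b) else group_b

saved = choose_to_rescue([Person(8, "student")],
                         [Person(45, "doctor"), Person(70, "retired")])
print(len(saved))   # 2 -- the code has decided that the larger group "wins"
```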


License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.