Socially Responsible Computing TA Program

Through the Socially Responsible Computing Teaching Assistant (STA) program, STAs are assigned to individual courses and work closely with course staff to develop content connecting the technical material in a class to its social context. The goal of the STA program is to make critical thought about the social impact of technical systems an integral part of the design and development process. The purpose of embedding SRC content throughout the CS curriculum, rather than in a standalone course, is to repeatedly expose students to the social, political, and ethical aspects of computing and directly relate these issues to technical choices.

Why is social responsibility important in CS?

  • As the role of technology increases in nearly every area of life, the work of computer scientists can increasingly exacerbate systems of social inequity.
  • Technical decisions, if widely adopted, can intentionally or inadvertently function as policy decisions, and computer scientists are often given broad leeway to make those decisions.
  • Computer scientists must understand the societal context of their work and the social implications of different design choices.
  • Those who work in tech must consider the consequences of their work and collaborate across disciplines to avoid creating technology that is harmful or unjust.

What goals does the program have for students?

  • Recognize that technology is not neutral, nor does it exist in a vacuum; it contains built-in biases that reflect the preferences, norms, and worldview of its creators.
  • Build with everyone in mind.
  • Understand and fulfill the responsibility to advocate against unethical product or research decisions.

How does the program work?

Each course that opts into the STA program is assigned 1-2 STAs. STAs work closely with course staff to develop content connecting the technical material of a class to its social context and consequences. Each course is expected to touch on responsible computing multiple times during the semester, at least once as part of a graded assignment, with coverage starting earlier in the semester (rather than as an add-on at the end). In the past, STAs have embedded content in lectures, written questions, reading assignments, labs, and larger technical projects. See the Course table below for more details.

Frequently Asked Questions

How did the STA program begin?
In 2019, the Brown CS department started the “Responsible CS” initiative to integrate ethics and social impact topics broadly across its curriculum. The STA program, formerly called the Ethics Teaching Assistant (ETA) program, was the first step in this initiative.

Why was the role renamed from Ethics TA to Socially Responsible TA?
Our focus on Socially-Responsible Computing is broader than ethics: we include a combination of understanding power and technology, making ethical design decisions, building accessible systems, and testing systems for (un)desirable impacts on various stakeholders. We want students to make decisions based on social, historical, and personal perspectives that are not necessarily part of traditional "ethics" frameworks.

How does the STA program interact with the courses and professors?
For each course that the program works with, one or more STAs work as members of the course staff to integrate ethics content into the curriculum. Integrating content is a collaborative effort between the STAs, the professor, and the rest of the course staff.

How can I get involved as an STA?
STA hiring occurs at the same time as regular TA hiring. The MTAs will send out an email the semester before with information about applying for both UTA and STA positions. Many students serve as both UTAs and STAs!

How much are STAs paid?
The hourly wages for STAs are calculated with the same formula as regular TAs.


Course Table

Computing Foundations: Data (CS0111)
Topics: Introduce students to how people’s data exists in the world (in large, unregulated quantities), how data represents the world (incompletely and not objectively), how data is used in the world (usually in conjunction with other datasets; bought and sold), and who decides how it is used.
Format: A progression of methods for matching ads to users across multiple programming assignments. Short readings and written reflections associated with many homeworks. One peer review assignment. Two course projects augmented with a reading and reflection connected to the project theme. Short lecture discussions.

Computing Foundations: Program Organization (CS0112)
Topics: Expose students to contemporary social problems that underlie the application of the technical material in communities around the world. Specific topics include the cultural rifts caused by linguistic imperialism, whose effects are magnified by the scalability of programs; the obligation of coders and researchers to maintain high standards of accuracy and epistemic humility when modelling real-life phenomena; and the opportunities, as well as concerns, that come with web scraping.
Format: Written assignments embedded within the four projects, allowing students to delve deeper into the specific technical content and its corresponding societal, environmental, and cultural impacts. Students also iterate through a series of peer review assignments to encourage learning from multiple perspectives and to allow each student to receive feedback from multiple sources.

Intro to Object Oriented Programming (CS0150)
Topics: Introduce students to a broad range of topics in responsible computing, especially those relevant to news and current events. Engage students on a smaller range of topics through activities and discussions, and promote critical thinking around positionality, bias, and computing. Specific topics include regulation, moderation and misinformation, culturally-informed biases and their impact on real-world products, addictive technology, and issues in the tech industry as a whole.
Format: Biweekly mini-lectures on news topics, activities in every lab and section, and two extra-credit assignments.

Introduction to Algorithms and Data Structures (CS0160)
Topics: Introduce students to algorithms and data structures within real-world, socially relevant contexts. Promote critical thinking about ethical considerations and algorithmic fairness for all the code they write. Specific topics include bias, photo manipulation, and the spread of disinformation.
Format: Projects all have a social premise as well as conceptual ethics questions that ground the assignment within realistic, socially relevant problems. Sections involve discussions and activities surrounding ethical issues. One homework assignment centers on algorithmic fairness.

Computer Science: An Integrated Introduction (CS0180)
Topics: Introduce students to a framework of social threat modelling to encourage systematic approaches to identifying the impacts of their work, including the inherent biases of predictive algorithms, privacy considerations in OOP, cultural and linguistic assumptions in textual analysis, and resource consumption. These topics relate to the wider learning goal of identifying a program's potential impacts on resources (such as time, space, and energy), users, and society.
Format: Socially responsible computing is covered in multiple lectures, projects, and homework assignments. Assignments range from written reflections based on assigned articles that relate to in-class work to coding exercises that demonstrate a program or algorithm's adverse effects.

Accelerated Introduction to Computer Science (CS0190)
Topics: Introduce students to the impacts of software on the real world in a broad but rigorous context. Topics include, but are not limited to, adversarial thinking, data aggregation, names, user handling, D&I, data voids, and neutrality.
Format: Written assignments embedded in programming assignments.

Fundamentals of Computer Systems (CS0300)
Topics: Introduce students to social and ethical issues related to computer systems. Topics include data privacy, socially responsible system design, security, and accessibility.
Format: SRC questions embedded in each assignment, with 1-3 short readings and written questions.

Introduction to Software Engineering (CS0320)
Topics: Teach students about accessibility, user handling, security, and privacy through SRC content that is much more closely integrated with code than in most classes.
Format: Written reflections and coding assignments.

User Interfaces and User Experience (CS1300, Fall 2019)
Topics: Teach students about the implicit value judgements that come with design choices. Practice considering ethics during the design process. Topics include the Microsoft Guidelines for Human-AI Interaction, unethical metrics for A/B testing, dark patterns, accessibility, and bias in sampling.
Format: Ethics questions embedded in each assignment. Ethics content incorporated in some lectures and in-class activities. In one assignment, students choose ethical values for assessing interfaces and create one model that aligns with these values and one that operates against them.

Creating Modern Web & Mobile Applications (CS1320)
Topics: Introduce students to several ethical considerations in web and mobile development, paying specific attention to accessibility, privacy, and long-term impact. Give students opportunities to work through technical solutions to these issues, and encourage students to think critically about design choices and their impact.
Format: Each assignment and lab updated to include accessibility-related technical requirements. Readings and written questions added to a few assignments.

Artificial Intelligence (CS1410)
Topics: Explore social and ethical issues related to artificial intelligence. Topics include the unforeseen consequences of shortest path algorithms, dark patterns and government regulation, gender bias in speech assistants, machine ethics and autonomous vehicle liability, racial bias and healthcare decision making, and fairness in machine learning.
Format: Each homework assignment includes an SRC section with 1-3 short readings followed by 2-3 written questions. STAs lead optional discussion sections focused on each homework, and students get extra credit for participating. One lecture focuses on fairness in machine learning.

Machine Learning (CS1420)
Topics: Introduce students to topics and frameworks for thinking about social issues and ethics in machine learning applications. Emphasize critical thinking about the social implications of processing data, writing algorithms, and post-model analysis as a fundamental part of the machine learning process. Connect students to resources for learning more about research and applications in machine learning ethics. Topics include algorithmic fairness, explainability, bias, and representation choices.
Format: Readings and responses about the ethical limitations of machine learning models; case studies about social issues surrounding machine learning applications; new datasets and projects to illustrate topics like algorithmic fairness in practice. Fairness and explainability explored in lecture.

Computer Vision (CS1430)
Topics: Introduce students to a broad range of social and ethical issues in computer vision, tackling problems such as photo manipulation and dataset limitations. Expose students to applications of computer vision that have social and ethical consequences, and encourage them to think critically about them.
Format: Each project includes written questions and readings targeting societal issues relevant to the rest of the project. Some topics expanded on in lecture. Final project includes a Socio-historical Context and Impact Report.

Deep Learning (CS1470)
Topics: Explore the societal and ethical implications of deep learning technologies that map directly to course concepts as well as broader systems-level concepts. Topics include interpretability, fairness, energy consumption, privacy, and deepfakes.
Format: Weekly homework assignments include written SRC questions. SRC content is integrated throughout lectures. One lab covers reducing gender bias in language models. Final project includes an ethical implications reflection.

Introduction to Computer Systems Security (CS1660)
Topics: Encourage students to consider the broader implications of computer systems security from the perspectives of users, software developers, and policymakers. By covering case studies about legal backdoors to encrypted systems, responsible disclosure, firewalls, and more, we hope to teach students to be mindful of their own security and privacy, as well as the power they hold over others’ security and privacy as computer scientists.
Format: Preliminary discussion topics integrated into the lecture slides, along with longer written questions for each of the six homeworks. These questions ask students to consider recent case studies, how users are impacted, and what responsibilities users, developers, and policymakers have in these cases.

Computational Molecular Biology (CS1810)
Topics: Identify and discuss the societal and ethical implications of computational biology techniques. Topics include genomic privacy, demographic bias in genome-wide association studies, bias in reference genomes, and the social implications of genetic markers.
Format: Each homework assignment includes a reading and a few written questions that connect the assignment and lecture material to a societal issue or real-life scenario. The final exam included an SRC question that asked students how they would respond to a socially relevant scenario.

Data Science (CS1951A)
Topics: Identify and discuss societal issues, best practices, and critical questions that data scientists should consider at each stage of a project. Demonstrate that data and its analysis are not inherently objective or fair, but rather have the potential to shape or magnify biases in judgements or processes of societal discrimination.
Format: Each homework includes a few written questions and readings. The semester-long final project includes a Socio-historical Context and Impact Report. One lab covers ML fairness and one covers data privacy. One lecture focuses on ethical and social issues throughout the data lifecycle.


Spring 2021

  • CS0150: Shira Abramovich, Ellie Madsen
  • CS0180: Gregory Dahl, Suyash Kothari
  • CS0300: Zeynep Aydin, Jenny Tan
  • CS0320: Solomon Boukman, Nick Young
  • CS1420: Rajyashri Battula, Dat-Thanh Nguyen
  • CS1430: Reet Agrawal, Neil Sehgal
  • CS1660: Abigail Siegel, Willem Speckmann
  • CS1951A: Lena Cohen, Gaurav Sharma
  • Head STAs: Lena Cohen, Shenandoah Duraideivamani
  • General STA: Issra Said

Fall 2020

  • CS0111: Shira Abramovich
  • CS0112: Suyash Kothari
  • CS0190: Nick Young
  • CS1410: Neil Sehgal, Amrita Sridhar
  • CS1470: Dybe Fredy Mwaisyange, Naomi Lee
  • CS1810: Hossam Zaki
  • Head STAs: Lena Cohen, Shenandoah Duraideivamani

Spring 2020

  • CS0111: Livia Giminez, Tzuhwan Seet
  • CS0112: Amanda Lee, Eli Morimoto
  • CS0160: Tzion Jones, Jessy Ma
  • CS0180: Lena Cohen, Shenandoah Duraideivamani
  • CS0320: Kiran Merchant, Heila Precel
  • CS1320: Shira Abramovich, Jamison Wells
  • CS1420: Karen Tu, Kelvin Yang
  • CS1430: Katie Friis, Isabella Ting
  • CS1660: Hannah Chow, Shawna Huang
  • CS1951A: Huayu Ouyang, Ben Vu
  • Head STAs: Jessica Dai, Stanley Yip
  • General STA: Aaron Zhang

Fall 2019

  • CS0111: Lena Cohen, Tzuhwan Seet
  • CS0150: Kendrick Tan, Rebecca Zuo
  • CS0170: Signe Golash, Heila Precel
  • CS1300: Andy Rickert, Stanley Yip
  • CS1470: Jessica Dai, Hal Triedman
  • Head STAs: Jessica Dai, Stanley Yip


  • Faculty Advisor: Kathi Fisler
  • External Advisor: Théo Lepage-Richer
  • Faculty Director: Ugur Cetintemel

Questions? Contact us at