
Swarthmore Chosen Among 15 Institutions to Participate in Responsible AI Program

Associate Dean of Faculty Ameet Soni regularly receives emails from the National Humanities Center (NHC) about projects supporting program or faculty development. But a few months ago, one stood out.
 
The NHC was announcing a new program, the Responsible Artificial Intelligence Curriculum Design Project, which aims to help 15 colleges and universities develop courses confronting the ethical questions raised by artificial intelligence. Soni, who taught computer science at Swarthmore before joining the Provost’s Office, was eager to participate. After discussing it with faculty members and Provost Sarah Willie-LeBreton, Soni put Swarthmore’s application together and submitted it.

Thanks to his efforts, Swarthmore has now been selected as one of a diverse set of research universities, historically Black colleges and universities (HBCUs), and liberal arts colleges to participate in the NHC program this summer.

“It can often be difficult to understand what artificial intelligence agents are doing,” says Soni. “We need to have important discussions about the ethical requirements, how we think about fairness and bias and transparency, and how artificial intelligence interacts with society.”

In AI-speak, an “agent” can be any human or artificial actor in a given scenario. But artificial agents can sometimes carry out tasks with discriminatory implications for which they were never explicitly programmed.
 
“When you search on Google for a job, we find that the ads that get populated on the side vary based on your gender,” Soni explains. “If a person searches for medical jobs, the machine learns certain correlations through the use of machine learning and AI techniques, and will often rank nursing jobs higher for women and surgeon jobs higher for men — that’s the algorithm learning those biases in the background.

“These are conversations that we’ve been having with colleagues in the humanities and within our department, so [the Responsible AI program] immediately excited me as something that would be great for our campus,” adds Soni, who co-piloted a course on ethics and technology with Krista Thomason, associate professor of philosophy, in 2019. “This is an opportunity to spur on those discussions and lead to courses that allow students to grapple with a lot of these important topics.”

Bridging the gap between the humanities and computer science at the College, the NHC program will help Swarthmore students and faculty debate such questions, including examples of discriminatory algorithms, more fluently.
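The ad-ranking dynamic Soni describes can be sketched in a few lines of code. The example below is a purely hypothetical illustration: it uses invented click rates and the scikit-learn library to show how a model trained on data in which gender and job category are correlated will reproduce that correlation in its predictions, even though no one wrote a rule telling it to.

```python
# Hypothetical sketch of the bias Soni describes. All numbers are invented;
# no real advertising system or dataset is implied.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "historical" data: the only feature is gender (0 = woman, 1 = man);
# the label is whether a surgeon ad was clicked. The bias is baked into the
# data: men click the ad 60% of the time, women 30% (made-up rates).
gender = rng.integers(0, 2, size=1_000).reshape(-1, 1)
clicked = rng.random(1_000) < np.where(gender.ravel() == 1, 0.6, 0.3)

model = LogisticRegression().fit(gender, clicked)

# The trained model now ranks the surgeon ad higher for men -- not because
# that rule was programmed, but because it learned the correlation from data.
print("P(show surgeon ad | woman):", round(model.predict_proba([[0]])[0, 1], 2))
print("P(show surgeon ad | man):  ", round(model.predict_proba([[1]])[0, 1], 2))
```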
 
“The responsible AI initiative is sorely needed at Swarthmore so that future engineers avoid reproducing systemic racism that we have seen in AI-based surveillance, policing, and financial systems,” says Katie Knox ’22, a French and Francophone studies and computer science major from East Lansing, Mich.
  
Swarthmore's reputation for teaching innovation and its ability to engage faculty and students in interdisciplinary learning were key considerations for the Responsible AI project, says Andy Mink, vice president for education programs at the NHC.
 
“Given Swarthmore's mission, it's important to strengthen connections between different fields,” says Thomason. “[This] initiative gives us an opportunity to design a course that brings together the expertise from the humanities and the natural sciences.”
 
Soni envisions the course as being available to Swarthmore students across the humanities and within computer science.
 
“Swarthmore is situated in a liberal arts environment, allowing us to address these questions from the perspective of many different disciplines,” he says. “With the [Responsible AI] program, we’re bringing humanistic expertise into these discussions, asking students to think critically about technology. Regardless of their majors, these are important skills both for their future careers but also in thinking about their roles as citizens.”
 
This June, Thomason and Lisa Meeden, a professor of computer science whose research focuses on artificial intelligence, will represent Swarthmore’s humanities and STEM disciplines at the NHC program.

At the end of the program, the NHC will compile all the courses developed by participating institutions, making them accessible to practitioners across the humanities and computer science. 

“We’re hoping that Swarthmore’s model will be a beacon for other institutions to bring this [topic] to their own campus,” says Soni.
 
The first Responsible AI course, which Thomason and Meeden will design at the NHC this summer, is expected to be offered in the 2023-2024 academic year. 

“The hope is that this wouldn’t be just a one-time thing,” adds Soni, “[but] a regular discussion and part of the curriculum at the College.”
