Kalea Cesaire (Crown ’26, computer science) is part of the generation that can’t remember—and can’t imagine—life before smartphones, Facebook, and Google.
But Cesaire, who is studying software engineering at UC Santa Cruz, also worries about the human cost of rapidly expanding technologies such as facial recognition software.
“My most pressing concerns are the erosion of privacy and the potential for increased surveillance,” said Cesaire, who is a board member of the UCSC chapter of the National Society of Black Engineers and an executive intern with GraceHacks, a hackathon for female-identifying and gender-diverse students.
Hoping to confront some of the most problematic aspects of tech, Cesaire wanted to broaden her perspective and think about her future profession in new ways, apart from her usual regimen of engineering classes. This spring, she enrolled in the new Humanizing Technology Certificate Program, made possible with a $149,500 grant from the National Endowment for the Humanities.
This initiative offers a new set of specially designed Humanities General Education (GE) courses that address the moral and social implications of new technologies. Students work collaboratively to envision a different kind of tech future, one that addresses issues such as racism rather than perpetuating them.
The program has been galvanizing for Cesaire, who is planning for a career in cybersecurity.
“Many computer science and engineering majors don’t usually think about humanities or ethics outside of fulfilling their GE requirements,” Cesaire said. “When I found out about this program, I thought it sounded like a great idea to get engineering students to think about technology outside of a technical perspective.”
Cesaire hopes to complete the certificate program by the end of the summer, but she’s so impressed by the offerings that she may take more courses even after reaching that goal.
“So far my main takeaway from these classes is that race and racism are ingrained in almost all technologies we use today and we don’t even notice,” said Cesaire. “Anything created in a society built upon race and racism is bound to be based upon those same standards as well. Emerging technologies, such as facial recognition, biometric data collection, and inescapable data tracking, raise the potential for abuse, including the misuse of personal information of consumers.”
A unique partnership
UCSC Humanities Dean Jasmine Alinder, the principal investigator for the Humanizing Technology initiative (Photo by Carolyn Lagattuta)
Humanizing Technology is the result of a cross-disciplinary partnership among the Humanities Division, the Baskin School of Engineering, The Humanities Institute (THI), and the Teaching & Learning Center, a merger of the former Center for Innovations in Teaching & Learning and Online Education.
UCSC Humanities Dean Jasmine Alinder, the principal investigator for the Humanizing Technology initiative, spoke about the importance of humanistic solutions to pressing tech issues.
“Our goal for the certificate program is to explore how humanistic training can help students understand the impacts of new technologies, and train students who can attest to the relevance of humanistic thinking, not simply for their occupational life, but for navigating their values and place in the world,” Alinder said.
“We want students to have opportunities to grapple with ethical, cultural, and social questions that do not have clear ‘right’ answers, particularly so that our students who wish to work in the area of technology in the future have an opportunity to think beyond optimization, the race for innovation, and corporate bottom lines,” said Jody Greene, professor of literature and outgoing director of the Center for Innovations in Teaching & Learning. Greene initially proposed the development of a Humanities Certificate for Engineers in 2019 and is a founding director of the Teaching & Learning Center.
While this certificate program was designed especially for early-career engineering students, it is open to all UCSC undergraduates, who can take discussion-based GE courses including Race and Technology, Humans and Machines, Ethics and Technology, Technologies of Representation, and Language Technology.
This fall, the certificate program will add Artificial Intelligence and Human Imagination, a new course that Zac Zimmer, associate professor of literature, has designed as part of the Responsible Artificial Intelligence Curriculum Design Project from the National Humanities Center.
So far, the program has drawn 64 engineering students, most of them computer science majors, and has attracted particular interest from students majoring in the social sciences, especially psychology and economics. Interest is strong overall: 176 students have participated to date, though each class limits enrollment to maximize small-group discussion.
The hunger for this kind of learning suggests a desire for the nuanced dialogue and analysis that humanities courses make possible, said Jim Whitehead, professor of computational media and associate dean for undergraduate experience at the Jack Baskin School of Engineering.
“Humanities gives us an incredibly rich set of lenses for exploring each new technology, be it traditional power dynamics, a wealth of different identities and their intersections, the history of technology, or rhetorical analysis,” Whitehead said.
This initiative is just one way in which the Humanities Division has been delving into the implications of technology. Technology is also The Humanities Institute’s theme for 2023–24, linking together public events like The Deep Read and Questions That Matter, as well as a wide-ranging set of research clusters.
Taking a deep dive into the humanities
For most of his life, Thomas Toy (Oakes ’24, computer science) has been preparing for a career in tech. Before transferring to UCSC, he was part of a robotics club at Skyline College in San Bruno, Calif. But lately, he’s been wondering about the unforeseen negative effects of technologies that are expanding beyond human control.
“My most pressing concern about the impacts of emerging technology is how AI is rapidly taking jobs away,” Toy said.
According to recent reports, generative AI could lead to lower wages and layoffs, especially in administrative and legal professions. The Washington Post ran a news feature about the repercussions of those job losses.
“Of course AI won’t take all jobs away, but I also feel like it will start lowering the salary for some tech jobs,” Toy said. “Things like [the artificial intelligence chatbot] ChatGPT can write a lot of code when prompted correctly, and with how rapidly it’s improving, I can foresee a future where a lot of the current tech jobs will no longer need to be done by humans.”
Before enrolling in Humanizing Technology courses, Alexander Tang (Cowell ’24, network and digital technology) had a few concerns about tech, though they were limited to misgivings about Bitcoin and AI. The courses in the new certificate program made Tang consider the ways in which discrimination can become built into tech over time.
“There are a lot of issues you need to consider if you want to create tech that is truly inclusive and free from bias,” Tang said. “Certain forms of technology are ‘invisible’ because of how normalized they are. The way we construct new technologies may be built on biased tech—encoding those biases.”
Linguistics professor Pranav Anand demonstrates book making as part of the Language Technology course. (Photo by Carolyn Lagattuta)
In referencing “invisible tech,” Tang was referring to forms of technology that employ artificial intelligence but have no user interface, so it’s impossible to see them at work. These include the algorithms that draw on user clicks and past purchases to curate search results and social media content.
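To make that invisibility concrete, here is a minimal, hypothetical sketch in Python of behavioral re-ranking: results are quietly reordered using a user’s past clicks and purchases, with nothing in the interface revealing that the logic exists. The function names, weights, and sample data are illustrative assumptions, not any real platform’s system.

```python
# Hypothetical illustration of "invisible" curation: search results are
# quietly re-ranked using behavioral signals the user never sees applied.
from collections import Counter

def rank_results(items, clicks, purchases):
    """Order results by interest inferred from past clicks and purchases."""
    click_counts = Counter(clicks)
    purchase_counts = Counter(purchases)

    def score(item):
        # Weighting purchases three times more than clicks is an arbitrary
        # illustrative choice, not a real platform's formula.
        return 3 * purchase_counts[item["category"]] + click_counts[item["category"]]

    return sorted(items, key=score, reverse=True)

catalog = [
    {"title": "Wireless earbuds", "category": "electronics"},
    {"title": "Garden trowel", "category": "home"},
    {"title": "Phone charger", "category": "electronics"},
]

# The interface shows only the reordered list; the signals stay hidden.
print(rank_results(catalog, clicks=["electronics", "electronics"], purchases=["home"]))
```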
The Humanizing Technology courses have also drawn students from areas of study outside computer engineering. Indigo Price (College Nine ’26, psychology and philosophy) said that the Ethics and Technology course taught by Mark Howard (M.A.’22; Ph.D. cand., politics) was inspiring and career-changing.
Price, who uses they/them pronouns, has always been immersed in technology. Price’s father is a Microsoft employee, and their mother works in cybersecurity.
But the class has made Price think harder about the implications of living in a society where a total disconnection from technology is impossible.
“We’ve now reached the point where we can’t not use these things,” Price said. “But the class has made me think much more about the hidden costs of these so-called ‘free’ technologies. Facebook, for example, is free to download, free to share on, but if it directs your eyes and your attention to ads chosen by the algorithm, ads that influence your decisions, you become a product that is being sold.”
Mark Howard, a Ph.D. candidate in the Social Sciences and Humanities divisions and a Humanizing Technology instructor. (Photo courtesy Mark Howard)
Price and the other students did a close analysis of Sasha Costanza-Chock’s influential book Design Justice: Community-Led Practices to Build the Worlds We Need, which shares the stories of marginalized communities that have developed their own design and technology practices, along with “best practices” for technology workers who are hoping to make positive changes.
Because of Humanizing Technology, Price hopes to find work as a technology ethicist who consults with companies to make sure their tech practices are socially and ethically responsible.
Changing career paths and opening minds
Mark Howard, Price’s Humanizing Technology instructor, is doing more than just introducing his students to readings about the often invisible downsides of tech. He is also inviting them to use cutting-edge tech to study its limitations.
In his course this past spring, Howard gave his students a dense and difficult philosophical text to analyze, while allowing them to use ChatGPT to help develop their analysis.
Some students let ChatGPT do the work for them, exploiting the tool as a shortcut that gave them the text’s headline points but missed much of its nuance and deeper meaning, Howard said.
Pranav Anand, professor of linguistics and faculty director of The Humanities Institute. (Photo by Carolyn Lagattuta)
“So yes, you get an analysis of the text, but you miss so much of the detail, even though you’ve had access to the most remarkable information processor that’s ever been made available to you,” Howard said. “I sensed how much that sank in with them. I might be wrong, but I really think that from that point on, they will use ChatGPT judiciously.”
Howard has noticed even more significant influences on some of the students’ lives. One computer science student, after enrolling in Howard’s course, decided not to take a tech job that troubled him for ethical reasons.
The classes also reach into the past to give students a broad context for talking about 21st-century tech concerns. “Worries about viral content, misinformation, and censorship have been around for centuries,” said Pranav Anand, professor of linguistics and faculty director of The Humanities Institute, who is teaching a Language Technology course as part of the program.
Anand is co-principal investigator for the Humanizing Technology initiative, along with Laura Martin (M.A.’08; Ph.D. ’12, literature), a Porter College continuing lecturer and THI’s program manager of the Deep Read. He pointed out that worries about propaganda arose with the spread of the printing press around the globe in the 15th and 16th centuries and the advent of radio broadcasts in the 20th century, creating the legal and ethical frameworks we are now using to evaluate social media.
“When we are trying to understand a piece of technology, many of the most important factors are not technical but the economic, social, and political systems in which that technology was created,” Anand said. “Those are the main drivers of the perennial challenges of all technologies in human history: systemic bias, exclusion, differential access, and social disruption. And any attempt to mitigate those challenges can only begin by understanding the enduring forces behind them.”
View more information about the Humanizing Technology Certificate Program, including a series of brief videos outlining the goals and impacts of the project.