Profiles | 4 October 2023

Grad Profile: Dustin Gray


Dustin Gray is a PhD candidate in Philosophy at UC Santa Cruz. His dissertation focuses on humans’ relationships with, and reliance on, modern technologies. His work argues that the human compulsion to exercise power has led to the development and implementation of new technologies. Gray served as part of the inaugural Humanizing Technology Fellows cohort in 2022-2023, teaching the course HUMN 25 – 01, “Humans and Machines,” in Winter 2023. This summer, we learned more about Gray’s research, his experience collaboratively developing his “Humans and Machines” course, and his pedagogical approaches to emerging questions about humans and technology.


Hi Dustin! Thank you for chatting with us about your ongoing research! To begin, would you provide us with an overview of your dissertation project and what you are focusing on right now?

As I see it now, my dissertation will take the form of a critique/analysis of human reliance on modern technologies. I find that though many people concede that the use (and especially the overuse) of devices like computers, tablets, and smartphones tends to distract and disconnect them from the “real” world, the same users engage with them regularly. I want to argue that within an advanced technological society such as ours, we live in a landscape where there is no alternative but to engage with these modern contrivances. My critique will expand this view beyond devices and focus on that which drives their use. This could be anything from social media to online dating to AI-driven large language models such as ChatGPT and GPT-4. In doing so, I will argue that devices can more accurately be described as physical tools used to access non-physical data. The analysis of the value we place on both might link to the notion of mind and body, or, if you like, software and hardware, a comparison often made.

My hope is that the work done in this project will offer deeper insight into the relationship that humans have with machines and the fascinating connection that is made between something that is conscious and something that is not (though the latter seems to be up for grabs as AI advances further).

I plan to lean into Friedrich Nietzsche’s insistence that the mark of a great human is her “will to power.” This is the notion that humans are instinctually driven toward the most favorable conditions of existence. He further notes that in many cases, this leads to our greatest miseries. It is this unavoidable compulsion to exercise power that I will argue has led to the evolution of modern technology and our parallel evolution with it.

You are currently preparing your essay “The Will to Submit: Surveillance Technologies and their Impact on Academia” for publication in a volume prepared by the Center for Values in Medicine, Science, and Technology at UT Dallas. Could you share what are, for you, some of the exciting debates happening for contemporary scholars in the fields of data privacy and the ethics of AI?

Students should be given the choice to opt-out of the use of surveillance driven technologies such as Google; this [should] be a fundamental right afforded to all who work in academia and beyond.

What I find particularly compelling about this work is that there is not much being discussed about it in the philosophical community at large. Of course, many are talking about issues of online privacy and how agreeing to privacy policies and terms of use undermines the notion of informed consent. However, my argument is not merely that this is a problem; it clearly is. My argument is that academic institutions that mandate the use of these technologies do so to the detriment of their students. I maintain in this piece that students should be given the choice to opt out of the use of surveillance-driven technologies such as Google, and that this should be a fundamental right afforded to all who work in academia and beyond.

Congratulations on being named as an inaugural Humanizing Technology Teaching Fellow in 2022-2023! As part of the program, you taught a newly developed course, HUMN 25 – 01, “Humans & Machines,” in Winter 2023. Could you share a little about participating in the pedagogical design institute and your personal process for envisioning and designing the course?

We spent last summer designing this course. In doing so, I was able to draw on a wealth of experience offered not only by our faculty advisor but also by other grad students whose research centered on fascinating topics. The collaboration was interdisciplinary, which encouraged everyone to look outside their particular disciplines to create an innovative curriculum. I won’t lie, this had its challenges, but in the end, I think we were all able to learn from each other and, in doing so, create courses that bring a multitude of otherwise unrealized strengths to the drawing board.

Your pedagogical experience includes teaching “Science and Society” in the Philosophy Department, a course in which you explored the explosive growth of networked digital information technologies and even sought to engage students about the lasting psychological effects resulting from excessive exposure to social media platforms. I imagine your “Humans & Machines” class addressed similar questions. I’m curious what challenges or insights have come up as you have explored these topics with students, many of whom are part of Gen Z and have grown up surrounded by and intimately attached to digital technologies and social media?

This is the beauty of working with students with tech and engineering backgrounds: we had so much to learn from each other.

To be clear, I was a TA for “Science and Society.” Though I will say that the professor I worked under brought to light many of the fundamental concepts that have guided the path of my current research.

Regarding the growth of networked digital information technologies, I think that Gen Z students grasp their intricacies much more easily than someone from my Generation X does. I wager that this has to do with the fact that Gen Zers have lived with the internet their entire lives. It was already a working part of society out of the gate (as you mention). This has everything to do with the notion of “technological evolution” and our parallel evolution with it. There is a give and take: we develop these technologies to make our lives more efficient, yet their implementation in our lives creates a constant need for further expansion. This is what I feel best explains the accelerated growth of this type of tech.

I can’t say much about challenges, but regarding insights, there seems to be too much to say. The students I taught offered a vast amount of information about the way that modern technologies function. Not only that, but they were also quite aware of the theory that drives technological advancement. This is the beauty of working with students with tech and engineering backgrounds: we had so much to learn from each other.

What is one moment from teaching “Humans & Machines” last Winter that really stands out to you? 

In one section of the course, we focused on computational machines. Specifically, we addressed the question of whether machines can think. This is something that was explored in Alan Turing’s famous 1950 essay, “Computing Machinery and Intelligence.” Though he never claimed outright that machines (in this case, computers) could think, he did predict that by the year 2000, “one will be able to speak of machines thinking without expecting to be contradicted.”

The discussion that came from this centered not so much around whether computers could think but rather whether they think the way we think. We then started to wonder if we could even accurately describe the functions of our own human thinking. So, then the question became: do computers think the way we think we think? This is what I have found to be so valuable: that we really are considering machines and their relationship to humans, as the course title suggests.

Finally, what is your favorite spot on the UCSC campus?

Definitely the Cowell College Dining Hall.


Banner Image: Mural of the Google logo on the Pulaski Bridge in New York, depicting the two O’s in the company’s name as a pair of surveillance cameras.