Technology

ChatGPT: Fear and Loathing

I wanted to spend some time thinking through the fear and loathing ChatGPT generates in the academy and what lies behind it. As such, this post is less a well-written essay than a cocktail party of ideas and observations waiting for a thesis statement to arrive.

Rational Concerns

As I have already mentioned (and will mention again below), the academy tends to be a conservative place. We change slowly because the approach we take has worked for a long time.

A very long time.

When I say a long time, consider that some of the works by Aristotle studied in Philosophy classes are his lecture notes. I would also note that we insist on dressing like it is winter in Europe several hundred years ago — even when commencement is taking place in the summer heat of the American South.

While faculty have complained about prior technological advances (as well as how hot it gets in our robes), large language models are different. Prior advances -- say, the calculator/abacus or spell check -- have focused on automating mechanical parts of a process. While spell check can tell you how to spell something, you have to be able to approximate the word you want for the machine to be able to help you.

ChatGPT not only spells the words; it can provide them.

In brief, it threatens to do the thinking portion for its user.

Now, in truth, it is not doing the thinking. It is replicating prior thought by predicting the next word based on what people have written in the past. This threatens to replace the hard part of writing -- the generation of original thought -- with its simulacrum.
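The idea of "predicting the next word based on what people have written in the past" can be made concrete with a toy sketch. The Python below is a deliberately naive bigram model -- nothing like the neural network behind ChatGPT, and every function name in it is my own invention for illustration -- but it shows the statistical core of next-word prediction:

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count, for each word in the corpus, which words follow it and how often."""
    following = defaultdict(Counter)
    words = corpus.lower().split()
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def predict_next(following, word):
    """Return the word most often observed after `word`, or None if unseen."""
    counts = following.get(word.lower())
    if not counts:
        return None
    return counts.most_common(1)[0][0]

model = train_bigrams("the cat sat on the mat and the cat slept")
print(predict_next(model, "the"))  # prints "cat" -- it followed "the" most often
```

The model has no thoughts of its own; it only has a record of which words have tended to follow which. A large language model does something of the same kind at vastly greater scale and sophistication, which is why its output reads as a simulacrum of prior thought rather than new thought.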

Thinking is hard. It's tiring. It requires practice.

Writing is one of the places where it can be practiced.

The disturbing thing about this generation of simulacra by students, however, is that too many of our assignments ask them to do exactly that. Take, for example, an English professor who gives their students a list of five research topics to choose from.

Whatever the pedagogical advantages of such an approach, it is difficult to argue that the assignment is not asking the student to create a simulacrum of what they think their professor wants rather than to generate their own thoughts on a topic they are passionate about.

It is an uncomfortable question to have to answer: How is what I am asking of the students truly beneficial, and what "value add" do they receive from completing the assignment themselves instead of asking ChatGPT to complete it?

Irrational Concerns

As I have written about elsewhere, faculty will complain about anything that changes their classroom. The massive adjustments the COVID-19 pandemic forced on the academy produced much wailing and gnashing of teeth as we were dragged from the 18th Century into the 21st. Many considered retirement rather than having to learn and adjust.

Likewise, the story of the professor who comes to class with lecture notes, discolored by age and unchanged since their creation, is too grounded in reality to be ignored here. (Full disclosure: I know I have canned responses, too. For each generation of students, the questions are new -- no matter how many times I have answered them before.)

Many of us simply do not wish to change.

Practical Concerns

Learning how to use ChatGPT, and thinking through its implications, takes time and resources. Faculty Development (training, for those in other fields -- although it is a little more involved than just training) is often focused on other areas -- the research that advances our reputations, rank, and careers.

Asking faculty to divert their attention to ChatGPT when they have an article to finish is a tough sell. It is potentially a counter-productive activity, depending on where you are in your career.

140 Characters of Outrage or 140-Character Koans

Andy Ihnatko’s recent comment about the release of the iPhone X is a great example of the potential held within a Tweet.

[Screenshot of Andy Ihnatko’s tweet]

There is little to disagree with here. Everyone should have their soul’s dignity respected and held sacrosanct, whether by others or by ourselves. But his statement has an edge to it — one that cut many, myself included. Based on the 126 (and counting) responses and replies to the Tweet, I was not alone.

Yes, I felt lucky that I got an early date on my upgrade to the iPhone X (being sick, I slept through the 3 AM Eastern launch time). But I have had to wait for technology upgrades before and did so with a shrug. I considered responding to the tweet to draw the distinction between feeling lucky to get an early date and the kind of toxic need to be a first-day adopter.

Then, I saw the outrage from those who took great umbrage at a statement that is safely true.

I won’t get into the details of the responses (you can read them for yourself, if you wish), as I don’t feel comfortable speaking for others about what they felt and why they felt it. Their outrage, however, triggered a need in me to confront my own response. While I don’t think my relationship with technology is toxic (I can stop any time. Really.), my need to justify myself to someone who does not know me from Adam certainly signals something — something more than acknowledging the false intimacy that social media can sometimes breed. And while my wife’s occasional jokes about my relationship to my iDevices being a little too close to Gollum’s to the One Ring might be overstating it a bit, the cutting edge is still there.

Ultimately, I do think that the technological revolution offered by these devices is more democratizing and liberating than binding. I am able to stay in closer touch with people I know on multiple continents than I would have been at any time in the past. Yes, in centuries past there were great letter writers who had such correspondences — ones much more intimate than seeing a picture float past on a social media platform — but those people tended to have access to servants to handle things like getting dinner ready. Now, such connections are available to anyone who can afford the technology and the fees required to connect to the internet[1].

No, I am more concerned about our collective reactions to the koanic nature of Twitter and, to a lesser degree, other social media outlets. Koans are difficult and knotty things designed to make one reflect deeply on the nature of self and one’s relationships to the universe. While I don’t think most tweets rise to the level of koans, they do touch a similar nerve in us. It is our reaction to them that is telling.

Ihnatko’s statement is not a judgement of any individual, and he has repeatedly said that there is nothing wrong with wanting an iPhone X and taking pleasure in getting one. That did not stop me (and others) from trying to justify our feelings of good fortune at getting it sooner rather than later, nor did it stop others from insulting and demeaning him.

I would submit that this tells us a great deal more about those of us responding (Let’s face it: The fact that I am writing this means that I am responding at much more length than those on Twitter.) than it does about Ihnatko or the iPhone X. 

This soul-searching is probably more than a little necessary. The technological revolution we are experiencing will likely be looked back on 500 years from now as a Renaissance that dragged us out of a kind of Dark Age that began sometime around 1914[2], if not before. It is changing the world as significantly as the introduction of movable type to the West in the mid-1400s and it is doing so more quickly. We need, as individuals and as a society, to puzzle out how we will relate to these devices, given how intimate these relationships have become. 


[1] These costs may not be trivial, but they are less expensive than maintaining an aristocratic lifestyle.

[2] Another potential date would be 1789. In cases like this, however, it is best to leave it to the future. They will have a much better perspective on things.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

The Tyrannies We Falsely Blame on Technology

As this occasional blog may indicate, I am far more technophile than technophobe. Indeed, I prefer to embrace titles like “technophile” and “early-adopter”(^1) to stave off alternate images that would align my behavior with addiction — a case of Star Trek’s holodiction writ small — or with wish fulfillment/delusions of grandeur: of my being a superhero like Batman, with his voice-controlled Batmobile and other wonderful toys, or Tony Stark with his computer assistant JARVIS.

I can stop any time. Really.

My predilection for technology has led me to think about technology and its use in the classroom on more than one occasion. Indeed, a search of iTunes will yield four Summer Institutes(^2), generously funded by the Andrew W. Mellon Foundation, that I organized which focused on Technology and New Media within the academy. Most of my personal focus has been on how the small, incidental uses of technology can improve the life of a faculty member and the experience of students in the classroom, rather than on large-scale initiatives — how a service like Periscope, for example, can come to your aid when you have to stay home with a sick child, as opposed to an analysis of how to roll out online learning campus-wide for all faculty and students. I know people who do the latter (and try to work closely with them) and respect their work. It’s just not where my active interest currently lies.

As with all things, both levels of interest in ed-tech run the risk of losing sight of first causes — the underlying assumptions and needs that drive our decisions. Recently, it took me a surprising amount of effort to trace back to first causes my discomfort with a story that, when I read it, I thought I should be excited by. Union County Public Schools in North Carolina (I am pleased to say my daughter attends a school within this system.) published a piece well worth reading on how Vinson Covington, the AP European History Teacher at Parkwood High School, was getting his students to create a mobile app as a vehicle for learning about history.

Before I go any further, I want to make one thing clear. I think this is a fantastic, inventive idea, and Covington should be applauded for his work and for creating an environment where his students are engaged and challenged to think about the subject differently. Nothing that follows should be seen as taking away from this personal and professional (for what little that is worth) assessment of what he is doing, or from my hope that I will see more teachers doing cross-disciplinary and interdisciplinary work like this at all levels of education. It is absolutely critical for all of our futures.

But as I was writing, I read this article and knew I should be interested and excited by it. Instead, I found myself disquieted. My first response to this disquiet, which I shared on Twitter, was that I would have felt better if it were part of a team-taught course, where the coding and the history could both be more fully explored by the students. And while I still think that, I no longer believe it is the source of my disquiet. Team-taught courses are great but, from a staffing point of view, only occasionally practical. The kind of thing that Covington, on his own initiative, is doing here is a solution to a real zero-sum game that administration plays when trying to best deploy the limited manpower available.

Ultimately, I believe the source of my disquiet is the underlying assumptions about which disciplines should make space for others and how that space should be created. Those assumptions are building a hierarchy that many insist does not exist — even as they participate in building and reinforcing it.

In Covington’s case, there is no sense — even in my mind — that it is wrong for history faculty to introduce a coding project into their classroom. Indeed, I remain in awe of what Mike Drout and Mark LeBlanc accomplished and continue to accomplish with their Lexomics Project and know that I need to find the time to use their tool to satisfy some of my own idle curiosities.

To illustrate my concern, consider how faculty outside English and the Language Arts react when they decide that their students cannot write well. They turn to the English faculty, ask why they have not taught the students better, and look to them to provide solutions. There is no perceived cultural pressure on the non-English faculties to introduce writing into their areas in the way there is pressure to introduce coding into history, as in the case of Covington’s work.

And I hasten to point out that the kind of cultural pressure I am describing is not just negative pressure. Covington has been singled out for praise here for his innovation. Can you conceive of an article that praises a member of a Biology faculty for having Pre-Med students write sonnets to improve their writing and interpersonal skills? Can you see that article treating such assignments as anything other than silly? Or as a waste of time that would be better spent on the difficult subject matter that students are perceived as needing to cover to succeed in medical school?

And yet, no one will claim that understanding how World War I started is easy or unimportant. After all, understanding a highly mechanized, semi-automated system of distributed but telegraphically linked command posts with a series of standing orders that, once begun, cannot be stopped without destroying the system (i.e., the Schlieffen Plan) as analogous to our contemporary computer-controlled military systems might be what prevents World War III. And learning about sonnets and people’s emotional reactions to them might help a doctor have a better bedside manner, or a sufficiently greater sympathy with a patient to notice that, despite the glitter of their walk, their patient may need help. It might help those employed by insurance companies see less of the paperwork of procedure and more of the people trapped within the system.

Innovation, then, must not be seen as having a one-to-one correspondence with technology, science, and engineering. Innovation is when we take new ideas and apply them in any field. The unfortunate truth about the way we currently recognize innovation in the Academy is that we have tied it too closely to the use of technology — so closely that we can no longer see when innovation is taking place in other areas. This matters not just for humanities faculty, who might fear they are becoming second-class citizens within their own disciplines. It also matters to faculty innovators like Brendan Kern, whose podcast on the life of an imagined exoplanet can teach students about biology through the exploration of an alien new world. Such work is currently more likely to be advertised as “fun” or as a bit of fluff rather than as a serious attempt at pedagogical development and innovation that might make material accessible to students.

Whether we at all levels of the Academy choose to see innovation more broadly than the infusion of STEM and its wonderful toys into other disciplines will determine how likely we are to promote and recognize real innovation across all disciplines. It will require challenging many of our assumptions about how we do things and how much of a king our disciplinary content actually is. It will be difficult for many of us to do this. After all, it is easy to give the appearance of innovation if you see people working on robots or a flying car. It is less easy to do so when you watch someone telling or discussing a story. But both represent the skills our students will need to be successful and adaptable in the 21st Century. We must, then, learn how to refuse to be hidebound.

——

 1. For those who are curious, I added the hyphen because of some research conducted by Sparrow Alden, who noticed that, in The Hobbit, J. R. R. Tolkien appeared to hyphenate certain two-word phrases to indicate that they stood in for what would have been a single word in Westron (the common language of Men and Hobbits in Middle-earth) and come down to us as kennings (https://en.m.wikipedia.org/wiki/Kenning). Although “early adopter” is not traditionally hyphenated and is not as figurative as oar-steed or whale-road, it is nevertheless true that being called an early-adopter signals more than just being the first kid on the block with a new toy.

2. Or you could follow these links:

 The First JCSU Faculty Summer Institute for Technology & New Media

The Second JCSU Faculty Summer Institute for Technology & New Media

The Third JCSU Faculty Summer Institute for Technology & New Media

The Fourth JCSU Faculty Summer Institute for Technology and New Media and Problem Solving in the Interdisciplinary Humanities

