
This essay is the fourth and final in a series about how I used generative AI in a class I taught in the fall term of 2024. Parts 1, 2, and 3 are available here.
This fall, I will teach two classes about how generative AI is changing higher education, one to graduate students in higher education at Penn GSE and one to incoming first-year undergraduates in arts & sciences. I’ll be posting about my plans over the summer and about what we learn throughout the term.
Reflections
The last meeting of any class I teach is spent reflecting on what we learned and what I might do differently the next time I teach. My students this fall said they appreciated the chance to use and talk about AI tools, though one student pointed out that time spent on AI meant less time on history. Fair point, and it made me reflect that I hadn’t done a good enough job of connecting the technological change to the social history we had been reading and writing about. That will be part of my introduction to the class next time. New things do not exist apart from the time they first appear. Every new technology develops over time in a social context as people respond to it.
Another bit of feedback was that some students thought I should do more talking. I have heard this in nearly all of my end-of-term feedback since I shifted to a student-centered classroom. Even if everyone agrees in theory with active learning, and this class was more willing than most to go along with the idea, expectations around teacher behavior don’t change much. By design, in the last half of the term, I’m on the sideline watching the class perform. That felt weird to me when I first did it, so I understand why students feel this way.
The feedback on JeepyTA, the LLM tool we had used all term, was positive. Students found the first draft review it generated useful, so I talked with them about where else in the process an LLM tool’s depersonalized analysis might add value. Next time, we decided, I will try adding a layer of JeepyTA review to the drafting process focused on critical feedback about the analysis of primary sources. Students have been trained to summarize material objectively and to give encyclopedia-like overviews or reports. Of course, that’s something LLMs now do easily, if unreliably. Analyzing sources from a specific point of view is what I want from my writers, and it is challenging for many of them. Maybe an LLM tool, summarizing engine that it is, can help move students to the more difficult work of bringing their critical perspective to bear on sources, helping them think about an artifact using the framework they are developing.
The most surprising feedback came when I asked whether I should discourage the use of digital tools during class time. This has been a question I have asked since personal computers became the primary tools for note-taking and writing in class. Despite my irritation at seeing eyeballs on screens, I took my students’ point that these are technologies they use to learn. I even use them myself during class, looking up the origins of a word in the OED online or a half-remembered something in Wikipedia. The question came out of nostalgia for my own graduate school experience using the nineteenth-century technologies of pen, paper, and books.
This time, when I asked, the response was overwhelmingly positive. I didn’t use José Antonio Bowen’s term “teaching naked”—I didn’t want to freak them out—but I did frame the idea of no screens as an escape from the pressures of the attention economy to an island of human connections and sharing ideas.
The difference this time
Despite our having talked all term about the value of consulting ChatGPT in their educational practice, and despite the fact that most of my students used their laptops to take notes or start drafting sections of their papers, they overwhelmingly endorsed the idea. With some flexibility, they said, they would welcome digital tech-free class meetings.
I don’t think it is difficult to understand the appeal of this proposal. It reflects a shift in our perception of the internet, the role of algorithms in our lives, and the way digital technology creates problems even as it solves others. The intense feelings about ChatGPT as a homework machine are part of a larger set of anxieties about the social harms of twenty-first-century technologies. The discourse about screen time and the attention economy now informs how students think about the role of digital tools in their own learning.

Like the rest of us, students have experienced the abrupt change in time spent with others. I think they want those minutes of connectedness back and are looking for ways to make that happen. These changes are not driven solely by digital technology. You’ll notice the decline in the chart above shows the overwhelming impact of COVID-19 as well as a milder decline starting around 2015. That earlier decline began when the iPhone and its competitors allowed us to take screens anywhere, but the big drop is associated with the pandemic lockdown and its aftermath.
I have not fully grappled with how the isolation and weirdness of the pandemic changed my own social and cultural practices. I don’t eat or drink out as often these days, despite living in a neighborhood where I can walk to great, inexpensive restaurants. I get Ethiopian or dim sum delivered when it’s my turn to pick dinner. I buy cans of local beer from the neighborhood bottle shop instead of visiting the awesome breweries of Philadelphia. And it wasn’t just my consumption of food and drink that changed. I stopped reading books during the pandemic, choosing music and computer games instead. My return to in-person teaching last term feels like the early stages of a return to society after a long absence.
We all have stories of coming out from under weird spells cast in 2020. But those stories, like all stories of surviving disasters, are told as if they happened outside time, as if we had a break from history rather than lived through it. The introduction of pocket screens and attention-grabbing algorithms is entangled in the social history of a disease that forced humans to isolate. I do not believe the social effects of the pandemic ended in 2021, nor that social habits around screens are fixed. The enthusiasm my students expressed for screen-free class time is about something more than anxieties about screen time.1
Screen ambivalence
When my kids were younger, we decided that car trips of over an hour were a reason to suspend screen time prohibitions. This means driving to Virginia to visit my brother and his family is far more peaceful than any car trip he and I took as kids. I get car sick if I read, so my choices on long drives were staring out the window or playing car games like the license plate game. When that got boring, I picked on him, or he on me. You're on my side of the seat! Stop touching me! He called me stupid! I doubt we ever took a trip over two hours where my parents didn’t threaten to pull the car over.
I don’t believe I have ever made that threat to my kids. I have to interrupt them to ask if they need me to stop for a bathroom break. This difference between my childhood and the one being experienced by my children is small but suggestive. Is this a welcome relief from the drudgery of long drives, or am I like some parent in the 1920s giving their kids a cigarette?
As they sit in the back seat silently, eyeballs focused on a small electronic device, I think about how often I see the same expressions on people gathered in public places, at a talk or a restaurant. It does seem as though we are amusing ourselves to the death of some part of our social life. I hope the ubiquity of Neil Postman's Amusing Ourselves to Death (1985) complicates the idea that this is a recent question.
The story we tell ourselves goes something like this: The Internet came into being with the promise of connecting humans, but just as that promise was being realized, don’t-be-evil corporations pivoted from building things that delighted us to capturing our eyeballs and milking them for attention. Now, we sit at home in the soft glow of our personal screens, unable to think clearly or summon the will to go outside. ChatGPT is an extension of the brain-eating internet monster that will take over our children’s minds, feeding them short, amusing videos, rendering them unable to see the point of writing or thinking because there is a machine for that now.
That is a scary story, and it makes Postman seem prophetic. Was “and now this” television of the 1980s just a precursor to the algorithmic, screen-based reality we live in today? Maybe. I’m simultaneously drawn to and skeptical of explanations that because the medium is the message, we are all drowning in unreason and consumer culture. The problem is that I read too much about the nineteenth century, so I know that anxieties about mass culture have been a feature of mass culture from the beginning of mass culture.2
In both my parenting and my teaching, I want to avoid sounding like a nineteenth-century schoolmarm lecturing about the moral poison of adventure and romance novels. When I start to slip into that frame, I remind myself that my kids and their friends are just as into culture as I was at their age, and just as weird. Kids these days are more likely to watch a video on YouTube than read an encyclopedia for hours or play a video game all day. So what? I don’t see much evidence of brain rot. It does not seem to me that my kids or my students are less curious and lively than people in the past. The final papers I read in the fall were just as good as those in past terms.
One change I see is that my kids and my students read books less often. Why? That’s a real question, and there is more going on here than watching short videos. My students’ interest in limiting digital tools in class seems like an opportunity to think together about what that might be. When I issue the invitation next year to put away our screens for class, I will propose that we read books instead. I will invite students to write with pen on paper. I will ask them how starting with a blank piece of paper feels different from facing a screen with a blinking cursor.
Next time
This coming fall, I will teach a new class: How AI is Changing Higher Education. Because I am a historian, the questions I will use to frame the class are about change over time, even though the focus will be on the very recent past. As I said, new technology develops over time in a social context as people respond to it. ChatGPT will turn three this November. I’ll need to think about how it figures in the History of American Higher Education when I teach it again next spring. I’ll be posting about my plans for both classes on AI Log.
AI Log, LLC. ©2025 All rights reserved.
I don’t want to minimize the ways Silicon Valley has fouled our internet and swamped our politics, but don’t-be-evil corporations breaking bad do not explain everything that happened over the past decade.
This is a serious question for me, one I plan to explore by reviewing Caleb Smith’s Thoreau’s Axe and Nicholas Carr’s SuperBloom in an upcoming essay.
The danger of constantly staring at a screen is not brain rot but something more like a lack of variety in thought and experience, the result of all the things we are not doing because screens eat up our time.