“Were we required to characterize this age of ours by any single epithet, we should be tempted to call it, not an Heroical, Devotional, Philosophical, or Moral Age, but, above all others, the Mechanical Age. It is the Age of Machinery, in every outward and inward sense of that word; the age which, with its whole undivided might, forwards, teaches and practices the great art of adapting means to ends. Nothing is now done directly, or by hand; all is by rule and calculated contrivance. For the simplest operation, some helps and accompaniments, some cunning abbreviating process is in readiness. Our old modes of exertion are all discredited, and thrown aside. On every hand, the living artisan is driven from his workshop, to make room for a speedier, inanimate one. The shuttle drops from the fingers of the weaver, and falls into iron fingers that ply it faster.”
This is how Scottish historian and writer Thomas Carlyle characterized Great Britain’s mechanized, steam-powered industrial era in 1829. These changes in the human relationship to production rippled through the world economy with profound social, political, and environmental implications. One loosely organized group, the Luddites, emerged early on to smash the new machines and resist the mechanization of the mills.
Nearly 200 years after Carlyle’s “Age of Machinery”, we find ourselves sold a new age, the Age of Automation and AI, which promises another transformation in the way we live, work, and learn, with similar social, political, and environmental consequences. The AI hype cycle, at least, is real. Sal Khan’s new book, for example, Brave New Words: How AI Will Revolutionize Education (and Why That's a Good Thing), promises to be “required reading for everyone who cares about education.”
But what should be the relationship of education, automation & artificial intelligence? Should there be one at all? How much power – not to mention student data – should educators cede to the new machine in the Age of AI?
Or…should the answer be a 21st century Luddite revival and mass resistance to the vision of the future offered by Google, OpenAI, and Microsoft?
Charles Logan, a Learning Sciences PhD candidate at Northwestern University, wrote earlier this year for the Los Angeles Review of Books: “Ultimately, the Luddites’ militancy and commitment to resistance might be a necessary entry point for how laborers—and teachers, students, and caregivers—can take an antagonistic stance toward AI and automation, and create a new ‘commons.’”
1
00:00:00,000 --> 00:00:13,280
2022 was a system reboot. 2023 broke the doom loop. This year is all about turning vision
2
00:00:13,280 --> 00:00:20,840
into reality. Conference to Restore Humanity 2024 is an invitation for K-12 and college
3
00:00:20,840 --> 00:00:26,900
educators to build your joyful reimagined classroom. Our conference is designed around
4
00:00:26,900 --> 00:00:32,260
the accessibility and sustainability of virtual learning, while engaging participants in an
5
00:00:32,260 --> 00:00:37,860
environment that models the same progressive pedagogy we value with students. Instead of
6
00:00:37,860 --> 00:00:43,700
long Zoom presentations with a brief Q&A, our flipped keynotes let the learning community
7
00:00:43,700 --> 00:00:50,700
listen and learn on their own time, then engage in a one-hour Q&A with our speakers. Dr. Mary
8
00:00:50,700 --> 00:00:56,560
Helen Immordino-Yang makes the neurobiological case for progressive education rooted in her
9
00:00:56,560 --> 00:01:03,040
groundbreaking work in affective neuroscience. Dr. Carla Shalaby demonstrates the power of
10
00:01:03,040 --> 00:01:09,880
education as the practice of freedom, honoring young people's right to be fully human. Dr.
11
00:01:09,880 --> 00:01:16,360
Sawsan Jaber elevates the voices of Arab and Muslim students as an advocate for global equity
12
00:01:16,360 --> 00:01:22,560
and justice. And Orchard View Schools' Innovative Learning Center showcases healthy, sustainable
13
00:01:22,560 --> 00:01:27,880
community learning spaces for teenagers and adult learners alike. Beyond our flipped keynotes,
14
00:01:27,880 --> 00:01:34,320
participants will be invited to join week-long learning journeys. Join Trevor Aleo on a journey
15
00:01:34,320 --> 00:01:39,500
to learn interdisciplinary inquiry-based methods to equip students as knowledge producers,
16
00:01:39,500 --> 00:01:45,920
communicating with zines, podcasts, and more. And understand the ripple effects of modern
17
00:01:45,920 --> 00:01:52,200
imperialism with a focus on Palestinian resilience and classroom tools for fostering global
18
00:01:52,200 --> 00:01:58,560
solidarity in our second workshop led by Abeer Ramadan-Shinnawi. We're also featuring virtual
19
00:01:58,560 --> 00:02:04,840
school tours so you can see progressive practice in action at the Nova Lab, Olentangy STEM Academy,
20
00:02:04,840 --> 00:02:11,600
Community Lab School, and more. The mission of reshaping education systems, of turning vision
21
00:02:11,600 --> 00:02:18,560
into reality, is vital for a sustainable and just future. Conference to Restore Humanity runs July
22
00:02:18,560 --> 00:02:25,600
22nd through the 25th, and as of recording, early bird tickets are still available. It's $150 for
23
00:02:25,600 --> 00:02:31,680
four days, with discounts available, group rates, and parity pricing. Plus, we'll award certificates
24
00:02:31,680 --> 00:02:37,920
for teacher training and continuing education credits. See our website, humanrestorationproject.org,
25
00:02:37,920 --> 00:02:41,600
for more information, and let's restore humanity together.
26
00:02:48,560 --> 00:03:04,480
Hello and welcome to Episode 150 of the Human Restoration Project Podcast. My name is Nick
27
00:03:04,480 --> 00:03:09,760
Covington. Before we get started, I wanted to let you know that this episode is brought to you by
28
00:03:09,760 --> 00:03:15,560
our supporters, three of whom are Kimberly Baker, Simeon Frank, and Corinne Greenblatt. You can
29
00:03:15,560 --> 00:03:20,120
learn more about Human Restoration Project on our website, humanrestorationproject.org,
30
00:03:20,120 --> 00:03:31,600
and connect with us anywhere on social media. Were we required to characterize this age of
31
00:03:31,600 --> 00:03:38,280
ours by any single epithet, we should be tempted to call it not a heroic, devotional, philosophical,
32
00:03:38,280 --> 00:03:45,320
or moral age, but above all others, the mechanical age. It is the age of machinery in every outward
33
00:03:45,320 --> 00:03:51,720
and inward sense of that word, the age which, with its whole undivided might, forwards, teaches,
34
00:03:51,720 --> 00:03:58,760
and practices the great art of adapting means to ends. Nothing is now done directly or by hand.
35
00:03:58,760 --> 00:04:05,480
All is by rule and calculated contrivance. For the simplest operation, some helps and accompaniments,
36
00:04:05,480 --> 00:04:12,040
some cunning abbreviating process is in readiness. Our old modes of exertion are all discredited and
37
00:04:12,040 --> 00:04:17,960
thrown aside. On every hand, the living artisan is driven from his workshop to make room for a
38
00:04:17,960 --> 00:04:24,040
speedier, inanimate one. The shuttle drops from the fingers of the weaver and falls into iron
39
00:04:24,040 --> 00:04:31,480
fingers that ply it faster. This is how Scottish historian and writer Thomas Carlyle characterized
40
00:04:31,480 --> 00:04:38,520
Great Britain's mechanized, steam-powered industrial era in 1829. These changes in
41
00:04:38,520 --> 00:04:43,400
the human relationship to production rippled through the world economy with profound social,
42
00:04:43,400 --> 00:04:49,320
political, and environmental implications. One loosely organized group, the Luddites,
43
00:04:49,320 --> 00:04:54,840
emerged early on to smash the new machines and resist mechanization of the mills.
44
00:04:55,800 --> 00:05:03,640
200 years after Carlyle's age of machinery, we find ourselves sold a new age, the age of automation
45
00:05:03,640 --> 00:05:10,600
and AI, which promises another transformation in the way we live, work, and learn with similar
46
00:05:10,600 --> 00:05:16,520
social, political, and environmental consequences. At least, the AI hype cycle is real.
47
00:05:17,240 --> 00:05:23,720
Sal Khan's new book, for example, Brave New Words, How AI Will Revolutionize Education and Why That's
48
00:05:23,720 --> 00:05:29,480
a Good Thing, promises to be, quote, required reading for everyone who cares about education,
49
00:05:29,640 --> 00:05:36,440
end quote. But what should be the relationship of education, automation, and artificial intelligence?
50
00:05:36,440 --> 00:05:42,920
Should there be one at all? How much power, not to mention student data, should educators cede
51
00:05:42,920 --> 00:05:49,880
to the new machine in the age of AI? Or should the answer be a 21st century Luddite revival
52
00:05:49,880 --> 00:05:55,320
and mass resistance to the vision of the future offered by Google, OpenAI, and Microsoft?
53
00:05:56,280 --> 00:06:02,200
That, anyway, I suspect, will be the argument of my guest today, Charles Logan, a learning sciences
54
00:06:02,200 --> 00:06:08,040
PhD candidate at Northwestern University. Writing earlier this year for the Los Angeles Review of
55
00:06:08,040 --> 00:06:14,440
Books, ultimately, the Luddites' militancy and commitment to resistance might be a necessary
56
00:06:14,440 --> 00:06:20,520
entry point for how laborers and teachers, students, and caregivers can take an antagonistic
57
00:06:20,520 --> 00:06:27,560
stance towards AI and automation and create a new commons. Thank you so much, Charles,
58
00:06:27,560 --> 00:06:31,240
for joining me today. Thanks for having me. I'm excited to be chatting with you all today.
59
00:06:31,800 --> 00:06:38,600
So I think in the popular imagination, a Luddite is like an octogenarian congressperson who doesn't
60
00:06:38,600 --> 00:06:45,000
use email or someone who's entirely technophobic, perhaps even culturally conservative, right?
61
00:06:45,080 --> 00:06:51,160
Preserving the old ways against progress. One of the earliest pieces that you shared with me is
62
00:06:51,160 --> 00:06:57,960
from Hybrid Pedagogy, and it was written in 2014, fully a decade ago, right? And by an author going
63
00:06:57,960 --> 00:07:03,720
by Torn Halves, called Toward a Luddite Pedagogy. And I was reading back through that and saw that
64
00:07:03,720 --> 00:07:11,000
it makes no mention of AI. So it's clearly an idea that existed alongside tech for at least
65
00:07:11,000 --> 00:07:17,960
that long. So help us understand who the Luddites were. What were they about? And what can we learn
66
00:07:17,960 --> 00:07:21,320
from how they approached technological change in their own time?
67
00:07:22,440 --> 00:07:28,120
Sure thing. Yeah, I will do my best to situate the Luddites in history. I'm working on a piece
68
00:07:28,120 --> 00:07:33,480
right now with Phil Nichols and Antero Garcia that tries to track what we've come up with: three
69
00:07:33,480 --> 00:07:39,480
different waves of Luddites. But I think it's important to start with the OGs, the original
70
00:07:40,120 --> 00:07:46,280
Luddites. So as you mentioned, it's the early 19th century in England, and you have the long
71
00:07:46,280 --> 00:07:53,000
history of different sorts of cloth workers who all of a sudden are faced with automation. And so
72
00:07:53,000 --> 00:07:59,720
you have early industrial capitalists who are building factories, who are shifting the nature
73
00:07:59,720 --> 00:08:08,040
of labor because of these technologies that are displacing people, are allowing capitalists to
74
00:08:08,040 --> 00:08:13,960
depress wages, and that are having these rippling effects on workers themselves, but also their
75
00:08:13,960 --> 00:08:20,920
communities. And so these automating technologies are threats to both the livelihood of these
76
00:08:20,920 --> 00:08:26,120
laborers, but also the dignity in which they conceive of themselves and their work and the
77
00:08:26,120 --> 00:08:32,280
great traditions that have brought them to this place in history. And so the Luddites are actually
78
00:08:32,280 --> 00:08:41,640
named after a fictional character, Ned Ludd, who, the story goes, was punished by one of his bosses,
79
00:08:41,640 --> 00:08:49,240
essentially, and then decided to destroy the automating machine, a spinning jenny or another
80
00:08:49,240 --> 00:08:57,880
kind of mill. And so he emerges as this folk hero, again, in a part of the world that was no
81
00:08:57,880 --> 00:09:02,360
stranger to folk heroes like Robin Hood. And then you have, as you mentioned, this loose,
82
00:09:03,000 --> 00:09:10,360
but also connected set of political projects that go under the name of the Luddites across England
83
00:09:10,360 --> 00:09:18,040
in the early 19th century. And you have folks who are strategically under the cover of darkness,
84
00:09:18,040 --> 00:09:26,040
breaking into factories, breaking into factory owners' homes, and destroying what they call
85
00:09:26,040 --> 00:09:31,080
the obnoxious machines, the machines that were threats to their livelihood, to their dignity.
86
00:09:31,080 --> 00:09:37,720
And so again, I will emphasize it is strategic sabotage. It is not haphazard. And so as you
87
00:09:37,720 --> 00:09:44,600
mentioned, over the last 200 plus years, Luddite has become a pejorative. I think I was looking
88
00:09:44,600 --> 00:09:50,120
at Dictionary.com, which has those words of the day and a corresponding image. And the corresponding
89
00:09:50,120 --> 00:09:56,280
image was, as you said, an octogenarian looking at her phone as if it were a fish.
90
00:09:56,280 --> 00:10:03,000
And so we've seen these sort of projects then to rehabilitate and sort of reframe the Luddites
91
00:10:03,000 --> 00:10:10,600
over the last 200-plus years. And so that happened with historians in the early to mid 20th century.
92
00:10:10,600 --> 00:10:16,440
And then, moving into the 1970s, you have what Phil and Antero and I have called the second wave
93
00:10:16,440 --> 00:10:25,480
of Luddism. And that is a different group of people. You have folks like Kirkpatrick Sale,
94
00:10:25,480 --> 00:10:32,040
one of the leaders of these new Luddites, who are questioning technology at the time of the Cold War
95
00:10:32,840 --> 00:10:39,160
and sort of the growth of nuclear energy and weapons, and environmentalists and pacifists,
96
00:10:39,160 --> 00:10:44,360
Quakers. And so there's more of this motley crew and this big tent,
97
00:10:44,360 --> 00:10:51,240
but, unlike their predecessors, they were often avowed pacifists, as I mentioned. And so they're moving
98
00:10:51,240 --> 00:10:58,120
away from that original tactic of physical destruction of infrastructure and more of a
99
00:10:58,120 --> 00:11:05,480
kind of set of beliefs to organize around. And then the term kind of falls out of favor again.
100
00:11:06,440 --> 00:11:12,600
And it's only been, I would say, in the last sort of, well, you mentioned Torn Halves's piece from 2014
101
00:11:12,600 --> 00:11:17,800
and Audrey Watters picks up on that work a few years ago. And then you also have this sort
102
00:11:17,800 --> 00:11:24,120
of growth in a lot of tech critics like Brian Merchant and Paris Marx and others who are
103
00:11:24,120 --> 00:11:32,840
reclaiming the name and the identity of a Luddite. It is a very fluid term. I think that is something
104
00:11:32,840 --> 00:11:38,200
to acknowledge that it has changed over the course of time. Now, I think, as you mentioned,
105
00:11:38,920 --> 00:11:45,160
as we have the Sal Khans of the world who are doing their best to infuse our classrooms with
106
00:11:45,160 --> 00:11:49,880
their proprietary chatbots, I think we can look to the Luddites again and think about, well,
107
00:11:49,880 --> 00:11:55,560
what would it mean to practice this Luddite praxis and what kinds of interventions, what kind of
108
00:11:55,560 --> 00:12:02,200
tactics, what kind of sabotage, what kind of organizing can be done and inspired by the Luddites
109
00:12:02,760 --> 00:12:09,240
in the year 2024? Well, let's go ahead and bring in, since you mentioned the Sal Khans of the world,
110
00:12:09,240 --> 00:12:14,120
since we're kind of transitioning into talking about what would be considered the third wave
111
00:12:14,120 --> 00:12:20,280
in your take on third wave Luddism here. So you've been skeptical of announcements from
112
00:12:20,280 --> 00:12:26,280
OpenAI's Sam Altman and Microsoft's Bill Gates, kind of working in tandem here. And you've been
113
00:12:26,280 --> 00:12:31,080
sharing your criticism of Sal Khan's newest book on social media, which I've enjoyed reading there,
114
00:12:31,080 --> 00:12:36,840
too. So what even is that transformation that they're promising? What tactics of the new
115
00:12:37,640 --> 00:12:43,480
modern industrial capitalists are the third wave Luddites reacting to? Can you outline that for
116
00:12:43,480 --> 00:12:49,720
us? Yeah, I mean, a lot of it comes down to the same dog and pony show that Sal Khan has been selling
117
00:12:49,720 --> 00:12:55,000
for a long time, and that is personalized learning. And he has this real sort of superpower to
118
00:12:55,000 --> 00:13:00,360
forget any sort of history, maybe never learned it in the first place, has been strategic about
119
00:13:00,360 --> 00:13:06,680
his own ignorance. And again, as an avowed Audrey Watters fanboy, she's written about the
120
00:13:07,400 --> 00:13:12,120
history of personalized learning and begins her book, whose title, I believe, is Teaching Machines:
121
00:13:12,120 --> 00:13:16,440
The History of Personalized Learning, from
122
00:13:16,440 --> 00:13:23,080
MIT Press, go buy it now. And she starts with an anecdote about Sal Khan and how he sort of makes
123
00:13:23,080 --> 00:13:29,160
this entree into education after having been a hedge fund analyst, which, again, I don't want
124
00:13:29,160 --> 00:13:34,440
to discount the possibility that someone could move from the business world into education and
125
00:13:36,120 --> 00:13:42,120
bring with them and develop a critical pedagogy. I'd argue Sal Khan is not that person.
126
00:13:42,120 --> 00:13:48,120
And so it is, again, I think the story of personalized learning of we can all of a sudden
127
00:13:49,480 --> 00:13:54,040
flip on this chat bot or open up the chat bot. And here you have a personalized
128
00:13:54,040 --> 00:14:01,640
tutor who is capable of getting to know you, providing real time feedback, and all of a sudden
129
00:14:02,600 --> 00:14:12,440
you can achieve greatness. Reading the book, as painful as it is, I think that sense of automation
130
00:14:12,440 --> 00:14:18,600
of what it's like to be a student, what it's like to be a teacher, I think has the echo of early
131
00:14:18,600 --> 00:14:22,760
industrialists who are pushing similar sorts of automating technologies. And again, I mean,
132
00:14:22,760 --> 00:14:27,880
you see who's blurbed the book. You mentioned Bill Gates, the co-founder and former CEO of Microsoft,
133
00:14:27,880 --> 00:14:33,960
blurbed the book. Wendy Kopp, the founder of Teach for America, blurbed the book, with its
134
00:14:33,960 --> 00:14:42,520
very problematic approach to sustainable teaching in our most vulnerable and under-resourced communities.
135
00:14:42,520 --> 00:14:47,560
And again, technology being turned to as a kind of panacea, which again, there's a precedent for
136
00:14:47,560 --> 00:14:52,920
that. And that's B.F. Skinner and his teaching machines. He sought to bring these teaching
137
00:14:52,920 --> 00:15:00,440
machines into Harlem and other under-resourced communities, as again, answers for complex social
138
00:15:00,440 --> 00:15:06,280
problems that date back decades, if not centuries. And so this technosolutionism, as again, the kind
139
00:15:06,280 --> 00:15:12,920
of panacea that Khan is pushing in conjunction with folks like, he refers to Sam Altman as Sam,
140
00:15:12,920 --> 00:15:17,320
and he went to go see Bill Gates right after he saw Sam. And so there's this very sort of
141
00:15:17,320 --> 00:15:22,520
chummy group. And so reading this and thinking about Audrey Watters' work and others who have
142
00:15:22,520 --> 00:15:29,080
been very influential and thinking about, well, this turn to Luddism, what does that offer us?
143
00:15:29,080 --> 00:15:33,800
Because I think it's important to acknowledge that, and this is something that I've wrestled
144
00:15:33,800 --> 00:15:40,040
with and thinking about, well, what are potential shortcomings of Luddism, is you can't easily smash
145
00:15:40,040 --> 00:15:49,160
a chat bot. You could smash the Chromebook that you're using and then have to turn around and
146
00:15:49,160 --> 00:15:55,480
probably pay your school. And so there's a real kind of limitation to what Luddism looks like
147
00:15:55,480 --> 00:16:03,320
in the 21st century that I don't think enough folks who have claimed that as an identity
148
00:16:03,320 --> 00:16:11,320
acknowledge. And yet, I think that that spirit of sabotage of a more militaristic
149
00:16:11,320 --> 00:16:18,280
stance towards the technology, a more intentionally aggressive stance is helpful because there are,
150
00:16:18,280 --> 00:16:26,200
right, these emerging technologies. Nightshade, I think, is one of them where you can upload images
151
00:16:26,200 --> 00:16:32,520
that essentially will poison a data set. And so there are these tools that exist that, again,
152
00:16:32,520 --> 00:16:38,040
I would argue are Luddite tools. Their makers may not claim them as Luddite, but I think we
153
00:16:38,040 --> 00:16:45,240
can view them through that lens. And then I also think, right, that skepticism, that organizing
154
00:16:45,240 --> 00:16:52,360
was, I think, an important piece of the Luddites. The original Luddites are organizing under the
155
00:16:52,360 --> 00:17:00,920
cover of darkness and these sort of backroom taverns and trying to figure out, what do we
156
00:17:01,000 --> 00:17:07,640
do about this technology? How do we counter it? And so I think there's a lesson there for teachers,
157
00:17:07,640 --> 00:17:14,040
for students who are all of a sudden facing these technologies and being told that this is the
158
00:17:14,040 --> 00:17:21,880
future of education and that organizing is an important piece to pull out from the Luddites,
159
00:17:21,880 --> 00:17:28,920
as well as those tactics of sabotage, as well as the playfulness of Luddites. So, you know,
160
00:17:28,920 --> 00:17:35,880
they're writing poems, they are inspiring poems, but Lord Byron is, I think, the most famous poet
161
00:17:35,880 --> 00:17:41,160
who kind of came to the Luddite cause. They're sending these missives to various political
162
00:17:41,160 --> 00:17:47,800
leaders and nailing these threatening notes to doors. They're dressing in drag. And so I think
163
00:17:47,800 --> 00:17:53,720
there's this sense of deadly seriousness alongside that playfulness. And I think there's a lot to be
164
00:17:53,720 --> 00:18:01,320
said about that playfulness and as sort of like a public pedagogy that we see in education. And
165
00:18:01,320 --> 00:18:07,080
we see, I think, also, you know, folks who I'm thinking of like Emily Bender and Alex Hanna
166
00:18:07,080 --> 00:18:15,720
have a wonderful podcast. It's like a live stream. It's called Mystery AI Hype Theater 3000,
167
00:18:15,720 --> 00:18:22,280
something like that. And they basically just like, I don't know if I can say this on the show,
168
00:18:22,280 --> 00:18:29,480
shit talk these latest AI hype pieces, like an article or whatever it might be, and across different
169
00:18:29,480 --> 00:18:35,080
disciplines, including education. And I think it's a profoundly like, educated act to like
170
00:18:35,080 --> 00:18:40,360
have this show that they then put out and, you know, in a podcast and a newsletter, you know,
171
00:18:40,360 --> 00:18:45,800
the video and it's hosted, not on YouTube, like, you know, you're not putting money in
172
00:18:45,800 --> 00:18:50,920
Alphabet's pocket. And so, again, I think there are those tactics of sabotage, of organizing,
173
00:18:50,920 --> 00:18:56,360
of playfulness, and also acknowledging that even with the historical Luddites, the Luddites and
174
00:18:56,360 --> 00:19:01,640
their kind of projects were connected, but they were also hyper local to the different regions.
175
00:19:01,640 --> 00:19:07,320
And so acknowledging that like what we as educators are doing with our students is always
176
00:19:07,320 --> 00:19:12,760
contingent on context. It's always contingent on the people in the room and the constraints that
177
00:19:12,760 --> 00:19:18,440
we're facing, given who we are and where we are. And so I think that's an important piece to uplift
178
00:19:18,440 --> 00:19:25,000
from the original Luddites as well, is that a lot of these different projects, while loosely
179
00:19:25,000 --> 00:19:31,640
connected around, you know, sabotage and organizing to protect their livelihood and dignity against
180
00:19:31,640 --> 00:19:37,480
the incursion of these automating technologies, did not have, you know, uniform sets of
181
00:19:37,480 --> 00:19:42,280
beliefs or uniform goals. And I think that's something that we can also hold on to as we
182
00:19:42,280 --> 00:19:49,240
think about our own kind of like heterogeneity in 2024 and thinking about what would it look like
183
00:19:49,800 --> 00:19:57,320
to, you know, practice a Luddite praxis in response to, you know, this age of generative
184
00:19:57,320 --> 00:20:02,040
AI that we find ourselves in. And I don't even like calling it the age of generative AI.
185
00:20:03,320 --> 00:20:08,520
I feel like it puts it up on a pedestal where it should not be. We don't need to cede that ground
186
00:20:08,520 --> 00:20:17,240
to them. You know, it's this flashpoint of this sizzle of AI in education that will more than
187
00:20:17,240 --> 00:20:23,400
likely burn out, hopefully sooner rather than later. And I think that's a great way to get into
188
00:20:23,400 --> 00:20:31,080
the obfuscation of the AI hype cycle. It's the age of AI hype, if it's not the age of generative
189
00:20:31,080 --> 00:20:36,440
AI, right? It's the age of the AI salesman or, you know, lump cryptocurrencies somewhere in there,
190
00:20:36,440 --> 00:20:43,000
too. But, you know, as I'm looking at the blurbs for Sal Khan's book, right, from Arne Duncan,
191
00:20:43,000 --> 00:20:48,280
Tony Blair, you mentioned Wendy Kopp, formerly of Teach for America, now Teach for All,
192
00:20:49,000 --> 00:20:54,520
again, Sam Altman, Adam Grant, Bill Gates, right, all of these people. And in their blurbs,
193
00:20:54,520 --> 00:20:58,680
obfuscating something that's a little bit deeper. And I think it's something that you really had
194
00:20:58,680 --> 00:21:04,760
elaborated on in some work that you shared with me about understanding an ecological framework
195
00:21:04,760 --> 00:21:10,680
that actually helps us see past that hype cycle and understand the relationships between generative
196
00:21:10,680 --> 00:21:16,200
AI and how it's entwined with so many other aspects of life. And I think as far as a third
197
00:21:16,200 --> 00:21:22,120
wave Luddism go, I think those ecologies are really important to recruiting people to see past,
198
00:21:22,120 --> 00:21:26,920
you know, that obfuscation and that veil of, hey, here's how it's going to transform education. But,
199
00:21:26,920 --> 00:21:32,200
hey, how is that going to impact the environment or human labor or any of those other features?
200
00:21:32,200 --> 00:21:36,920
Could you explain what you mean by that ecological framework and why that matters?
201
00:21:38,280 --> 00:21:44,600
Yeah, I'd be happy to. And so let me start by contrasting it, I think, with the more common
202
00:21:44,600 --> 00:21:52,680
framework that I've encountered when working with young people and teachers about AI and learning
203
00:21:52,680 --> 00:21:58,440
with AI, and that's AI literacy. And so I think, right, AI literacy, one of the critiques of
204
00:21:58,440 --> 00:22:05,480
literacy as a framework that I'm familiar with is that literacy tends to focus on representative
205
00:22:05,480 --> 00:22:11,880
forms. And so I think that's where you can see, you know, the let's develop an AI literacy curriculum
206
00:22:12,520 --> 00:22:17,000
that is going to be based on the assumption that these technologies should be used,
207
00:22:17,000 --> 00:22:23,320
but let's be careful about how we use them because the technology, they're bullshit machines,
208
00:22:23,320 --> 00:22:28,360
and they're just, you know, mathy math that is going to spit out like predictive text,
209
00:22:28,360 --> 00:22:32,520
and that text is biased. And so let's attend to the bias and let's be careful
210
00:22:33,080 --> 00:22:40,120
not to just take whatever comes out of a chat bot as the truth. So there's like that conversation,
211
00:22:40,120 --> 00:22:43,800
obviously, there's the whole plagiarism conversation, and those are important
212
00:22:43,800 --> 00:22:50,600
conversations to have, but I don't think they do the work that we need to really understand
213
00:22:50,600 --> 00:22:57,880
how these technologies are operating, and then also don't open up enough space for interrogating
214
00:22:57,880 --> 00:23:05,720
and intervening in the technologies. So I'm all for an AI literacy that is expansive in its approach
215
00:23:05,720 --> 00:23:12,120
that pushes back against the assumption that AI should be used in schools. I'll also note that,
216
00:23:12,120 --> 00:23:17,160
and this is work that Phil Nichols has done and others around what they call the capture of
217
00:23:17,720 --> 00:23:23,560
AI literacy and the notion that literacy as a project is a speculative one of sort of thinking
218
00:23:23,560 --> 00:23:29,080
about what the world could be, in a hopeful place. And so that's when, you know, literacy
219
00:23:29,080 --> 00:23:36,600
is at its best, but that big tech companies are able to use these literacy frameworks as a means
220
00:23:36,600 --> 00:23:42,040
of what Luci Pangrazio calls a soft power of governance, and framing the technologies as
221
00:23:42,040 --> 00:23:47,080
necessary, again, with some caveats of like, we know they're biased, and we know they're potentially
222
00:23:47,080 --> 00:23:52,360
discriminatory. But you know, that's for you all to figure out. And here's this curriculum you can
223
00:23:52,360 --> 00:23:58,600
use to figure it out. So that, you know, AI literacy, I've got a complicated relationship
224
00:23:58,600 --> 00:24:03,560
with. The ecological framework, I think, is something that I find more compelling in part
225
00:24:03,560 --> 00:24:10,680
because it moves beyond just, you know, the interface of a platform on your device. And it
226
00:24:10,680 --> 00:24:17,640
takes into account these, I just mentioned these obfuscated systems of labor. So you have, right,
227
00:24:17,640 --> 00:24:23,960
folks often in the Global South doing what's often referred to as ghost work, the notion of
228
00:24:23,960 --> 00:24:30,680
unseen labor, where they're training the data that makes AI, AI, and right, AI is people all
229
00:24:30,680 --> 00:24:36,680
the way down. So they're doing this really gruesome work of labeling toxic material,
230
00:24:36,680 --> 00:24:44,040
traumatizing material. And so I think that's one dimension of the ecological framework that's
231
00:24:44,040 --> 00:24:49,080
important to note: labor. I've said AI is people all the way down, but it's all the way down
232
00:24:49,080 --> 00:24:56,760
into the very, you know, earth and soil and water that make up our planet. And, you know,
233
00:24:56,760 --> 00:25:04,840
from the mining that goes into producing the computer chips, the materials for the data centers
234
00:25:04,840 --> 00:25:10,440
in which, you know, the infrastructure of AI is rapidly, you know, being spread throughout the
235
00:25:10,440 --> 00:25:16,280
world. And again, the amount of water required to cool data centers, to cool the servers that make
236
00:25:16,280 --> 00:25:22,440
not just AI, but the internet possible. So there's like a real material aspect to the technology that
237
00:25:22,440 --> 00:25:27,080
I think is important for students to interrogate, especially because, right, these are the young
238
00:25:27,080 --> 00:25:33,480
people who are going to bear the brunt of climate change. And so I find it increasingly difficult
239
00:25:33,480 --> 00:25:40,440
to square the narrative that technology, AI technology specifically, is going to revolutionize
240
00:25:40,440 --> 00:25:46,760
education at a time when that same technology is increasing the effects of climate change,
241
00:25:46,760 --> 00:25:51,960
is further entrenching the power of fossil fuel companies. There's a headline today about the US
242
00:25:52,680 --> 00:25:59,080
continuing coal production because of the energy consumption needed for AI production.
243
00:25:59,080 --> 00:26:05,960
So it is this, I think, hypocrisy and a real set of compromises that I think folks need to
244
00:26:05,960 --> 00:26:09,560
confront. And this is what I try to do with my own students, is like, if you're going to use these
245
00:26:09,560 --> 00:26:15,800
technologies, there are real material harms that they're doing to people, to planet, to humans,
246
00:26:15,800 --> 00:26:19,400
to more than humans. And, you know, you kind of alluded to this, but we also see, right,
247
00:26:19,400 --> 00:26:24,840
the ideological and financial benefits that go to people like Bill Gates and Sam Altman.
248
00:26:24,920 --> 00:26:31,240
There's been writing about what's called the TESCREAL bundle from people like Émile Torres and
249
00:26:31,240 --> 00:26:37,400
Timnit Gebru, and this sort of set of ideologies is essentially like eugenics and like the notion
250
00:26:37,400 --> 00:26:44,680
of like AI as some sort of superhuman, you know, project that is going to, I'm doing a very
251
00:26:44,680 --> 00:26:50,440
slapdash version of the TESCREAL bundle. We'll add some notes in the show notes, I suppose.
252
00:26:51,080 --> 00:26:56,760
In short, I think what an ecological framework offers is a more expansive set of questions,
253
00:26:56,760 --> 00:27:03,160
a more expansive set of sort of lines of inquiry than these kind of traditional corporate AI
254
00:27:03,160 --> 00:27:09,720
literacy curricula offer. Yeah, I was just going back and reading your tweet in response to
255
00:27:09,720 --> 00:27:15,800
the plan to unretire coal-fired plants as power demand from AI surges. And you wrote,
256
00:27:15,800 --> 00:27:19,880
AI is supposedly revolutionizing the future of education while making the actual future
257
00:27:19,880 --> 00:27:25,000
increasingly bleak. I don't know if there's like a sci-fi analogy or a monkey's paw or something
258
00:27:25,000 --> 00:27:30,200
like that, but it's literally as, you know, we're revolutionizing one space and immiserating the
259
00:27:30,200 --> 00:27:34,680
future in another one. It's just like there's a direct cost related to all of those things.
260
00:27:35,240 --> 00:27:42,600
I think perhaps like a criticism that I've seen of this Luddism when I had kind of posted a preview
261
00:27:42,600 --> 00:27:48,920
of our conversation that came up was about its potential to help with assistive technologies,
262
00:27:49,560 --> 00:27:56,360
the possibility to be a huge boon for disabled people or to even be assistive in the sense of
263
00:27:56,360 --> 00:28:01,720
providing accurate translation services or transcription services. And I think I'll mix
264
00:28:01,720 --> 00:28:06,200
my literary analogies a little bit more. My brain is kind of thinking in the sci-fi space, but
265
00:28:06,200 --> 00:28:11,400
I went to almost kind of like this William Gibson Neuromancer cyberpunk space in the sense where
266
00:28:12,040 --> 00:28:18,360
you have this overarching corporate structure and evil technology sort of predominates,
267
00:28:18,360 --> 00:28:24,520
but you have these niches of DIY mutual aid and the like kind of coexisting alongside that.
268
00:28:24,520 --> 00:28:31,720
That's invisible to the mainstream. I suppose that's not ideal. Or is it like the one ring?
269
00:28:31,720 --> 00:28:37,480
Should AI be buried deep in a mountain or thrown into Mount Doom? Because simply interacting with
270
00:28:37,480 --> 00:28:44,040
it has such a profoundly negative and transformative consequence for users and the environment as a
271
00:28:44,040 --> 00:28:50,280
whole, as you said, like embedding these ideologies at the cost of this invisible human labor. And
272
00:28:50,840 --> 00:28:56,920
it's the immiseration of the environment. So I wonder, like, I just don't know how, this is the question
273
00:28:56,920 --> 00:29:02,440
that I grapple with, right? Like how do we, oh gosh, I don't want to say take the good, but
274
00:29:02,440 --> 00:29:09,000
you know, like use it for where it could be like for more humane, pro-human purposes, or is it
275
00:29:09,000 --> 00:29:13,240
radioactive in the sense that it's better off just not to touch it and develop some other
276
00:29:13,240 --> 00:29:16,600
technologies? I don't know how to resolve this. I'm hoping you'll help me out.
277
00:29:18,600 --> 00:29:19,160
What do we do?
278
00:29:19,160 --> 00:29:23,240
I came here for you to figure out my problems, not for me to figure out your problems.
279
00:29:23,240 --> 00:29:27,080
Yeah. So I mean, again, I would think, you know, what is the, what are the historical
280
00:29:27,080 --> 00:29:31,560
Luddites from, you know, the early 19th century tell us? And one of the things I get to emphasize
281
00:29:31,560 --> 00:29:36,600
is that they're not, they're not anti-technology, but they're anti the concentration of power
282
00:29:36,600 --> 00:29:42,280
into the hands of a very few, you know, white men and corporations. And flash forward, we look at,
283
00:29:43,240 --> 00:29:51,240
you know, one example that I think can provide a kind of blueprint for assistive technologies
284
00:29:51,240 --> 00:29:59,960
that you're talking about is there's an organization, Te Hiku Media, and they are based in New Zealand
285
00:29:59,960 --> 00:30:06,280
which is the Westernized name of the indigenous lands of Aotearoa. And they produce
286
00:30:06,920 --> 00:30:13,960
Maori chatbots and other AI for like language acquisition and language revitalization.
287
00:30:13,960 --> 00:30:20,280
And I just saw one of their, like the chief information or chief technology officer came
288
00:30:20,280 --> 00:30:24,440
and spoke at the university recently. And there's a great episode featuring him on the podcast,
289
00:30:25,000 --> 00:30:32,200
Tech Won't Save Us. And so, you know, a few attributes of this company and the technology
290
00:30:32,200 --> 00:30:39,240
that stand in stark contrast to the OpenAIs of the world is that they take data sovereignty
291
00:30:39,240 --> 00:30:47,240
very seriously. And so they have these from like cassette tapes to like CDs to digitized
292
00:30:47,240 --> 00:30:53,640
audio of several generations of Maori people speaking and telling stories. And they've
293
00:30:53,640 --> 00:30:58,920
approached these people and received their consent to use these stories, to train these
294
00:30:58,920 --> 00:31:04,120
models to revitalize and maintain their language and to teach people, Maori people, their language.
295
00:31:04,120 --> 00:31:13,400
And so that to me is one example of care for people, for their stories, for their data.
296
00:31:13,400 --> 00:31:19,640
It is not an attempt to scale. I think that notion of scale is so problematic in so many ways,
297
00:31:19,640 --> 00:31:28,360
especially when it comes to AI. But it is a community-based project where, yes, there are
298
00:31:28,360 --> 00:31:34,520
compute consequences. And I think they're wrestling with how can they use solar? How
299
00:31:34,520 --> 00:31:40,600
can they use other renewable energies for their projects? But it is done in conjunction with
300
00:31:40,600 --> 00:31:47,640
the community. And so they've been approached by, I think, OpenAI and Google and others
301
00:31:47,640 --> 00:31:53,880
to kind of hand over their model or to help them train their other models, these automatic
302
00:31:54,760 --> 00:32:01,480
speech technologies. And they've said no. And so I think, again, we're thinking about assistive
303
00:32:01,480 --> 00:32:08,440
technologies in schools or in other spaces. And again, you mentioned these sort of DIY,
304
00:32:08,440 --> 00:32:15,640
let's fund the DIY groups and learn from one another in a more decentralized way rather than
305
00:32:15,640 --> 00:32:23,320
concentrate the funding and the compute power in the hands of three companies. Or like NVIDIA is
306
00:32:23,320 --> 00:32:31,320
the only company who's making the chips to run these technologies. So that, again, I think it
307
00:32:31,320 --> 00:32:38,280
is in the spirit of the history of the Luddites. And again, they may not call themselves,
308
00:32:38,280 --> 00:32:42,600
these Maori folks doing this work, they may not call themselves Luddites. And I think that's
309
00:32:42,600 --> 00:32:49,080
something to acknowledge, too, that if these sets of ideas are helpful, I think that's one of the
310
00:32:49,080 --> 00:32:53,720
arguments we make, then great. But we also, you know, the Luddites, and predominantly the people
311
00:32:53,720 --> 00:32:58,360
who have claimed that are white men. And so that is something to acknowledge and something to
312
00:32:58,360 --> 00:33:03,240
grapple with. And even today, I mean, I think, you know, that the group has become more expansive
313
00:33:03,240 --> 00:33:09,320
over time. So I think that's important to acknowledge. But I think that those tactics,
314
00:33:09,320 --> 00:33:15,160
the notion of playfulness, of organizing, of sort of community-based solutions to technology
315
00:33:15,160 --> 00:33:20,520
that embrace technology, and work with people on the ground to think about what are problems that
316
00:33:20,520 --> 00:33:27,960
we as a group face, and how can we use our traditional knowledge to address these problems
317
00:33:27,960 --> 00:33:34,040
without then, you know, having to rely on the Googles and, you know, the metas of the world.
318
00:33:34,040 --> 00:33:38,840
That, I think, is a promising example, is the best example that I've come across
319
00:33:38,840 --> 00:33:46,600
that I can point to to say it's not either we have AI or we don't. But it's, again,
320
00:33:47,240 --> 00:33:55,160
these sorts of very grounded projects in community that I think offer a third way forward that I
321
00:33:55,160 --> 00:34:00,840
think also is in the spirit of Luddism. I think that's a great point to bring it back to
322
00:34:01,400 --> 00:34:06,600
not the notion of technology, but the notion of power really situated in the work of the
323
00:34:06,600 --> 00:34:12,200
Luddites in really all, perhaps each of those waves. So I'm wondering if you have, you've
324
00:34:12,200 --> 00:34:18,920
mentioned a whole bunch of different authors, podcasts, anything else. Are there some highlights
325
00:34:18,920 --> 00:34:26,040
or are there like your top tier hits for people to go find out and learn more about these ideas?
326
00:34:26,360 --> 00:34:31,160
Where would you point people who want to kind of learn more, perhaps join the third wave of
327
00:34:31,160 --> 00:34:37,880
Luddism? Yeah, so someone who's been writing about Luddites for a while is Zachary Loeb,
328
00:34:37,880 --> 00:34:45,320
and he publishes under the name of Librarian Shipwreck. And that is on social media, and he
329
00:34:45,320 --> 00:34:54,920
has a blog. And he's got a few posts where he also has like a top 15 readings on Luddites,
330
00:34:55,000 --> 00:35:00,680
because I'm not an expert. Let's be clear, I've done some light reading. And so hopefully,
331
00:35:00,680 --> 00:35:06,280
if any Luddite experts out there are listening to this, please take anything I say with an asterisk
332
00:35:06,280 --> 00:35:13,880
next to it. Brian Merchant's book that came out that Phil and Antero and I reviewed for the LA
333
00:35:13,880 --> 00:35:18,200
Review of Books is called Blood in the Machine. I have it here in front of me, The Origins of
334
00:35:18,200 --> 00:35:24,200
the Rebellion Against Big Tech. That one's great. And that gives you, again, sort of on the ground,
335
00:35:24,200 --> 00:35:29,080
historical Luddites, but then he fast forwards to contemporary times and looks at different,
336
00:35:29,080 --> 00:35:35,320
sort of like the gig work economy and draws parallels between the kinds of organizing
337
00:35:35,320 --> 00:35:41,480
and resistance that contemporary gig workers are doing. Gavin Mueller has a book that came out a
338
00:35:41,480 --> 00:35:48,040
few years ago called Breaking Things at Work. And that's another one. So I'd say, start with
339
00:35:48,040 --> 00:35:54,120
those three. And if you really want to do deep dives, folks like David Noble, who's a historian
340
00:35:54,120 --> 00:36:02,760
and tech critic and others, there's no shortage of work on the Luddites. And I think that is
341
00:36:02,760 --> 00:36:08,520
in part because they've become a source of interest, again, really only in the last 100 years,
342
00:36:08,520 --> 00:36:14,840
as historians have sought to kind of rethink and reframe who the Luddites were and the kinds of
343
00:36:14,840 --> 00:36:19,160
political projects that they were engaged in. Because there is, I think, some debate about
344
00:36:19,800 --> 00:36:25,560
how organized the Luddites were, and how much of a political project around unionizing, as a political
345
00:36:25,560 --> 00:36:30,360
class. And so if you're into that kind of stuff, there's certainly reading for you to find.
346
00:36:31,320 --> 00:36:32,920
Thanks so much, Charles, for joining me today.
347
00:36:33,720 --> 00:36:35,560
Yeah, yeah. Thank you, Nick. I really appreciate it.
348
00:36:40,120 --> 00:36:43,880
Thank you again for listening to our podcast at Human Restoration Project. I hope this
349
00:36:43,880 --> 00:36:48,200
conversation leaves you inspired and ready to start making change. If you enjoyed listening,
350
00:36:48,200 --> 00:36:51,720
please consider leaving us a review on your favorite podcast player. Plus,
351
00:36:51,720 --> 00:36:56,200
find a whole host of free resources, writings, and other podcasts all for free on our website,
352
00:36:56,200 --> 00:37:05,240
humanrestorationproject.org. Thank you.
Should We Be More Like The Luddites?
Inspiration from the Luddites: On Brian Merchant’s “Blood in the Machine”
Record being placed on a record player.wav by HelterSkelter1114 -- https://freesound.org/s/409036/ -- License: Attribution NonCommercial 4.0
rope-making machinery running.wav by phonoflora -- https://freesound.org/s/201166/ -- License: Attribution 4.0