How will artificial intelligence affect learning, schooling, and the future of work? Many interesting conversations are taking place around these questions. Here are a few things I’m noticing and thinking.
Large Language Models as Super-Tutors
One-on-one tutoring has traditionally been enjoyed by the children of the wealthy. Will this change with large language model AIs (like ChatGPT)? Watching a demo of Khan Academy powered by GPT-4 offers hope. But we’ve also been here before. A dozen years ago, many people got very excited about MOOCs (Massive Open Online Courses). The doors to top universities have been flung open! The province of the elite is now accessible to all! College degrees are on their way out! But in the end, few young people wanted to sit around and watch online lectures. MOOCs proved most valuable to adult learners in other countries (like India) and to the higher education system itself (as a cost-saving measure). MOOCs became just another tool in a self-directed learner’s arsenal. I think the same will happen with text-based AI tutors.
It’s easy to get excited about scalable technologies. What’s much more difficult to scale is a culture that incentivizes young people to make use of these tools. As Henrik Karlsson writes, “If you live in a subculture where other things are more valued than intellectual growth (as is true of the vast majority of the youth cultures we exile our teenagers to), there will be limited social incentive to leverage tutoring and other opportunities to grow excellence. It is simply not what will give you status in that culture; it is not what you desire there, what makes you feel safe, what provides you with a stable and socially validated role.”
Culture is transmitted by close human relationships. Software like ChatGPT cannot develop human bonds. It cannot help a young person overcome emotional blocks, instill confidence, serve as a role model, or offer a hug. Perhaps when we receive a truly effective, speech-based AI therapist—one that can empathize and coach its way right past the Turing test—then we’ll be on the verge of AI super-tutors. Until then, and even then, I believe more experienced humans will be needed to show less experienced humans how, and why, to use such tools.
But the Kids Will Use the Technologies!
As a general rule, I ignore every concern that “this new technology will make it easier for students to cheat on tests/assignments!” That’s your problem, teacher, not your student’s. Get with the times. If your students are using ChatGPT to write passable 5-paragraph essays, stop assigning 5-paragraph essays. Any assignment that can be completed by ChatGPT is the new horse and buggy. As Lex Fridman quipped in a conversation with Sam Altman: “If AI is going to take your programming job, it means you’re a shitty programmer.” If your job as an educator is threatened by AI, it means… it’s time to adapt. The point of formal education is to serve and challenge the student, not to propagate current managerial or evaluation methods for the convenience of adults.
Outside of formal education, virtually every “test” is open-book, open-phone, and open-tech. When real people have a real job to do, they use every tool in the toolbox. I put my trust in schools (and other educational approaches) that happily harness new information technologies, encourage adults to explore them alongside students, and develop new assessment methods as needed, ideally with student input. Thus do educators stay relevant instead of becoming redundant.
What’s an Educator For?
Middle-school educator Chris Balme argues that the oncoming “third wave” of education won’t be about memorizing and spitting out content in competition with AI, but rather teaching young people how better to be human. For example, teaching:
- embodiment (body awareness, mindfulness, concentration, nutrition, exercise)
- emotional awareness and emotional management
- how to develop original, creative works (with the assistance of new technologies)
- how to discover a sense of purpose and develop meaningful quests for oneself
- how to connect with humans IRL, develop rapport, and have deeper conversations
I don’t feel personally threatened by AI because my work centers on travel, outdoor adventure, and developing high-level social skills, confidence, and sense of purpose. If an AI can take a group of teens on a multi-week international voyage, keep them safe and emotionally supported, while appropriately pushing their boundaries and encouraging growth at the same time—I want to shake that AI’s virtual hand.
This is why focusing on adventure feels like a worthy effort. People of all ages crave adventure, but especially young people. Adventure thrusts us into confrontation with the messy spheres of wilderness, travel, foreign culture, personal barriers, and complex human relationships. There’s no “app for that,” nor will there be. Whether we’re talking about human-powered pilgrimages or the splendid uselessness of starting a punk band or becoming a bird expert, AI will take no more than an assistant’s role in pushing human boundaries and discovering new forms of value and meaning.
I’m optimistic about a world in which smart technologies take over more and more “essential” tasks, leading us to find dignity and purpose beyond algorithmic work. Unschooling and related philosophies may be ahead of the curve in this respect, by preparing us to flourish through leisure.
Young people still need educators to create opportunities, curate resources, convene groups, and set behavioral norms. They need us to nurture, support, challenge, and provoke them. And they need our assistance in doing the deep work of adding to the library of human understanding: the same library which ChatGPT has mined so well, and which makes it seem so smart in the first place.