
Epilogue

[Figure: A statue of John Henry holding a large hammer.]

In American legend, John Henry competed with a steam drill to drive steel into rock and won, albeit at the cost of his own life, his heart bursting at the moment of victory. His story resonates with those who view technologies as threatening and alien, as competitors to humans. But, as I hope that I have demonstrated, his orchestration of tools was as much a technology as the steam drill, and far from being “other”, our technologies must and should include ourselves. The difference between Henry and the steam drill lay ostensibly in the orchestration of phenomena. Were the ends to which they were put the same? Perhaps not. For Henry, there was meaning and value in the accomplishment of the task, not just in the task itself, a meaning so important to him that it cost him his life. We are and should be concerned when the orchestration of phenomena is embedded in machines—instantiated by whatever or whomever—and is no longer the purview of people, because sometimes that orchestration is an important part of what defines our identities, and often the purposes run far deeper than those that dazzle us on the surface. Performing a task with skill, ingenuity, creativity, or simply strength is deeply entangled with our sense of self-worth, part of our identities, and a characteristic delight of being human. This is obviously true of artistic skills, but it is also true of many mechanical tasks in which humans are part of a hard orchestration.

There are countless benefits of simply doing, living, and being. There is value in doing a simple thing well, even when we know that it can be done better, faster, and more efficiently by machines, even when the output is no different from (and likely objectively worse than) what would be achieved by those machines. The ends to which we orchestrate phenomena are not always as straightforward and utilitarian as they might seem to be at first. In all things, no matter how mundane, playing the game usually matters more than getting to the end of it, and the rules of the game might not be those that we see most easily. My grandson’s gingerbread house might not be as objectively well built as one assembled by a professional bakery, but I would not swap one for the other. Technologies are often concerned with making our lives and the things that we do with our lives better, but this does not always mean that they should be faster, more efficient, more reliable, or more cost-effective. We should choose what we give up to the machine with inordinate care. This is especially true of education, which is above all about ways of being in human society.

It is equally vital to remove the shackles of prescriptive technologies of which we are unwilling parts. To be an unwilling cog in a machine is a very different state of being than to play the part of a machine out of choice. It is not always obvious when this is happening, and we should constantly be alert to the possibility. When, for example, we trust in an AI tutor to guide us, or accept the verdict of a learning analytics engine that tells us what we should do (or what we are doing wrong), we run the risk of becoming parts of its circuitry rather than it being a part of ours. When the design of an LMS, a timetable, or a classroom encourages us to take the easy path, we are letting ourselves be parts of it rather than orchestrating it as part of us. As soon as we become aware of this, we can take steps to change it: we can assemble it into our own orchestrations as just more stuff in the assembly, or replace it, or (sometimes) modify it. Perhaps we will even accept it as a good or necessary thing.

Many rules that we follow exist for good reasons that benefit not just us but also everyone, and some machines (e.g., the scripts that novice teachers might have to follow or the scales that musicians choose to practise) can help us to learn or do what we want to do more effectively. This is fine as long as we are aware of it, can take ownership of it as part of who we are, and see our place within the technology as part of what it is. The dynamic of the hard influencing the soft more than vice versa is a given that we cannot change, but the hard does not necessarily entail the soft: it enables it. This means that we can adapt our responses in many different ways. But we must never forget that learning is an essentially human process in which technologies mediate, facilitate, and engender but in which the creative, feeling, value-filled human (and the human’s society) is and must always be at the centre.

Education is about becoming the best humans that we can be, in a human context, with other humans. We should not blindly learn to be human from a hard machine, even though (and perhaps especially because) that machine might, such as through a generative AI like ChatGPT or Google’s LaMDA, embody the thoughts, beliefs, and processes of other humans. A hard machine—even one enacted by people—has no dreams, desires, beliefs, values, or purpose beyond that of its creator or manager, so we need to be wary. Let humans teach humans with, through, about, and to be technologies, of course, but where decisions are at least moderated by a person or people. The parts can and often should be hard, but the assembly and final orchestration must be soft.

Is this book a technology? Of course. It has been many. Some of these technologies are obvious: I am using language, mediated through print (on a paper page, on an e-reader, or perhaps with text-to-speech software) to put a point or two across to achieve some purpose. Some of the technologies are less obvious, such as the book’s structure, its approach to building arguments, the models and theories that the book expounds. It is a soft technology, both to me as the writer and to you as the reader, using skill in your interpretation and judgment as well as creativity in how you assemble it with other things that you know. My version of this technology is worlds removed from yours. What it orchestrates for me, and its uses, are utterly different from what it orchestrates for you and the uses to which you will put it. That is exactly as it should be. It is a rich assembly, orchestrated by many people, most of all by you.

If this book has helped you, however slightly, to think about what you know and how you have come to know it a little differently, then it has been a successful learning technology. In fact, even if you hold to all of your previous beliefs and this book has challenged you to defend them, then it has worked just fine too. Even if you disagreed with or misunderstood everything that I said, and even if you disliked the way that I presented it, it might still have been an effective learning technology, even though the learning that I hoped for did not come about. But I am not the one who matters the most here. This is layer upon layer of technology, and in some sense, for some technology, it has done what that technology should do. The book has conveyed words that, even if not understood as I intended them to be, even if not accepted, even if rabidly disagreed with, have done something for your learning. You are a different person now from the person you were when you started reading this book because everything that we do changes us. I do not know how it has changed you, but your mind is not the same as it was before, and ultimately the collectives in which you participate will not be the same either. The technology of print production, a spoken word, a pattern of pixels on a screen, or dots on a braille reader has, I hope, enabled you, at least on occasion, to think, criticize, acknowledge, recognize, synthesize, and react in ways that might have some value in consolidating or extending or even changing what you already know. As a result of bits and bytes flowing over an ether from my fingertips to whatever this page might be to you, knowledge (however obscure or counter to my intentions) has been created in the world, and learning has happened. For all the complexities and issues that emerge from that simple fact, one thing is absolutely certain: this is good.
