“5. Participation and Technique” in “How Education Works”
5 | Participation and Technique
A man provided with paper, pencil, and rubber, and subject to strict discipline, is in effect a universal machine.
—Alan Turing (1948, p. 113)
Virtually all technologies demand some kind of action or activity from us, from turning a dial to painting a masterpiece. We are not just using technologies when we do this: the use itself is a form of technology, a process that we enact so that the technology of which it is a part can perform the purpose that we wish for it. Thus, we do not just use technologies but also—as they are instantiated in specific contexts—participate in them. We are part of them, either as elements of a predetermined mechanism or as assemblers and orchestrators of something novel (and, typically, both). This participation can usually be described as “technique.” As I use the term, “technique” is the human-enacted part of any technology. An important feature of many technologies, and especially educational technologies, is that we are co-participants, directly or indirectly, with other people.
Sometimes our participation in a technology is predetermined, from actions as simple as pressing a button or operating a microwave oven to activities as complicated as correctly singing a piece of music or solving a differential equation. The techniques that we use must be enacted with precision, or the technology will fail to work. I will describe these as hard technologies.
Often, however, we have to (or choose to) perform some of our own organization in order for the technology to do what we ask of it. Technologies, from pencil and paper to learning management systems, leave a great deal unorchestrated, yet to unfold. They leave gaps that must or may be filled idiosyncratically and, often, creatively by their participants. The techniques that we use can vary considerably, potentially differing each time we use them. I will describe these as soft technologies. It is important to take note, though, of where the boundaries of such technologies lie because the softness is not inherent in the parts but unfolds only in the whole. What makes it softer or harder, in its enacted unfolding, is the role of humans in the overall assembly—their techniques—not the other parts of which the assembly consists. I will describe this definition of softness and hardness as participatory, reflecting the fact that the softness or hardness describes the kind of participation required of or enabled for the human (or other intelligent) participants. Before explaining this and its consequences further, I should distinguish my own from other common uses of these terms because, otherwise, preconceptions of earlier definitions of hard and soft technologies might make the rest of the book difficult to follow. In the process, I hope to demonstrate that the participatory definition is more useful than its predecessors.
Soft Technologies and Hard Technologies
Many authors have found it useful to divide the world of technologies into those that are hard and those that are soft. Most definitions fall into one of three main camps, which I label here as binary, socio-technical, and holistic, with respective foci on phenomena, use, and orchestration. Table 1 provides an overview of each of these definitions, as well as my own, characterizing the major differences and similarities among them.
The Binary Definition
The binary hard/soft distinction is a simple and everyday way of separating those technologies concerned primarily with human-mediated human processes from those that use physical (including digital) tools (e.g., Bessant & Francis, 2005; Burgess & Gules, 1998; Hlupic et al., 2002; McDonough & Khan, 1996). This, roughly, is the definition employed by the Association for Educational and Communication Technologies (AECT) in making its own soft/hard distinction with regard to learning technologies (Lakhana, 2014), so it is frequently used in educational literature. It is also implicit in the common use of the terms “software” and “hardware.” From this perspective, rules, theories, methods, managerial systems, and exam procedures are soft, whereas anything embodied in hardware or software (oddly enough)—including classrooms, whiteboards, and LMSs—is hard. The distinction might have some value in (and, for the most part, only in) management accounting, because essentially it has to do with whether technologies can be bought or sold. However, it fails abysmally to acknowledge that most technologies are mixes of the two (like the stick on the ground), and it leads to some nonsensical categorizations: for instance, a verbal quiz would be soft by this definition, but the same quiz online or on paper would be hard. That is not a useful distinction unless you are charged with accounting for your use of paper. It also seems to be unintuitive that a paintbrush is a hard technology whereas a rigid rule that cannot be broken is a soft technology. This is a confused and confusing definition if the intention is to describe a technology, and it is not the one that I will use.
| Definition | Pattern | Primary technology focus | Soft example | Hard example |
|---|---|---|---|---|
| Binary | Physical tools versus business processes | Phenomena | Exam regulations | Pencils |
| Socio-technical | Liberative versus dominative technologies | Uses | Pencils | Exam regulations |
| Holistic | Technologies and humans as part of one assembly | Orchestration | Drawing | Exam boards |
| Participatory | Flexible human roles in technologies versus inflexible roles | Phenomena, uses, orchestration, and users | Drawing | Multiple-choice quizzes |
The Socio-Technical Definition
Another common use of the soft/hard distinction for technology is concerned with how technologies affect us rather than their constitution. I call this the socio-technical definition. From the socio-technical point of view, softer technologies, however they are instantiated, are empowering, whereas harder technologies are disempowering, demanding that we must behave in particular ways to service their needs. As Don Norman (1993, p. 232) puts it, “hard technology makes us subservient, soft technology puts us in charge.” Terms other than “hard” and “soft” are used by some writers to describe similar concepts, such as Franklin’s (1999) distinction between “holistic” and “prescriptive” technologies or Boyd’s (1996) distinction between “liberative” and “dominative” technologies. Baldwin and Brand (1978, p. 5) are thinking along similar but not identical lines when they say that “‘soft’ signifies something that is alive, resilient, adaptive, maybe even lovable.” Although not explicitly defined, hard technologies, presumably, are none of those things. For them, softness relates to technologies that exist at a human scale, fitting local needs rather than organizational needs and acting for the benefit of all—including the environment—rather than the benefit of a few. For them, a bicycle or public transit system might be soft, whereas cars (and all their unequally distributed, environmentally destructive infrastructure) might be hard.
The socio-technical perspective recognizes the complexities that occur when we shape technologies and are shaped by them, the dialogue that occurs between designer and user, and the role that technologies play in shaping our working lives, our education, and our ways of being. All technologies are value laden; most behave in hard-to-predict ways when assembled and normally cause harmful side effects, and all are deeply intertwingled with many facets of our individual and collective lives. However, what is for one person a hard, opaque, and ugly technology that restricts patterns of behaviour can often be, for another person, a liberating technology that opens up vistas of creative possibility. Many educational technologies are liberative for some but dominative for others. For some, the LMS is a liberating technology that extends reach and pedagogical vocabulary, whereas for others it is a repressive instrument of domination and uniformity. For most, it is somewhere between the two, frustrating when it prevents some intention, liberating when it reveals hitherto unnoticed ways of teaching. I have frequently used pedagogies that deeply inspire some students but leave others quaking in fear because of the agency that they are forced to embrace. Most probably find these pedagogies to be somewhere between the two extremes, and few would agree on the balance.
The socio-technical definition tells us little about the constitution of such technologies because it is much more concerned with their use than with their orchestration. This does have some value in understanding technology roles in socio-technical systems. It is also a useful perspective when designing systems and tools that people will actually use and that will not cause harm. This is important in an educational context, in which students are often required to follow a rigid process toward accreditation and are often subjected to highly dominative and prescriptive methods of teaching both in the classroom and online. However, when we look closely at most technologies, a certain amount of hardness—in the sense of dominative and prescriptive effects—is inevitable and far from harmful, and there are technologies, from water and sewage management to protective legislation, that appear to be hard (from a socio-technical perspective) yet mainly are beneficial. It is a useful distinction, but the terms tell us little about the technology in question, and it is not how I will use them.
The Holistic Definition
For some writers, the distinction between softness and hardness, like the binary distinction, is concerned with the constitution of the technologies themselves, but in this definition humans and their intentions are what make them soft, whereas the lack of them makes them hard. It is thus a way of looking at both the technologies themselves and our intimate relationships with them. Like the socio-technical perspective, it is concerned with a continuum of softness to hardness—representing different levels of human engagement in them—but its focus is more on their orchestration and the roles that humans play in making them work. I will describe this as a holistic view because of its treatment of the entirety of the technology assembly, including the people using the technologies and the construction of the technologies themselves. For example, to Zhouying (2004), soft technologies are concerned with the human factors that are a necessary adjunct to harder processes and tools, relating to psychology, ways of thinking, and ways of using those tools. Laszlo (2003) describes hard technologies as physical embodiments of technologies and/or technological processes and methods, whereas soft technologies represent the support for individual and collective self-determination—design methodologies, decision-making processes, and so on—that they enable. Like the binary definition, the holistic definition allows that technologies can be almost anything created but recognizes that some processes are distinctly human, whether or not the technologies are physically embodied, and, like the socio-technical view, it considers the affective nature of technologies for both people and their organizations.
A more holistic perspective takes us beyond the affective definition of socio-technical perspectives, and it offers a more realistic and nuanced way of understanding the complex assemblies that form our technologies than the binary view. It is the closest of all families of definition to my own. However, it runs the risk of providing a definition of soft technologies that few would recognize as technologies at all. Although (following Arthur, 2009) psychological factors can indeed be phenomena in an assembly that are necessary for a technology to perform its job, they are no more technologies in themselves than the passion of an artist or the sensitivity of a musician; instead, they are features that describe users of those technologies and the impacts of those users on their use. Soft, yes, and significant phenomena in many technology assemblies, but not technologies, because they exist whether we incorporate them into a technological assembly or not. So, though the general principles behind the holistic definition are laudable and rich in their application, and this definition allows us to examine both phenomena and their orchestration as parts of a single whole, it goes a little too far in including the non-technological, and thus its value as a differentiator is undermined, especially if we take on board the complex nature of technologies as usually being assemblies of multiple technologies.
The Participatory Definition
My participatory definition takes into account Arthur’s (2009) insights into how technologies are formed and evolve through orchestrated assemblies. My definition of softness or hardness is essentially a measure of the degree to which humans participate in the orchestration of the final assembly: it is a description of the parts that we play in making the technology happen. Humans play predetermined roles in the orchestration when they are parts of hard technologies, whereas in soft technologies humans are the orchestrators. Hard technologies operate in fixed, invariable ways, whether or not they are physically instantiated, whereas soft technologies are pliable, relying on humans to engage in ever new ways of enacting them. This is more in keeping with common uses of the English terms “soft” and “hard” because softer technologies (demanding that orchestration be performed by their participants) are consequently more pliable, malleable, and giving, whereas harder technologies (in which humans play fixed and invariant roles) are consequently more rigid, more resistant to change, and more brittle. The participatory definition embraces phenomena, orchestration, and use as indivisible contributors to the same assembly. The softest technologies by this definition—those involving pencils, say—can be performed in almost infinite ways, whereas the hardest technologies—production lines or standardized tests, say—must, as long as they work, be performed in the same way each time to achieve their intended purpose.
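For readers who think in code, the participatory distinction can be caricatured in a few lines. This is my analogy rather than part of the definition itself, and all of the names in it are invented for illustration: in a hard technology the participant's role is fixed in advance, whereas a soft technology hands the orchestration to whoever enacts it.

```python
from typing import Callable, List

# Hard: the participant's role is predetermined and invariant.
# Every successful enactment is identical; deviation is failure,
# not creativity.
def wind_watch(full_turns: int) -> str:
    return "running" if full_turns >= 1 else "stopped"

# Soft: the assembly offers a fixed phenomenon (a pencil-like
# `mark` primitive) but leaves its orchestration to the human,
# who may enact it differently every single time.
def enact(orchestration: Callable[[Callable[[str], None]], None]) -> List[str]:
    page: List[str] = []
    orchestration(page.append)  # the human-supplied technique
    return page

# Two participants, two different technologies from the same parts:
shopping_list = enact(lambda mark: [mark(i) for i in ("eggs", "milk")])
poem = enact(lambda mark: mark("the quiet graphite speaks"))
```

The point of the sketch is that `enact` is functionally incomplete without its caller, whereas `wind_watch` already contains every decision that matters; the softness or hardness lies in the role left for the participant, not in the parts themselves.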
There is a continuum between softness and hardness. This is because almost all technologies are assemblies that consist of soft and hard technologies, and the human role in enacting any of those parts can range from completely prescribed to almost unconstrained and usually is a rich combination of the two. For example, when we write, we must obey more or less hard rules of spelling, citation practice, punctuation, and so on, and we must use the physical tools with which we write in fairly prescribed ways. However, there are limitless possibilities for creative expression and invention, and most things that we write will not have been written before. Hardness and (especially) softness are not characteristics of the parts of the assembly: they are characteristics of the whole, as it is enacted by one or more humans, in a real-life setting.
The vast majority of technologies are blends of this nature, often in complex ways, once they are brought together for some use. Typically there are many more parts to an assembly and many ways that the assembly itself is part of other assemblies, often involving iterative and recursive loops. For instance, teachers may use a hard lecture format, but students (performing their own sense-making orchestrations) may ask questions or look excited, then teachers may use those phenomena to soften what they do, which might incorporate, say, showing a hard video demonstration, and so on. As Fawns (2022) puts it, the phenomena, orchestrations, and uses are deeply entangled: they are mutually affective and ever shifting over time.
Figure 2 shows a few technologies that might be found in academic environments, listed in rough order from soft to hard. However, the order of this list can change considerably for different people, different orchestrations, different uses, and different assemblies that incorporate or might be part of these technologies. The pliability of a technology, for the most part, is a highly situated phenomenon that depends on the phenomena, the assembly, the orchestration, the use, and (above all) the way that the person uses it at the point that it is enacted. Only rarely is it a fixed aspect of a named technology in itself. Even a pencil can be hard if we are forced to use it in a particular way (e.g., to draw a straight line between two points), and, to its creator, regardless of its inflexibility to its operators, a production line can be soft. A great deal depends on where we choose to place the boundaries around the technology of interest. The boundaries of a soft technology always extend beyond the components (including methods, tools, natural phenomena, structures, etc.) of which it is composed, stretching fuzzily toward infinity, limited only by the imaginations of the participants. A complete description of a hard technology, by contrast, perfectly encompasses it and people’s roles within it. This does not mean that it cannot, through assembly, become part of another, softer (or harder) technology, of course, because technologies are assemblies that can become parts of other assemblies. Again, in labelling a technology as softer or harder, we are describing human roles in enacting that technology, not its parts.
The distinction is similar to that between hard and soft disciplines in academia: put simplistically, hard disciplines can be seen as those with right answers, whereas soft disciplines can be seen as those with many possible good answers. Hard technologies, as I am defining them, are invariant, always behaving in the same way no matter how they are instantiated, whereas soft technologies can be enacted in many ways. The more room left for humans to play their roles differently, the softer the technology.
This use of hard and soft is akin to that used by Checkland (2000) in his soft systems methodology (SSM), which treats hard systems as relating to well-defined problems and soft systems as those applying to fuzzy and ill-defined situations demanding dynamic adaptation and creativity. The participatory definition differs inasmuch as its application is to the nature of technologies rather than to the analysis and design of them (by my definition, SSM itself is a soft technology). SSM is concerned with understanding complex systems, leading to ways that we might go about changing them. Like the holistic definition, Checkland’s soft systems can include attitudes and values. The participatory definition is about the results of doing so: the technologies themselves. In my use of the term, the “fuzziness” of a soft technology is a temporary state that resolves into a concrete system when it is instantiated. By my definition, a soft systems design process can lead to a hard technology, while a hard systems methodology may lead to the design of a soft technology. The biggest difference between a soft technology and a hard technology is that part of the former is unknown and, typically, unknowable in advance, not that it is inherently blurry once it is instantiated.
Turkle and Papert (1992) make a similar distinction between a hard engineering approach to design, involving planning and a rigorous design model, and soft bricolage (or tinkering), engagement with the concrete in which a dialogue is enacted between the creator and the technology created. Like me, Turkle and Papert are concerned with the relative degrees of human engagement in the process and, more than Checkland (2000), interested in the product as much as the process. However, their interest in the product lies in how it differs (internally) as a result of a soft or hard design process. This might be of no consequence to end instantiators (users) of the resulting technologies. By my definition, the hardness or softness of those resulting technologies has little to do with whether they were built by engineering or bricolage. It is just as easy to enact a technology that is restrictive, inflexible, and resistant to change using bricolage as it is through engineering, and some of the softest, most pliable technologies in the world (e.g., most of those involving computers or pens) are more likely to be made by engineers than bricoleurs.
Finally, there is a connection between this definition and what Bijker (1987) describes as “interpretive flexibility”: that is, the ways that different technologies can be adapted or appropriated for different contexts. However, Bijker is concerned with the environmental, economic, cultural, and social conditions under which technologies can be adapted and appropriated, and particularly with how classes of technology can be adopted within a society, whereas my distinction concerns the nature of the technologies—as concretely enacted in particular situations—themselves. My understanding of technology is more closely related to actor network theory (Latour, 2005; Law, 1992) and activity theory (Engeström, 1999), inasmuch as it conceives of technologies as inextricable parts of human action and knowledge, but there the similarities mostly end.
The Enactment of Technologies
The softness of a soft technology comes from the phenomena being orchestrated anew each time it is enacted by a human or humans, whereas in a hard technology that orchestration has already been determined, whether or not it is actually enacted by humans. I use the word enacted to emphasize that technologies often are as much performed as they perform, whether by people or machines. When people perform them, they are participants in the technology, not just users of it: the techniques that they use are parts of the technology’s complete assembly. Another way of thinking about it is that technologies can be “realized,” in the sense of being made real, by humans as well as machines. We might say equally that they are instantiated, in the sense that they do not fully come into being until they are used within a specific context for a specific purpose. Technologies such as mental arithmetic, thinking in words, or just singing in our heads can be enacted (or realized or instantiated) entirely by people, whether they are hard or soft, though in many cases the enacted assembly typically includes something more tangible, be it software, bricks, pencils, fingers, or ink.
The participants’ role in soft technologies is variable and often creative, whereas their role in hard technologies is predetermined, predefined, and (if successful) invariant. This is not always obviously related to the component parts. For instance, unconstrained classroom teachers may use words in many unique ways (as a soft technology), but teachers following a script to say the same words would have no choice about their part in what appears to be the same assembly, assuming that they stick to the script. Greater softness can emerge in many other ways, such as tone of voice, expression, pacing, accent, and so on: there is still plenty of room for soft technique. Inevitably, because they reify human decision making in something non-human (be it a bearing system or a set of explicit rules that may not be broken), harder technologies tend to be more constraining and authoritarian, whereas softer technologies tend toward creativity and flexibility. Soft technologies are thus often more “alive, resilient, adaptive, maybe even lovable,” as Baldwin and Brand (1978, p. 5) suggest, whereas hard technologies tend to be prescriptive, making us subservient to their needs. However, hard technologies often do good (and, indeed, are essential since almost invariably they form parts of the assemblies of soft technologies), while soft technologies can cause harm as easily as good. There are many soft technologies of war and slavery that are far from lovable.
An archetypal hard technology such as an old-fashioned, mechanical, spring-driven wristwatch mostly performs its role independently of any intercession but demands that a human, from time to time, must wind it. Creative watch winding, if the watch is meant as a timepiece, is not recommended by the manufacturer. For the mechanical watch, at least when used as a time-telling machine, the human is a part of the orchestration, a necessary component of its assembly without which it simply will not work and in which the human has no choice but to act in a particular manner if it is to fulfill its function. There might be some small element of personal technique involved: twiddling back and forth faster or slower, for instance, or slowing down as it approaches being fully wound, or variations in ways of gripping the crown that make it easier or harder to wind, but all are just parts of the orchestration, implementations of a predefined method. The human role in the hard technology of a watch also includes reading the positions of the hands and using them to calculate the time. Creative interpretation of hand positions is rarely a good idea if the intention is to tell the time accurately. If it is to work as an accurate time-telling technology, then the person who uses it is a necessary part of a complete description of its assembly, and a complete description of the technology of telling time (including its human parts) is possible. It is also possible for the watch simultaneously to be part of many other technologies that can be equally hard (e.g., when used as part of a technology to identify location or direction) or softer (e.g., when used to indicate social status, or as a decoration, or as a metaphor in a play). 
The softness is inherent not in the watch, but in the way (with a human or humans) that it becomes part of the enactment of the technology that matters—the technology as it unfolds, as it is instantiated, as it is realized—be it timekeeping or status signalling. And, of course, it can be both at once but, when it is, there are two different technologies of interest, not one.
Like the mechanical wristwatch, the pencil and paper are useless without a human, but unlike the watch the human must orchestrate the phenomena that they provide, assembling those phenomena with other phenomena if they are to do anything at all. Many of those other phenomena are technologies themselves, such as methods of handling the pencil, rules of perspective, spelling, rules of grammar, and so on, as well as non-technological phenomena, such as suppositions about how signs made on the paper can influence or affect other people. There can also be some soft skills involved, such as imagery and metaphor. Without those extra parts of the assembly the pencil and paper are not just non-functioning but also functionally incomplete. They do not lack just one or two parts but are inherently open to becoming many different technologies—portraits, shopping lists, calculations, architectural plans, games, and so on—not to mention a host of entirely orthogonal technologies (e.g., a toy windmill can be made from nothing but folded and torn paper and a pencil). Separately, their adjacent possibles are at least as great. A pencil can be used equally as a stabbing implement, a coffee stir stick, a measuring device, part of an artwork, a table prop, a filler of a hole in a wall, a maker of a hole in a wall, and so on. A piece of paper can become an airplane, a dustpan, a fan, a means to wipe up a small spill, a sunshade, a coaster, a fire lighter, a hat, a filler of a hole in a wall, and so on. A complete list of all the possible ways in which these technologies can be used would be impossible to compile, both in principle and in practice.
Although pencil and paper are simple technologies, simplicity is not a prerequisite for softness. A school building, for example, is complicated but, though built mainly for the purpose of teaching, can be used for an indefinitely large number of purposes in any number of technology assemblies, from a voting booth to a bomb shelter, from emergency housing to a place of worship. At least to their programmers, computers are perhaps even softer technologies than pencils or schools, with even more possible uses, and more possible ways that they can be orchestrated or assembled, though they are among the most complex objects ever manufactured. No matter the complexity of a technology, if it allows or requires humans to use it in their own orchestrations, then it can be described as soft.
Many technologies can be soft or hard depending on use, phenomena, and orchestration. A screwdriver, for instance, can be used with precision as a hard technology to correctly drive a screw. However, as Kauffman (2008) shows, there are no limits to its other possible uses, including murder, paint stirring, and back scratching.
Perspectives and Points of View
A screwdriver can be soft or hard because humans are a necessary part of any technology in which it plays a part. This is true of almost any tool because, by definition, a tool is used by someone. Different users of the same tool may put it to different uses and organize different stuff to do so, and they may thus use an entirely different technology from one another, even though significant (and often the most visible) parts might appear to be the same. The classroom, for example, is a different technology (with varying degrees of hardness) for a teacher, a student, an administrator, and a principal because they are users of very different (though overlapping) phenomena and put them to very different uses. Similarly, to the creator of an online form, it is usually soft, but for someone required to fill it in it can be hard, and the use of the form will be different for each of them. Each will orchestrate different phenomena for different purposes.
Skill matters too. For example, the fact that I might be able to modify the code of the software that produces the online form makes it a much softer technology for me than for someone without such skill. However, this is also subject to a range of other technological constraints, such as permission to access the system where it is installed, its licence, and access to suitable software to upload the code, not to mention the time that it would take to write the code, any or all of which can be more significant in determining its softness or hardness than the software itself. What we conventionally label a single technology—Moodle, say, or Blackboard—is (when used) often far from it, depending on the boundaries that we choose and the perspective from which we approach it. Indeed, for a system of such complexity, full of softness as well as hardness in its assembly, it can be part of a different technology for every person who enacts or participates in the enactment of it.
Softness and hardness can occur in different parts of an assembly as well as in different uses of that assembly. For instance, an LMS can be hard in the sense that every person who uses it does so in the context of an architectural unit of a “course” but can be soft in its toolset (at least for a teacher). It can incorporate a rigid tool for assessment but a flexible tool for content creation. Furthermore, it is important to remember that we are dealing not just with pieces of an LMS but also with a bigger assembly, of which the LMS is only a part. Some people (especially students but often also teachers) will be required to use some of what might otherwise be softer parts because of externally applied rules, for instance, or because of its role in a course. There is a world of difference between a lecture that a student is forced to attend and the same lecture that an observer attends voluntarily: they are different assemblies, and different technologies, when we extend the boundaries to include all that are relevant (in this case, course regulations). There can also be less obvious boundaries to consider. For instance, if an online teacher wishes to engage students in debate, there might be only a single discussion forum available in the LMS provided. This is not particularly hard because (assuming that regulations allow it) the teacher might use a different system instead. However, that can demand plentiful counter-technologies (e.g., manual registration, protection from privacy violation, learning a new interface, etc.) and therefore effectively be no choice at all. Its hardness lies not in the LMS itself but in the teacher’s rigid adherence to the pedagogical method and perhaps the norms and expectations of the teaching role. They contribute to setting the boundaries that we must consider when identifying the technology of interest.
Soft Is Hard, Hard Is Easy
When a hard technology is used to replace something that a soft technology could do, one of its fundamental benefits is that it demands fewer decisions to be made by those who use it: in this sense (and only this sense), it makes things easier. Whether enacted in hardware or by humans, it requires less decision making because at least some of the thinking has already been done for us. In the process, we usually gain reliability, consistency, and replicability. It can still be difficult—mental arithmetic or correct interpretation of legislation, for example, is hard in every sense of the word—but there is only one correct way to do it. Although we will encounter exceptions and provisos as the chapter progresses, hard technologies are therefore highly amenable to automation, which can increase reliability and consistency and usually save time.
Softer technologies make things harder (more difficult) in the sense that they demand decision making and invention. The softest technology would be none at all, leaving its enactor to invent everything about it. This would be extremely difficult, and virtually impossible to label, because there would be nothing fixed about it. In real life, no such technology exists. To be able to call it a technology implies that there must be at least one or two hard phenomena (usually but not always or only other technologies) to orchestrate. Soft technologies fill gaps, not unlimited empty spaces. For example, it does take a lot of time and effort to develop the hard techniques for handwriting (holding a pen, forming letters, mastering spelling and punctuation, etc.), but once they have been learned we rarely need to think much about them in the future. It is not that it is trouble free—this is why it makes sense to harden that technology further, through technologies such as word processors, spell checkers, speech-recognition tools, and typewriters—but that we seldom have to think about it: we have created a machine in our minds that does the work. If, though, we are writing an essay, a poem, a book on learning and technology, or even a shopping list, then a different kind of difficulty emerges, because we need to orchestrate those hard skills, tools, and much else besides to create something that has never existed in the world.
Technologies do not necessarily simply replace things already done by people. Often they orchestrate things that would be difficult or impossible for humans to accomplish alone, such as providing the thrust needed to lift a rocket out of the Earth’s atmosphere, or calculating pi to a billion decimal places, or just moving a heavy rock with a lever. Making things easier, and/or making new things possible, are normally the reasons that we invent technologies. However, once they have been created, the same principles apply, regardless of whether a technology extends or improves what we can do unaided: a hard technology can increase the adjacent possible because it enables us to create and instantiate further soft technologies that incorporate the hard technology. It extends our boundaries, but within its own boundaries our roles (if any are left to play) can be fixed.
A soft technology can be enacted well, but it cannot be enacted correctly, because there is no single correct way of doing it. For the softest of technologies—for example, for painting or architecture—there can be no upper limit to what “well” means, no gold standard of measurement that can be applied consistently. We might recognize excellence but it will be impossible to say that it is as excellent as it could be.
Because they are closer to functional completeness, harder technologies are less flexible and less adaptable than softer technologies. They are also less open to change, less capable of evolving, less resilient to perturbation, more brittle. This is an inevitable and invariant trade-off built into the definition itself. If we make things too hard, then we take away the power of creativity, take away control, remove flexibility. But the solution is not therefore to make all technologies softer, because in doing so we introduce more potential for error, limit the adjacent possibles that allow us to do more, and reduce efficiency.
Softening and Hardening through Assembly
Almost any hard technology can become part of a softer technology when assembled in the right way with appropriate methods and other phenomena (including other technologies). Even an archetypal hard technology such as an automated manufacturing machine can (say) provide warmth to dry clothes, or be used as steps to reach a light on the ceiling, and it might even make a serviceable bottle opener. It also takes little more than the application of a rule or rigidly prescribed method to turn even the softest of technologies into something much harder. Even natural movements such as walking can become a hard march or a dance that must be enacted precisely.
Of course, equally, we can replace one technology with another, softening or hardening the whole in the process, and we can make changes to parts of the assembly that will make it softer or harder, though it is important to note that simply softening or hardening one part does not necessarily affect the whole in the same way: the orchestration and the rest of the assembly usually play significant roles in this. For instance, the submission of coursework by email can be a soft technology for both the teacher and the students, allowing the teacher to accommodate individual circumstances, to cater to difficulties producing appropriate file types or sizes, or to forgive late submissions. Conversely, a hard equivalent—typical of many default LMS implementations of coursework submission systems—might prevent all those actions. However, the softness of email submission might well be overridden by hard institutional regulations or even something as simple as local restrictions on email.
Sometimes even disassembly can soften or harden a technology. Bricoleurs often take parts or even whole assemblies from one machine in order to build another, for instance, and many kinds of makeshift repair rely on disabling or removing non-working parts so that at least some functionality remains. Although this can be more restrictive and therefore harder, it can also be softer, as when a broken automatic controller is bypassed with a manual operation.
As soft systems grow softer through assembly, then, as long as the additions do not restrict what was already possible, they can actually become less complete the more we add to them, each new addition increasing the adjacent possible, so they become more dependent on our creative input. A little like fractal figures that, as we zoom in to look at them in greater detail, turn out to be infinitely empty as well as infinitely full, the more we add to a soft system assembly, the greater the range of new and different options in addition to those already available and thus the further the technology moves from completeness. To a large extent, it is this dynamic that Kelly (2010) observes when he talks about “what technology wants”—the ever-expanding range of adjacent possibles drives technological evolution inexorably forward and to ever-greater complexity. Soft technologies are inherently dynamic and forward looking, always capable of change, always evolving, because with each actual comes new possibles. Think, for example, of the ways that we can build a model out of clay, in which each lump of clay opens new opportunities to place the next. This can be a curse if the need is for efficiency and focus on a problem. Sometimes, for instance, it is far better to use a restricted painting program than a full-blown installation of Photoshop because, unless we have a lot of the hard skills needed to make Photoshop do more complex things, there are too many possibilities with which to deal. However, for open problems that demand creative solutions, and in a world that constantly emerges and transforms in complex ways that are anything but designed, soft technologies can be very useful indeed. Although they often take second place in our imaginations to the flashier hard technologies that allow us to do things that we could not do before, soft technologies are at least as much engines of progress as their harder kin.
Softening through Automation
There are some possibly counterintuitive features of assembly. One particularly interesting example is that of automation. Although it is often demonized as a dehumanizing and hard technological pattern, and often plays that role when it replaces a formerly soft human process, there are many occasions when automation can actually soften a technology.
Twitter, for example, is soft because it is and can be many different things. One big reason for this is that one of its primary uses is as a connector to other resources, so it can become a critical part of a much larger assembly, adding social sharing to almost any web-connected technology. The restriction, for most of Twitter’s early history, that limited posts to 140 characters might be seen by some as a deliberate hardening, but that is to misunderstand the role of Twitter as part of the assembly of a larger technology. As Rose (2012, p. 206) explains, “it has enabled Twitter to achieve a significant paradox: maximum freedom through ultimate constraint.”
A big part of what makes it so flexible is that it does one small trick, like a stick or screwdriver or wheel, and like those technologies it needs other technologies, soft and/or hard, to make it complete, such as websites to display linked pages and images or user-defined mechanisms such as hashtags, abbreviations, and other processes to increase the meaning of the transmission. Its deliberate limitations are what make it so useful, because it embodies (or at least, when it was created, it embodied) one tiny, precisely delineated, but easily connectable tool that could be assembled with and into many other technologies. Unlike other technologies of the time that served similar essential purposes—for instance, social bookmarking systems—Twitter was soft enough to be aggregated in many ways with many different technologies and ways of working. Its lack of an obvious, well-defined purpose was its greatest strength.
Twitter’s subsequent evolution illustrates how automation can soften. For instance, the use of hashtags (e.g., #softtech) to classify subject matter into sets, the use of @ symbols (e.g., @jondron) to refer to people in networks, and even integrated hyperlinks were not part of Twitter’s original design. They started as conventions adopted by users of Twitter to turn it into a more useful technology for their particular needs, adding new functionality by inventing processes and methods aggregated by them with the tool itself (Johnson, 2010). These were both hard and human-enacted technologies: they were techniques that had to be performed with precision, or they would do nothing at all. They were prone to error, they were not understood by all who read them, and using them was a manual and not altogether trivial process, involving generic search tools to seek hashtags or @ references and scanning manually for results. Observing these patterns, the makers of Twitter subsequently automated these technologies, bringing efficiency and freedom from error—classic hallmarks of a hard technology. However, far from making Twitter more brittle or harder, this automation softened it further, because Twitter was aggregating them with the assembly, not replacing or subtracting any part of it. These additions opened new and interesting adjacent possibles (e.g., mining social nets or recommending and exploring tags). Crucially, the hardened parts took nothing away from what Twitter could do previously: users of it could ignore the new functionality if they so wished, without suffering anything worse than a few underlined links. Other features added to the Twitter ecosystem, such as photo and video sharing, have taken none of the system’s original flexibility away, despite automation, but added to the adjacent possibles of the system as well as made complex tasks simpler to perform. 
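To make this hardening concrete, here is a minimal, hypothetical sketch (my own, not drawn from Twitter’s actual code) of the kind of pattern matching that turns the user-invented hashtag and @-mention conventions into automated features, replacing the manual searching and scanning that early users had to perform:

```python
import re

# Simplified patterns for the two user-invented conventions.
HASHTAG = re.compile(r"#(\w+)")
MENTION = re.compile(r"@(\w+)")

def extract_conventions(post: str) -> dict:
    """Find hashtags and mentions that users once had to search for manually."""
    return {
        "hashtags": HASHTAG.findall(post),
        "mentions": MENTION.findall(post),
    }

print(extract_conventions("Reading @jondron on #softtech and #automation"))
# → {'hashtags': ['softtech', 'automation'], 'mentions': ['jondron']}
```

Once conventions like these are machine-recognizable, they can be linked, indexed, and recommended automatically, which is precisely the aggregative softening described above.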
While Twitter has undergone many changes and its future is uncertain (as I write in 2023), these lessons have been applied in many other systems, most notably in functionally similar but federated applications like Mastodon and Bluesky.
There are many similar examples, from the automation of email attachments through embedded MIME (Multipurpose Internet Mail Extensions) enclosures replacing the manual use of uuencode and uudecode tools, to automated parking systems for cars, to the addition of electric motors to pedal bikes (assuming that pedaling remains an option). Notwithstanding a host of undesirable consequences—greater difficulties in maintenance, increased complexities in construction, greater ecological impacts, more expensive assemblies, and so on—automation, when it takes nothing of note away from the softer technology, often can result in increased softness rather than greater hardness. Those additional consequences are often significant, however, and typically demand the creation of counter-technologies to deal with them. Technologies remain, as Postman (2011) said, a Faustian bargain.
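The MIME example can be illustrated with a few lines of Python’s standard email library (the addresses and attachment are invented for the sketch): what once required manually running uuencode, pasting the result into a message body, and trusting the recipient to run uudecode is now a single automated call.

```python
from email.message import EmailMessage

# Build an ordinary text message.
msg = EmailMessage()
msg["From"] = "sender@example.com"
msg["To"] = "recipient@example.com"
msg["Subject"] = "Report attached"
msg.set_content("The report is attached.")

# One call encodes the bytes and restructures the message as multipart MIME,
# automating what used to be a manual uuencode/uudecode routine.
msg.add_attachment(b"raw report bytes", maintype="application",
                   subtype="octet-stream", filename="report.pdf")

print(msg.get_content_type())  # → multipart/mixed
```

The hardened encoding step takes nothing away from what email could already do; it simply removes a fragile manual technique from the assembly.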
Hardening through Automation
Automation can often lead to more dehumanizing patterns, of the kind that Cooley (1987) rightly abhors when he calls for technologies of information rather than automation. As I write this in 2023, Twitter now filters top tweets by default, making them more prominent and thereby hardening the soft process of discovering interesting tweets. Although all tweets remain available to those willing to look for them (and those who know the extremely arcane spells needed to disable the automatic filtering and sorting), it is significantly more difficult to do so. Twitter thus partly dictates how people use it and removes some of the decisions that they formerly had to make, piping the “naturally” ordered list of tweets into a filter that, though not eliminating choice, makes some choices far less likely. We are all becoming increasingly familiar with the risks that such automation can bring, from filter bubbles (Pariser, 2011) to effects on election results of entire countries by actors bent on manipulating the algorithms to their own benefit.
The lessons of Twitter should not be lost on educators who seek to increase adjacent possibles. Just as the addition of well-chosen hard technologies softened the technology for its users, so too teachers (including autodidacts) can aggregate different technologies to support learning. From active hyperlinks in online presentations to uses of YouTube videos in classrooms, the possibilities opened up by automating parts of the pedagogical process are manifold. Moreover, they lead to new adjacent possibles—flexible paths for learners, integration of online and face-to-face activities, and so on—that create the potential for aggregation with new and different pedagogies that would be impossible had those steps not been taken. Equally importantly, it is not necessary to eschew the benefits of hard technologies in order to gain those of softer ones.
The key is in the assembly, not in the parts assembled. Building technologies out of small, well-defined, connectable, replaceable pieces is a powerful design pattern that brings with it the benefits of both the soft and the hard. However, as the increasing intrusiveness of Twitter’s sorting and filtering algorithm shows, it is easy to harden too much. Even when all that we do is make something a default, it can radically affect behaviour. In my own research (Dron, 2006), I discovered that 99.15% of over 6,000 courses in my institution’s LMS accepted its default landing page of course announcements. This presents a much harder and teacher-centric view of a course than, say, one that presents discussions or student blog posts first. Most of the exceptions that presented a softer perspective were my responsibility as either course or program leader. This was the case even though it easily could be changed with “only” a few clicks of a mouse button. Following up on these findings, I discovered that, even among the presumably computer-literate teachers of the Faculty of Information Technology, over 78% did not know that this could be changed, and on being informed of it over half said that they might change it in their courses, with over 15% saying that they definitely would do so.
My intervention that informed them of the possibility softened the technology for them, though no change occurred in the underlying platform. The power of defaults runs deep and broad. Most of us normally read books in sequential order of pages, and most of us sit in chairs when they are provided. There are seldom rules that force us to do so, but in general there are what Gibson (1977) describes as “affordances” (what I prefer to think of as “propensities”), the likely ways in which we will interact with technologies thanks to features of their design. It typically takes effort and creative thinking to depart from defaults, and for the most part most of us have insufficient time, attention, energy, or desire to behave otherwise. As a computer programmer, I could, in principle, make a computer do anything that it is capable of doing, but it would be crazy for me to rewrite the operating system or build a new word processor.
Softness and Creativity
It is not a coincidence that all the technologies of arts and crafts of all kinds are inherently soft: they are about filling gaps. Similarly, it is a feature of the vast majority of social technologies that they are fundamentally at least fairly soft. In most circumstances, it would make no sense to automate dialogue and social engagement, though we can and do shape and channel behaviour in many ways to affect the forms and outcomes of social interaction, often with a clear purpose. For example, the StackExchange family of sites is built to provide reliable answers to questions through a process of dialogue, based upon the assumption that some answers will be more useful and reliable than others. It thus makes use of user upvotes and downvotes, as well as “karma” ratings to assess the reliability of those providing answers, in order to shape the dialogue visibly. Even when shaped this way, the system affords great flexibility, and the capacity for better answers to bubble up to the top is one of the major benefits of systems such as StackExchange, SlashDot (which uses multiple dimensions of ratings to indicate, say, humour, accuracy, and so on, as well as sophisticated mechanisms to allocate temporary moderator roles randomly to those with sufficient karma points, thus avoiding persistent power relationships), or Reddit (which uses simple up and down votes that express only likes or dislikes but requires all Redditors to have earned their own ratings to give them to others). Social technologies (by their nature) allow people to communicate and thus negotiate processes and meanings, to add further parts to the assembly that allow people to change the rules, methods, and procedures.
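The “bubbling up” that such voting systems enable can be sketched in a few lines. This toy example (my own illustration, not any site’s actual ranking algorithm, which would weigh many more factors) simply orders answers by net vote score:

```python
# Hypothetical answers with up- and downvote counts.
answers = [
    {"text": "Try restarting.", "up": 3, "down": 1},
    {"text": "Check the config file.", "up": 12, "down": 2},
    {"text": "Reinstall everything.", "up": 5, "down": 9},
]

# Better answers "bubble up": sort by net score, highest first.
ranked = sorted(answers, key=lambda a: a["up"] - a["down"], reverse=True)
print([a["text"] for a in ranked])
# → ['Check the config file.', 'Try restarting.', 'Reinstall everything.']
```

The hard sorting rule is fixed, but what it orders—the dialogue itself—remains entirely soft.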
Our softest technologies of all—such as language, writing, and computers—unfold into an infinitely rich range of new and enriching technologies and artifacts that bring usefulness and value to human lives. Unfortunately, the effort involved in their operation, while making them deeply human, also makes them slower and more prone to error compared with their harder cousins performing similar tasks (notwithstanding the fact that they can become or be part of harder technologies). However, leaving aside the manufactured ready-made objects beloved by Warhol or Duchamp (where it is not the object itself but the concepts with which it is assembled that make the work), I would normally prefer a portrait painted by a three-year-old child to the perfect lines of an automatic drawing machine that uses a photo as its basis. Some things simply should not be hardened because they are ways of expressing and communicating our individual creativity and invention. These are things that define and fulfill us as human beings. In art, we do not need, nor should we normally seek, perfection from the point of view of the viewer, reader, or listener. It is precisely because of individual interpretation and invention that artworks have value, so, if we take that away and harden it, then there is nothing of any value left. Film, poetry, music, painting, sculpture, or fiction leaves spaces to be filled by the viewer, listener, or reader, a notion taken to its extreme in John Cage’s 4′33″, (almost) nothing but silence, and that is the point. It requires listeners to pay attention to the sounds around them, to orchestrate their own experiences. The techniques and inventions that fill the gaps in soft technologies also fill the gaps between us.
Soft technologies are innately accommodating of diversity: because they are open to the future, they can play out in myriad ways. Softer technologies can have infinitely many uses in an infinite number of technologies. But, as always, it is important to remember that all soft technologies involve at least some hard technologies, that softness can be achieved by assembling harder technologies, and that, in many cases, those harder technologies in the assembly are what make creativity possible in the first place.
Hardness and Creativity
Soft technologies are innately rich in creative potential, whereas hardness by definition provides none. However, hardness is no barrier to creativity as long as it is part of an assembly that is or can become softer. Almost all soft technologies have harder elements integral to their assemblies, whether they be techniques, physical or virtual structures, or rules.
Sometimes, even as they reduce freedom, harder technologies can provide boundaries and obstacles that act as stimuli to creativity (Boden, 1995). If technologies were entirely restrictive, dominative, and prescriptive, then we would have no means to be creative, but creativity can emerge whenever there are gaps that can or must be filled. More often than not, we find ways of assembling hard technologies with other technologies to make them softer, a creative process that might not occur if the technologies were overly soft in the first place.
For instance, a teacher faced with the need to fill an hour of time allocated to a lesson, with a predetermined curriculum that needs to be addressed, might find it easier to do so than to imagine how learning can happen with no constraints, and such constraints certainly will help learners to focus on goals, and means of achieving them, no matter what happens in that allotted slot. We are finite beings with finite attention spans, and constraints, up to a point, can help us to structure our thinking. Of course, we have limitations that vary considerably according to task and context. Although it might be useful sometimes, the requirement to fill an hour (no more, no less) is an almost completely arbitrary constraint that often can be the opposite of liberating.
When technologies are too soft, we have nothing to kick against, no reason to choose between a potential infinity of options. Too many choices are as bad as no choice at all (Schwartz, 2004). An excess of softness is what causes the tyranny of the blank page as much as it leads to the increasing challenges of information overload that modern networked societies face. Even the technologies involving the stick on the ground have harder parts—methods, techniques, rules, and so on—that, though flexible, provide some level of structure and replicability.
As Brown (2009, loc. 233) puts it, “without constraints design cannot happen.” The principle of the adjacent possible is not just an opening out of opportunities but also a channelling, creating an ever-growing supporting structure of foundations upon which to build, a pattern of path dependencies that, as we have seen, can play a hard and structural role, though what led to them in the first place was anything but.
Although a certain amount of constraint can support creativity, when it gets in the way and prevents us from doing what we would like to do, a hard technology can become positively harmful. It acts as an obstacle, an authoritarian channel that determines what we can and cannot do whether we like it or not.
The worst hard technologies are not only restrictive in themselves but also demand high levels of dehumanizing skill to operate them. They are dehumanizing because they entail the loss of free will in determining how they will operate. Implicit in this behaviour is the fact that, in bowing to the will of the machine, people are bowing to its creators, owners, or managers. This is the kind of technology that many—such as Ellul (1970), Franklin (1999), Mumford (1934), and Norman (1993)—rightly despise. Such technologies are often associated with inequalities and power relationships because what is automated is often for the benefit of the creator or owner of the technology rather than the person who must become a part of it. A production line is for the use of its owners and shareholders, not for its producers who enact the technology. The orchestration, the phenomena, and the use are all for someone else, so humans are nothing but parts in the machine, providers of phenomena orchestrated and assembled by the machine’s owners to achieve their own purposes.
The creators, owners, or users, though, can be us. We use countless hard technologies for our individual or social benefit. Egg timers, practice regimes, meditation rituals, telephone numbers, and computer backup applications are at least as hard as any factory or bureaucratic system, yet they appear mainly to benefit their end users. As always, perspective matters: the issue here is not so much whether a technology is softer or harder but who controls its use and whose purposes it serves. Often such control can be shared. For example, though I may choose to use my fitness watch and benefit from the control that it gives me over my exercise regime, its creators can impose ways of using it on me (e.g., nagging reminders to jog) that I might not want or even loathe (I do). Hardness is a continuum, not an absolute binary distinction.
It is critical to understand that prescriptive and dominative hardness matters only to an individual whose choices are forcibly limited by the hard technology and that this is not necessarily a feature of the specific technology itself but of the socio-technical context in which it is applied: of the whole technology and (most significantly) the use to which it is put, not just the parts of the assembly around which we choose to place our boundaries. More often than not, the problematic aspects of any given technology are the rules, norms, and constraints overlaid on the parts, not the parts themselves. What makes examination systems or classroom attendance requirements hard is the fact that they are assembled with further technologies—sets of rules—demanding that their users obey them, with significant penalties for those who do not comply. There is nothing wrong with any of these things if they are freely chosen by people who must play their fixed roles, with nothing further riding on them. Many of us enjoy taking quizzes and tests of our competence when it is our choice to take them and nothing much depends on our success. There is a vast industry of quiz books, sudoku, crosswords, jigsaw puzzles, and so on that gives great pleasure to many. They are parts of different assemblies, with different boundaries, than the same quizzes and tests used to judge other people. Boundaries really do matter.
For those who have control of them, hard technologies can do a great deal of good. For instance, automated light rail transit and personal accounting systems typically are fairly hard technologies for end users, who have to play well-defined and invariant roles that can liberate far more than they inhibit because the end users can choose whether or not they have value and whether or not or when to use and participate in them. They are part of an assembly that, as it grows, becomes as soft or as hard as needed. It would be unwise to underestimate the value of delegating control to someone or something else in order to free ourselves to have more control, more capabilities, more options, greater comfort, greater safety, greater convenience. We harden technologies for good reasons most of the time, and, as long as we are aware of and not required to conform to their demands, they do much good.
Indeed, when a system is too soft, we tend to create our own boundaries to give us something to hang our ideas on or to kick against. For instance, to overcome the tyranny of the blank page, we might use ritual boundaries, such as introductions, conclusions, or the accepted formal structures and phrases used in letter writing to help give a form to our writing. As long as the signals that emerge through the boundaries have value to us, boundaries are essential to creation and critical in enabling us to function in our environments. Without boundaries, nothing would exist, or, if it did, then there would be no means to distinguish one part of it from another.
Technique, Soft and Hard
We can now examine more closely the notion of technique, first discussed in Chapter 3. Techniques, the ways in which things are done by people, invariably have some harder elements that might be described equally as methods or procedures. There are techniques for strumming guitars, for drawing, or for developing photographs, and all of them refer to the hard roles that we play as part of a hard technology: they are methods that can be codified, mechanized, repeated. However, as alluded to earlier, there is also an idiosyncratic element to virtually all techniques: even on and off buttons can be pressed in different ways (though it might not make a lot of difference to the operation of the technology of interest). Although we might talk of “perfecting” our technique, the reality is that it is often impossible and, as we will see in detail in Chapter 8, might be positively undesirable. It is highly unlikely that Leonardo da Vinci, gazing at the finished Mona Lisa, believed that his painting was an example of technique that could be improved no further. And, of course, it was not, no matter what we might think of the painting as a whole. And, though we might be deeply impressed with the hard skill of a photo-realist artist in producing paintings or drawings indistinguishable from photographs, it is normally the compositions and contents, rather than the fine attention to detail, that move us.
Most of us would rather view the wild, organic, chance-filled brush strokes of Vincent van Gogh than a painstaking replica of the same subject traced from a camera obscura. Technique, in softer technologies, is infinitely or at least indefinitely malleable. It is always capable of refinement, and it is always capable of reinterpretation and re-evaluation in the light of the ever-unfolding adjacent possible. Soft technique is not quite the same as creativity, though it might be an engine that drives it. Soft technique is often born from imperfections and inadvertent mistakes. They can then become discoveries that we can use to ends that perhaps we did not seek, but that we find in ourselves and our creations, from which subsequently we can build new creations.
This is as true of teaching as it is of fine art. We can learn and refine harder techniques—how to pace a lesson, how to sequence activities, and so on—but how we teach, perhaps more importantly, is an ever-unfolding result of how we react to and use the “imperfections” in our enactment of those techniques, the ways that we adapt as we learn them. We make (often unintentional) variations into something of our own, in constant and never-ending conversations with what we do and what we learn from what we have done. More than anything, these idiosyncratic habits, these deviations from a described method, lift our technologies from realms of the predictable to realms of the human, the situated, the individual, the complex, and—when it works—the beautiful and divine. Without soft technique, there could be no art, no meaning, no communication, no creativity, no progress beyond that of the slow march of evolution. Without it, we would not be recognizably human, and there would be no meaning. We will return to this notion in more detail in Chapter 8.
Baby Bear’s Bed
Hardness brings efficiency, ease of use, scalability, and freedom from error, whereas softness supports creativity, flexibility, and diversity. Both are necessary for different reasons, and virtually all of our technologies are a rich and complex mix of soft and hard, with few at either extreme. What matters is not whether a given technology is hard or soft but whether it is sufficiently hard or soft for the case in question and at the boundaries that we choose to consider.
This is true as much of pedagogies as it is of other technologies. Harder pedagogies—more prescriptive ones—can be extremely useful. Prescriptive scripts, for instance, can be helpful to temporary teachers or beginners, providing a recipe or pattern that can be followed until they are sufficiently adept at designing the process themselves. Pedagogies that are hard for learners (dictating methods and processes that must be followed) can be the most effective ways of learning some hard skills. Spaced practice, interleaving (which implies at least some spacing), and other hard, repetitive approaches to learning, for example, can provide essential foundations for further learning. Moreover, at least some degree of hardness is usually a good idea when encountering something truly novel: offering choices makes no sense unless the person making those choices has enough knowledge and skill to make them (Garrison & Baynton, 1987), so it is worthwhile for beginners to delegate control of the process to someone or something else until they have those skills, as long as they are free to regain control at will. Softer pedagogies, those that provide only rough guidance and principles, whether they are soft for teachers or learners or both, better allow for adaptation to learners’ needs and creative and divergent approaches. They provide learners a sense of being in control, which can greatly aid motivation (Ryan & Deci, 2017). Yet we should always remember that soft is hard: the more freedoms are required (not just allowed), the more effort and thought are needed to choose what to do. Choosing which parts to harden and which parts to soften is one of the key activities of teaching.
A blend of hard and soft is almost always not just desirable but also necessary. For example, gamification (in the proper sense of the application of game-inspired approaches to learning, not the pointsification that mars too much of the genre) often involves some hard processes, from the point of view of the learner, including the requirements to follow rules, to aim for fairly rigid and unambiguous goals, and to submit to a great deal of teacher/designer control of the process. However, done right, it offers great softness in places that traditional education makes hard. In particular, effective gamification almost always makes a virtue of failure, allowing learners to try and try again until they succeed, to experiment with different approaches each time around, and to develop hard techniques through repeated practice in multiple, variegated contexts that allow them to build competence at a pace that suits them, without fear of judgment.
The only time that a technology is too hard is when it prevents us from doing what we want to do, when it curtails our freedom at points that matter to us. Prescriptive, dominative technologies are often bad for this reason: they limit our capacity to act as independent, creative human beings. Sometimes this can be insidious, such as the use of leading defaults, or the appearance of control afforded by algorithmically filtered search results, a problem to which we must remain alert.
The only times that a technology is too soft are when it makes things too complex, difficult, slow, inefficient, or prone to error. This can be equally constraining and at least as harmful as too much hardness. Many learners feel disempowered when faced with choices that they have insufficient knowledge or skill to make, to the point that they might give up or be put off learning something for life.
In any given context, for a particular individual or group, from a particular perspective, at a particular scale, there will be a perfect sweet spot. Like Baby Bear’s bed in the story of “Goldilocks and the Three Bears,” the sweet spot is not too hard, not too soft; it is just right. What is just right will always vary according to context, purpose, and individual needs or wishes. This is the essence of why teaching, of necessity, is a soft technology. It is about building the right assembly, with an effective orchestration of the correct phenomena in order to create a learning experience not too hard, not too soft, but just right.
Summary
This chapter has wound around some of the complexities of our intimate participatory relationships with technologies and the relative merits of performing our own orchestration or allowing parts of our technologies to be orchestrated for us. Almost all technologies are somewhere on a spectrum between soft and hard. Table 2 shows dichotomies that characterize some of the commonly seen features of each, any or all of which can be found in a technology as it is instantiated.
However, points of view and the overall orchestrated assembly around which the boundaries should be set can deeply affect how we view their pliability, so this should be taken only as a rough guide for identifying relative softness or hardness, a means to establish rules of thumb rather than hard and fast laws. These are not definitional features so much as aspects that help to identify technologies as more part of one family than the other, in accordance with Wittgenstein’s (2001) use of the term “family resemblance” (Familienähnlichkeit).
Soft pattern | Hard pattern
---|---
Aggregation | Replacement |
Signposts | Fenceposts |
Freedom | Constraint |
Flexibility | Efficiency |
Bricolage | Engineering |
Networks | Hierarchies |
Open | Closed |
Creators | Users |
Distributed | Monolithic |
Dialogue | Structure |
Complex | Complicated |
Searching | Filtering |
Pliable | Reliable |
Irregular | Regular |
The hard–soft spectrum allows us to view our learning technologies, from pedagogies to international education systems, in a different light. First, it emphasizes the role of the human participant, too often ignored. Second, it allows us to define more clearly a technology in any given instance as something highly situated, rather than as a generic label, where the boundaries that matter extend beyond the most obvious components and tools to include all aspects of the assembly. And third, it makes it easier to understand education as a highly distributed (not just decentralized), collective endeavour—as a gestalt that is emergent and perhaps greater than, but certainly different from, the sum of its parts—that teaches us and of which we all are parts. With this in mind, in the second part of the book, I use the participatory model to help examine and explain why this matters to how we learn, and I extend it to explore how we are not just participants but also, in any learning context, co-participants in the construction of technologies.