Given how much I read, I wish I were more like my friend Rob, who has the uncanny ability to remember just about everything he reads, down to where it’s located on the paper or digital page. Unfortunately for me, that’s not the case. It takes much longer for the patterns and themes to congeal when the texts in my head are shadowy fragments of what they were in the real world. The good news (for me, at least) is that every now and again I recognize the patterns, though it’s a process that sometimes feels like completing a puzzle at the molecular level.
One such theme that seems to be emerging for me is that for all of our hand-wringing over the skills our kids will need to thrive in the future, we may be missing the most important yet eternal skill of all: knowing thyself.
Christian Talbot over at Basecamp used that phrase in a short post this week about the “School of the Future.” In it, he glosses Yuval Noah Harari’s book “21 Lessons for the 21st Century,” one that I’ve been telling everyone in education they need to read (with the caveat that an adult beverage should be near at hand when they do). While Christian snips a short piece from the book, I want to share a longer chunk that speaks to the idea that it will become increasingly difficult for us (and our kids) to tell the difference between who we really are and who the marketing algorithms say we are:
To succeed at such a daunting task, you will need to work very hard at getting to know your operating system better—to know what you are and what you want from life. This is, of course, the oldest advice in the book: know thyself. For thousands of years philosophers and prophets have urged people to know themselves. But this advice was never more urgent than in the twenty-first century, because unlike in the days of Laozi or Socrates, now you have serious competition. Coca-Cola, Amazon, Baidu, and the government are all racing to hack you. Not your smartphone, not your computer, and not your bank account; they are in a race to hack you and your organic operating system. You might have heard that we are living in the era of hacking computers, but that’s not even half the truth. In fact, we are living in the era of hacking humans.
The algorithms are watching you right now. They are watching where you go, what you buy, whom you meet. Soon they will monitor all your steps, all your breaths, all your heartbeats. They are relying on Big Data and machine learning to get to know you better and better. And once these algorithms know you better than you know yourself, they can control and manipulate you, and you won’t be able to do much about it. You will live in the matrix, or in The Truman Show. In the end, it’s a simple empirical matter: if the algorithms indeed understand what’s happening within you better than you understand it yourself, authority will shift to them (272).
Maintaining our Human Autonomy
Granted, this is a fairly dystopian view of the future, but it feels accurate, especially when that sentiment is echoed by others I’ve read over the years whose thinking resonates. One of those is Douglas Rushkoff, who has been in my brain ever since he wrote Media Virus back in 1994. Last week, Nomad posted an interview with Rushkoff that provided another piece to my puzzle:
Q: So you say that it’s essential that we maintain our human autonomy in this digital age. But how do we keep human sovereignty both now, and also in the future? How can we avoid becoming nothing more than slaves in the digital age, and instead use the digital world to help us?
Rushkoff: Well, I think the first step is being conscious. For people to be aware when they’re using a digital platform, and to be aware that the platform has been programmed by people and companies with very specific agendas in mind. It’s not a conspiracy theory at all. It’s just saying that the tools that we’re using were made by people who want something, whether they want us to be a subscriber or to be dependent on the tool, or to engage with other people in specific ways. The tools will encourage certain kinds of behaviour and discourage other kinds. So we need to be aware of what the tools we’re using are actually for before we decide to use them. If you look at something as simple as Facebook, you think: what is Facebook, above and beyond what I am using it for? The platform is there to extract data. Mainly consumer and political data about me.
The kinds of behaviours it is going to encourage are data-rich behaviours, behaviours that have to do with my consumer choices or my political choices, different things about my lifestyle that can be monetised as categories that matter to them. But the sorts of behaviours that get me to connect with people in the real world, the sorts of behaviours that push me offline, that encourage me to keep certain things about myself private or limited to my intimate friends; those will be discouraged, because they don’t fit the business plan of the platform. So I think that’s really the main thing. If people understand what the technologies they’re using are for, then they’ll be less likely to be used by them. They are more likely to choose to use them for purposes that are aligned with the purpose of the platform, rather than trying to get love, or satisfaction, or a sense of self, or genuine social connections through technologies that aren’t built to do that.
The thread here, for me at least, is that if we don’t have a clear understanding of who we are and how we interact with the world around us in the moment, we will lose an important battle for what it means to be human: to maintain our agency in a world that is increasingly trying to manipulate us through the technologies we use. As Rushkoff says, that’s not the fault of the technology as much as it’s our own blindness to the agendas of the technologies and to the consequences of ceding our agency to them. In other words, this is a choice we make.
That choice-making is, obviously, much of what makes us human. We can even choose to choose, or we can choose not to choose. And it’s that second part that is at the center of the puzzle for me. We seem increasingly willing to let others, people or algorithms, make choices for us. To me, that’s less about technology and more about disposition. We seem increasingly disposed to let others’ choices determine our lives, maybe because it’s easier, because it requires less bandwidth, or because we’re too scared to make the wrong choice.
Helping Students Know Themselves
Which brings me to the ongoing conversation we have on this blog about education and learning and teaching and classrooms. It leads me to wonder if we’re doing everything we can to help students be human in their own skins, to know themselves and to understand both the power they have to choose and the contexts they need to make the best choices possible. I don’t think there is any question that by stripping students of their ability to choose much of anything of consequence during their time in school, we’re leaving them more prone to let others choose for them in their out-of-school lives. By not privileging their agency (and here I mean much more than giving them “voice and choice”), we’re subverting the “human autonomy” that they will need in an increasingly less human world.
As Harari points out, “knowing thyself” isn’t a new concept. It’s been the advice of prophets and philosophers for centuries. And it’s not a new idea in education conversations, either. Another part of this puzzle for me has come from John Holt, whose work, the deeper I dive into it, has made the project of reimagining schools feel even more urgent. In the provocatively titled Instead of Education, published more than 40 years ago, Holt makes the case that schools are, if anything, a barrier to children knowing themselves:
Next to the right to life itself, the most fundamental of all human rights is the right to control our own minds and thoughts. That means, the right to decide for ourselves how we will explore the world around us, think about our own and other persons’ experiences, and find and make the meaning of our own lives. Whoever takes that right away from us, as the educators do, attacks the very center of our being and does us a most profound and lasting injury. He tells us, in effect, that we cannot be trusted even to think, that for all our lives we must depend on others to tell us the meaning of our world and our lives, and that any meaning we may make for ourselves, out of our own experience, has no value (4).
If you read Holt, especially in this book, you know that he finds little to like in formal education. For Holt, an education borders on evil in the sense that it strips a child of her inherent agency and, in turn, her inherent humanness. I think we’ve made some strides in the right direction since Holt’s time, but it’s undeniable that we still want kids to bend to our definition of what it means to be human, and part of that requires that we control the process, that we make most of their choices for them. That does not serve them, given what the world has become and is becoming.
Seymour Papert famously asked, “Does the child program the computer or does the computer program the child?” I’d tweak that a bit: “Does the school program the child, or does the child program the school?” Meaning, of course, does the child have choice and agency over what happens while in school? If Harari and Rushkoff are right, if truly thriving in whatever world evolves means being fully conscious of who we are as humans and exercising our ability to choose for ourselves based on our understanding of who we are and what’s best for us, then we should focus all of our efforts on building our students’ capacity to be fully human.