I often worked with colleagues in schools to outline a “portrait of a graduate.” The goal of this exercise is to lay out what students should know and be able to do by the time they graduate. Ideally, schools then work backwards from these outcomes to map the curriculum and shape teaching and learning initiatives.
A lot has changed since I engaged in this exercise. The concept of students as lifelong learners is no longer just a concept; it’s a reality. Technology’s impact on the job market, eliminating jobs that high school and college graduates once took for granted, means the notion of a student as someone who engages in formal learning in traditional institutions (colleges and universities) for a finite period of time is also obsolete.
Our young people are learners. In addition to subject-specific skills and concepts, our learners must be self-directed. We need to teach them how to become self-directed by giving them opportunities to practice.
Of all the ways self-directed learning is defined, I prefer Malcolm Knowles’ definition from his book, Self-Directed Learning: A Guide for Learners and Teachers (1975). He asserts that self-directed learning (p. 18)
describes the process in which individuals take the initiative, with or without the help of others, in diagnosing their learning needs, formulating learning goals, identifying human and material resources for learning, choosing and implementing appropriate learning strategies, and evaluating learning outcomes.
To achieve these goals, the authors of How Learning Works (2010) and the authors of Helping Students Learn in a Learner-Centered Environment (2008) emphasize the importance of helping students develop metacognitive skills: thinking about their thinking, knowing what they know and what they don’t know, and monitoring their own learning.
When we talk about self-directed learning, we still discuss it as an if or a when. I disagree. It’s a now.
What do you think?