Technique 21: Take the steps
Lesson structures

Strong content knowledge is essential to excellence in teaching, but it also has a downside. Expertise carries with it what some cognitive scientists call the “curse of knowledge”:2 It's hard for experts to understand why things are hard for novices. What's obvious to us is not obvious to students.

Consider: You flash a slide showing cell structures and organelles like the following illustration on the screen and begin explaining it to your class:

An illustration shows the anatomy of an animal cell.

As you talk you hold in your mind a thousand implicit understandings: That the organelles are enlarged relative to the size of the cell in the image to make them more visible. That the colors are for the purpose of demonstration, and aren't realistic. That actual cells are three-dimensional. That animal cells feature huge variety in their shapes, and few if any animal cells are actually this shape. That the images of the organelles are supposed to represent cross-sections. That the mitochondria aren't in fact all the same size. That the organelles are always moving and changing. And on and on.

At best, your students might vaguely understand a few of these things. More likely they'd know none of them. But you wouldn't think to mention them in your explanation of the diagram because what they don't understand is hidden from you. For all intents and purposes, you and your students are looking at different diagrams.3 What's obvious to an expert is hidden to a novice, even in terms of perception.

In fact, I generated my list of what an expert understands about a cell diagram with the help of dozens of colleagues on Twitter.4 What was obvious to a biology teacher but not to students? I asked. Most people contributed one or two of these ideas. Nobody saw all of them, and there were more suggestions than I could keep track of.

Combine this with the Dunning-Kruger Effect, the tendency of unskilled individuals to overestimate how much they know, and you have “an unwitting conspiracy,” writes Greg Ashman. “The teacher thinks their students understand and the students think they understand, and they don't understand.”

There's no reliable “cure” for the curse of knowledge. Perceiving subtle hints of confusion in your students' countenances can suggest that maybe you've overlooked something, so prepare your lessons well and keep your working memory free to observe students. Experience also helps: You develop intuitions about students and their misunderstandings from years of hearing them say wonderful things like, “That can't be the mitochondria—it's not orange!” during a lab. But it will always plague us. Still, there are two things you can do in terms of teaching methods that make you more likely to instill successful understandings in students' minds and to avoid massive misunderstandings or simple failures of learning.

The first is to break up new material into steps and teach and practice them sequentially. The second is the use of annotated models or work samples instead of rubrics to help students develop an understanding of the parts of a complex task.

Perhaps awareness, too, is curative. Teachers should always be on the lookout for points they may have overlooked, points that are obvious to them but a source of confusion to students. You have listed the hard vocabulary words in the chapter and taught them, but there are some simpler words that confuse your students. You understand that two events happened a hundred years apart, but your students lack your sense of chronology and think they happened at roughly the same time. What exactly the points of confusion will turn out to be remains a mystery. That they exist is a very good bet.

The guidance fading effect

Novices learn differently from experts. This is one of the most important findings of educational psychologists and, as far as I can tell, one of the least recognized by educational theorists. Cognitive psychologist John Sweller calls the tendency of novices to benefit from more direct guidance, and of experts to get more from problem-solving situations, the guidance fading effect. “Students should initially be given lots of explicit guidance to reduce their working memory load, which assists in transferring knowledge to long-term memory,” he writes. “Once students are more knowledgeable, that guidance is unnecessary and … should be faded out and replaced by problem-solving.” This is both useful and unusual guidance. (Unusual because Sweller does not propose one inherently better methodology for every situation. He seems to have heeded Dylan Wiliam's admonition “Everything works somewhere and nothing works everywhere.”)

Open-ended problem-solving activities play an important role. So does direct instruction. The answer to the question of which to use is: It depends. But what it depends on most, Sweller tells us, is the degree of knowledge about the topic among students, and here it is important to recognize that outside of a university setting students are most often novices and, further, that people go back and forth between novice and more knowledgeable states frequently, even within a subject. On Friday, at the end of a well-taught and organized unit, your students may be fairly knowledgeable; then they start a new unit on Monday and move back to square one again. This would imply different teaching techniques for each day.

Consider an experience one of my own children had building a model rocket in Science class. One day my daughter announced with no small degree of excitement that they were going to be building and flying rockets in class later that week. “Great,” I said, “Are you studying air resistance? Or aerodynamics?” That sounded super-geeky, so I rephrased: “You know, the sort of things that might make a rocket fly better.”

“I'm not really sure,” she said, “We haven't yet. I think it's the introduction to a new unit.”

A few days later I asked her how the rocketry lesson went. “Great,” she said. They had made the rockets out of paper and gone out to the back fields to fly them. “Our team won!” she told me breathlessly, meaning that her group's rocket had stayed aloft the longest. “Double-cool,” I said. “What made your rocket work so well?”

“I'm not really sure,” she said. “I think maybe our wings. They looked different from other people's.”

“Oh,” I said. What had she and her teammates tried to do with them? How were they different?

Silence.

If their wings had been better, in other words, it had been a lucky guess, which was fun and memorable but not all that instructive. They hadn't been testing an idea: “Hey, since we know X, let's see if …” It wasn't an application of knowledge.

I should be clear: I liked the rocket activity. It was fun and inspiring and typical of why my daughter loves science. But it would have taught her more to do it after she was no longer a rank novice. A few lessons on the principles of air resistance followed by some small experiments with different wing shapes would have helped. Or, to capitalize on the rockets as a “hook,”5 you might fire them twice—once to make some conjectures at the beginning, and then again after students know enough to cement their knowledge and ask: “What do you notice now that you didn't know when we first fired the rockets?”

If students have knowledge, they can use it to describe, explain, and perceive why some rockets fly better than others:

“We had larger wings with more surface area,” or

“Maybe it was the size of the nose cone that created less air resistance.”

The value of hands-on learning, in other words, correlates to how much students know when they engage in it. To use hands-on activities before you've shared knowledge overlooks the differences between how experts and novices learn.

Watch the video Michael Towne: Red Dye from Stretch It (technique 17) again and notice how, as Michael drops dye into water of two different temperatures, his students apply knowledge that he has taught them. They are able to see what's happening because they have the prior knowledge to do so.

“Present new material in small steps with student practice after each step,” advises Barak Rosenshine in Principles of Instruction, his guide to teacher effectiveness. This takes time but is worth it. It makes the implicit explicit for students and uses practice to consolidate understanding before working memory is overloaded. This, cognitive psychologists suggest, is the way students learn best when they are not experts. “Teaching in small steps requires time, and the more effective teachers [in studies] spent more time presenting new material and guiding student practice than did the less effective teachers,” Rosenshine notes. “In a study of mathematics instruction, for instance, the most effective mathematics teachers spent about twenty-three minutes of a forty-minute period in lecture, demonstration, questioning, and working examples.” In contrast, less effective teachers spent only eleven minutes methodically explaining and practicing new material, Rosenshine notes.

Working memory is quickly overloaded by too much new material at once. Solution: Break down what you're doing into steps. Take the steps one by one. Let students practice each one. As they practice, ask questions of two different types: conceptual (what are we doing?) and procedural (how do we do it?), going back and forth between the two when possible.

The video Rachel Boothman: We Are Solving provides an excellent model of Rosenshine's idea of methodically explaining and practicing new material as well as the idea of going back and forth between procedural and conceptual questions.

“When we are solving it means we are finding the value of the letter. So we want x on its own and we have solved the equation,” Rachel begins, reminding her class of a key definition. Then she asks a procedural question to start: “What's my first step here?”

Brandon answers and Rachel asks a conceptual question: “What is the mathematical name for what Brandon has just done?”

Moments later she's back to procedural questions, Cold Calling Ella and then Arian to ask, “What do I do next?”

“OK, this time we are simplifying,” she says as she moves on to a new problem. Now she starts conceptually: “Who can tell me another word for simplifying?” She gets a correct answer, finding like terms, so asks Karis, “So here, where are our like terms?”

Karis has left off the negative sign and Rachel asks, “Can anyone improve on Karis's answer?” But of course there are multiple sets of like terms, so she asks Azaria for another set. After these conceptual questions (What are like terms? How can I spot them?) she goes back to procedural questions, asking Hannah to add −3x and 5x and Ellis whether he can simplify any further. When he answers correctly, she asks a conceptual question: “What do I mean when I say index? What's the index here?”
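To see the kind of thing the class is working with, imagine a hypothetical expression (not the actual problem from the video) that mixes like and unlike terms:

4x² − 3x + 5x + 7 = 4x² + (−3x + 5x) + 7 = 4x² + 2x + 7

The like terms −3x and 5x combine to 2x; the 4x² term has an index of 2, so it cannot be combined with 2x.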

Later, after students have mastered concepts, a less methodical approach might well work, but remember this caution: Our own memories of our experiences as learners sometimes work against us. We remember moments that were unforgettable to us later in our studies. They were unstructured. We were granted the opportunity to explore. Suddenly: an epiphany. Perhaps you still remember walking out of some building on campus after a class in your major with your mind exploding. Surely if you re-created this activity for your students it would be equally profound. Surely it would add rigor to your class. The challenge is that you were an expert when you explored your way to this insight. You were perceiving far more meaning in each interaction than a novice would. Your students will eventually get to a place where they, too, will derive profound benefit from the open-ended exercise. But until they know more, they are likely to get less out of the experience than you did, and adding it to your lesson won't necessarily make your class equally profound and memorable to your students.

Work samples versus rubrics

The message so far: Introduce new content in discrete steps, interleaved with practice and reflection to allow students to manage the load on working memory and gradually begin encoding ideas in long-term memory. But what about content that can't be easily broken into sequential steps? Writing an essay or painting a watercolor, say. They are still tasks that involve mastery of separate elements, but they cannot be easily broken into steps.

One solution is to share an annotated exemplar marked up by the teacher for students to study. In Embedding Formative Assessment, Dylan Wiliam and Siobhan Leahy suggest replacing the commonly used rubrics with “work samples” as tools to explain how to complete tasks successfully. This is interesting in part because rubrics, carefully crafted but relatively abstract descriptions of the characteristics of high-quality work, are full of things that are obvious to experts and hidden to novices. They “rarely have the same meaning for students that they do for teachers,” Wiliam and Leahy note. For example, a common rubric might note that a proficient essay “uses words and phrases, telling details, and sensory language to convey a vivid picture of the experiences, events, settings, and/or characters.” But what does it mean to use telling details and sensory language if you are a novice, and how do you determine their quality? An “inadequate” essay, according to the rubric, “merely tells experiences, events, settings, and/or characters.” So the difference is that a better essay uses sensory language and is vivid. Do students know what vivid language looks like? Do they know if their language is sensory? What would it mean in their minds to write in sensory language? Would it make the essay better if the author suddenly dropped in details about things he or she smelled and tasted? It might just as likely sound absurd.

Instead, Wiliam and Leahy suggest looking at high-quality examples that include explanations of how key elements were created. Such a work might point out several passages that included vivid language or sensory detail. With a series of concrete examples, a novice now might really understand what it looks like and see how to embed it in a larger work. How are sensory details effectively included so they are not just greater in quantity but better in quality?6 Essentially, you are building a mental model for students so they can imagine and understand the component parts of the end product.

The idea that the starting point for an assignment could be public work sampling—rather than abstract rubrics—is compelling. The key is to make the work samples concrete—to guide students through three or four examples of key elements, and then allow them to practice them discretely.7 This might result in several shorter exercises—studies—where students seek to master various parts, or solve smaller problems before composing a larger work. This is in fact how many of the great masters worked. Their canvases are the product of study after study in which solutions to the painting’s component problems are modeled. Figure out the light and shadow on the house in the foreground first. Then sketch the figures in the field and explore how you'll use shading. Several times, perhaps. Then put them together.

An exceptional example of this was provided to me by Nina Troiano, then an Art teacher at Troy Prep Middle School in Troy, New York. I was always stunned by the quality of student work in the hallways at the school and observed several of Nina's lessons. Typically she would build skills and knowledge progressively through a series of exercises leading up to a final piece. Here's how she broke down the process for one project, as she described it to me.

“It's a pretty academic first day,” she said. “We started by reading about the artist in the Do Now—maybe two paragraphs with questions to answer. We discussed her approach and then we looked at one of her drawings.

Cartoon illustration shows a house.

“We talked about why she did what she did and we were pulling out elements we'd use in our projects through guided discussion,” Nina said.

“I was trying to allow the kids to give their own spin on what I think was a really eerie approach to landscape.8 We talked about how the house was isolated, and that there were no people. But also that there were strange elements like a ladder up to a window that made you realize there might have been activity at one point. It made you realize there were no people but there should be. We talked about how the houses and trees were stylized and the houses were drawn in two-point perspective, but a distorted imperfect version of that.”

Here Nina was essentially annotating an exemplar, telling her students: Here is a model of a possible finished work. Here are some of the key elements that make it powerful. Here are words to describe those things.

Next Nina's kids practiced, a lot, before they tried to produce a final drawing. “The first day, we all draw together. We start with a house. We go step by step and break the house apart into shapes and angles and lines and what each line is doing. For example, to make the house look as if it's receding in space, the front edge would be longer and then gradually narrow toward the back. So I draw it and we copy it together. After they've done that, it starts to make sense. Then I give them half-complete houses that they have to complete. It becomes more independent. Toward the end they draw a full house. Then we start adding horizon lines and landscape details.” By the end of the first day Nina's students have done multiple sketches and drawings of the houses that they may put in their final project.

On day two they practiced the elements of landscape, especially the trees, in the same way. And on subsequent days they got more practice. “On day two for the Do Now we start with what we did previously. I had a half-drawn house for them to complete. On day three I gave them a house and they had to add the landscape with stylized trees,” Nina said.

After a few days of practicing sketching the elements, students did a final draft. “I tell them it's like a test. They have to show that they understand the points that we've hit. There are composition requirements: one house; at least four stylized trees, all of the trees with cast shadows, all of them pointing in the correct direction, aligned with the sun; a mysterious element that suggests missing people. Then they could think about color. They draw a first draft with crayons and colored pencils. They use that to guide their final copy. But before that, we do demos with oil pastels. How to get a vibrant color. How to blend.”

You can see in the story how Nina builds up knowledge piece by piece, helping students not only to understand how to successfully create a larger complex work but also to develop skills and knowledge they can apply in future paintings and drawings. In part, this is about managing students' working memory. By allowing them to apply all of their conscious thought in a sustained way to a smaller task, she helps them make each element special and the overall result memorable. Certainly the resulting works were full of creativity and self-expression, but there's also such immense respect for the knowledge of craft implicit in being able to create and express. Did students enjoy it? Overwhelmingly. There was such a sense of competence and confidence. They knew how to create. And of course the degree to which students can bring their creative vision to reality is a result of the knowledge and skill they have built.

Perhaps my perception is wrong, but I don't often see an approach like this used in art classrooms. Students are most often assigned a painting and they have a go at it all at once.

In her outstanding book Making Good Progress, Daisy Christodoulou makes a fascinating point. If teachers don't break down complex tasks into component parts, the alternative is to practice summative tasks as whole exercises. Sort of like doing whole landscape paintings over and over, except that in academic subjects the tasks often become very focused on tested outcomes. That is, teachers assign, over and over, tasks that replicate what students will be assessed on in the end. “Assessment for learning [what teachers in the UK might call data-driven instruction] became excessively focused on exam tasks not just because of the pressures of accountability,” Christodoulou writes, “but because the dominant theory of how we acquire skill suggested it was the best thing to do.” We blame teaching to the test on the tests, when part of the problem is our own conception of teaching. Lessons, she says, should look very different from the final skill they are hoping to instill. The goal is to make what's intuitive to experts legible to novices so they can master it.