A Message From Academic Affairs
Why Don’t They Learn More?
Last summer at an assessment conference, Dr. Kurt Fischer, director of Harvard’s Center for Mind, Brain, and Education, discussed the challenges of higher-level learning under the title “Constructing Robust Knowledge: Learning Is Harder than We Think.” The last part of his title was more than the rueful lament we have all felt at the end of the term, when we realize that some students just didn’t seem to get it. What stood out for me were Fischer’s case studies, which provided insights into the challenge of fostering higher-level learning in our students.
I’ll share his simplest example; you can judge its applicability to your own field and your own classes. Fischer and Lee Zheng studied student learning in a very popular introductory undergraduate statistics course, one so renowned on the Harvard campus that students took it as an elective. The student evaluations were glowing, and faculty sat in to learn from a gifted teacher at work. It is the kind of course students remember fondly years later.
How well did students learn in this fabled course? Because the coursework contained well-defined problems and students’ progress could be tracked by computer, the class allowed Fischer and Zheng to examine carefully how students learned. By the course’s end, the researchers had a detailed semester’s worth of data about how and how well the students had learned statistics, what they knew and what they didn’t, and the process by which they had learned—or not learned.
Fischer and Zheng’s data showed an intriguing pattern. About 10 percent of students, a group they termed the “Stable Experts,” started the class with a high level of skill in statistics and finished the course with high ability. Close to 50 percent of students showed evidence of learning, demonstrating clear progress toward mastery of statistics. Surprisingly, about 30 percent of students did worse, and about 15 percent showed “chaotic” change; that is, they learned little, and their successes and failures demonstrated no discernible pattern. Viewed pessimistically, the data suggest that almost half the class showed no gains.
We don’t want to generalize glibly from this one class, or from a study whose methodology may be inappropriate for the varied kinds of teaching we do. Maybe some of the “chaotic” students were in a period of creative confusion and would “get it” later. Maybe these students had trouble expressing ideas mathematically but still gained an understanding of the value and limits of statistics in creating knowledge. Even so, Fischer and Zheng’s work reminds us how difficult complex learning can be.
We may take heart in knowing that even when we do our very best, some students will fall short. Should we conclude that we’re grading too easily? If half a class is not competent with the course material, surely there ought to be more C’s (I won’t risk suggesting lower grades!). Or, alternatively, maybe we ought to be a soft touch, since in a given term some students may simply not be ready to master our course material.
Are there implications for general-education courses? While we may want to hold senior majors to our most rigorous standards, what about expectations for an LP or LC course? What do we want students to take away from our course, given that some will not master its content? What assignments and activities will make it more likely that students make the gains we hope for?
Fischer and Zheng’s study should not lull us into complacency. At its best, systematic assessment of student learning, in addition to grading, can help us understand where we are doing well and where we can do better. Such work helps us see with greater precision the areas we want to address as we try to improve in our important work of teaching.
If you want a copy of the PowerPoint slides of Fischer's talk, please write to me. The slides provide summaries of some of the other studies and set forth some ways that Fischer and his colleagues are attempting to understand the neurological bases of learning.
– Michael Nolan, Associate Dean of Assessment and Grants