“I can’t believe that this [workmanship] would be acceptable in a pacemaker. I really see why electronics fail so often.” (student)
The last unit of my High Reliability Soldering course was on surface-mount components. I have tried to gradually release to the students the responsibility for reading, understanding, and making decisions based on the specification document.
I introduced two especially tricky sections of the surface-mount chapter with an exercise designed to help students figure out what was confusing. Then we practiced using the de-confusification techniques shown below.
In case you can’t see the document, the main headings are
- Choose a purpose
- Find the confusion
- Check for mental pictures/descriptions
- Use structural clues
- Make connections to what you already know
- Ask questions/make inferences
This is a simplified version of what’s common to all the books about reading comprehension I looked at (see the Recent Reads page for my notes and reviews). I’ve boiled it down to techniques that I think will be most helpful for reading non-fiction about unfamiliar topics.
In class, we reviewed each technique, and came up with examples from our conversations that demonstrated them. Then I learned this incredibly useful piece of information: 9/10 people in the class did not know what an inference was.
Sometimes I’m really slow.
After a moment of feeling deflated, I realized that this is exciting. My students sometimes get angry and frustrated when I ask them to make predictions. If they don’t have a way to distinguish valid inferences from wild guesses, then predictions probably seem pretty futile. In September, we’ll spend some time learning to judge if an inference is valid. Our reading comprehension techniques are leading us directly toward evaluating the validity of arguments — something I’ve tried to figure out how to weave into my curriculum.
As preparation for the test, I asked them to apply one of these techniques to each confusing idea they had identified, or for any other idea in Chapter 7 that they weren’t sure about (see p. 2 of the handout, above). If they passed them in by the end of the day, I would write back and respond to their questions and comments. The students know I can’t always be trusted to give straight answers to questions, so I thought that the offer of direct answers would be a powerful incentive. Yet only one person turned in some “confusions.” Now, that person was the one who most needed it, but I was still a bit worried.
One way to look at it: this proves that my crazy grading system ensures that most people will not complete independent practice.
Another way: Two days later, everyone in the class passed the test on Chapter 7. And not bare passes either: no marks below 70, most in the 80s and 90s. The only section that I “taught” (in the lesson on identifying confusion) was the intro. Even when students did in-class exercises reading sections 7.1.1 and 7.1.2, I never did explain what they meant. Yet somehow they got excellent grades on a test full of unfamiliar concepts. They did this by finding and interpreting information in a specification document that is written in the obscure language of process control and quality assurance. They answered questions about concepts that I had never explained (e.g. heel and toe fillets, edge overhang, component cant).
While reviewing the test, I asked students to explain the thought process behind their answers. For most questions, I got not one but several examples of how to substantiate their arguments for their answers. They were also able to point out a question for which none of the choices was unambiguously correct. Because they know I can be swayed by well-substantiated arguments, most of the class contributed quotations from the specification demonstrating that this question was poorly posed.
Over the next two weeks, I inspected their soldering and discussed it with them. They fluently used the ideas in the standard to describe their soldering and the faults they found. When I asked a question, they were able to find the answer in the standard. When they were unsure of something, they approached me and pointed to a sentence in the standard, not to the entire thing. But mostly, they were focused on the ideas, not the reading. For some of them, this might be the first time they’ve experienced text as a window, rather than a wall. They used the standard to support their arguments about cost-benefit ratios, manufacturing philosophies, and planned obsolescence.
They read it. They really read it.