Could there be anything more boring than regulatory standards for quality assurance? I’m teaching a new-to-me course in the 5-week intersession on the IPC Requirements for Soldered Electrical and Electronic Assemblies.  I was mentally prepared to hate it, along with its vendor-supplied PowerPoint® presentations and multiple-choice tests.  Luckily, a few things rescued it for me:

  1. We’re not accredited by the regulating agency, so as long as I meet the outcomes, I don’t have to stick to the supplied curriculum.
  2. I’ve been looking for an opportunity to do some reading comprehension instruction and boy, did I get my wish.
  3. Talking about regulations can lead to “why” questions about how industrial automation affects the small-town and rural region where we live, ethics, and craftsmanship.  In order to have those conversations we needed to understand, use, and critique the lingo.

I started with these assumptions:

  • My students are able to decode, respond to, and draw conclusions from the everyday text of adult life (newspaper articles, emails)
  • They can generally find the main idea in a textbook passage, but will then complain that they “don’t get it” and “can’t teach myself from books.”
  • There are a few people with print-related learning disabilities in the room, and we check in regularly to strategize about that, but so far their difficulties don’t seem different from their peers’.

So I decided to call it “technical reading,” mostly so that the students wouldn’t think I was accusing them of being illiterate.  Resentment of “book-learning,” and the classism that often goes along with it, is a sore point for a lot of tradespeople, so it required a bit of care.  Here’s what I mean by technical reading: when I’m reading a newspaper article about the recent election, I may not know who won, but I already know what an election is.  In a textbook, I am asking students to think about new concepts, in addition to new ways of connecting old concepts.  Mortimer Adler’s How to Read a Book calls this “analytical reading” and distinguishes its strategies from those of basic reading.

By accident, I recently learned from my most reluctant readers that this semester’s lab book is “way easier to understand,” even though it’s no different in style from last semester’s.  I suspect it’s because I’ve stopped assigning “Lab 31” and started assigning “Predict the effects of AC and DC on a transformer.  Build a circuit to test your predictions.”  Having a purpose apparently made the reading seem both easier and better-written.

I started with vocabulary.  I know, yuck.  But when I read through the first two quizzes (an overview of the main ideas), I counted fifty terms that my students would likely not recognize, or whose meaning in this context they wouldn’t know.

So I tried my hand at designing a “reading comprehension constructor.”  Cris Tovani writes in Do I Really Have to Teach Reading? about how to design these: they are scaffolds for particular comprehension strategies.  I decided to show a variety of strategies and ask the students to choose the one or two that they found most helpful.

At our first class meeting, after introducing the basic idea of the course, I handed this out.  I explained that each row is a main idea, and that each column is a different strategy for understanding and remembering.  I talked through my thinking by filling in the top line, and told the students that they were free to use any one strategy or more than one.  Then I asked them to fill in as many as they could, and put their count at the top of the sheet.

The goal here was to put their brains on alert about terms that are important, and to set them up for a win when their count goes up at the end of the day.  I arranged the terms in the order that we would encounter them.

Then, we played vocabulary bingo (“High Reliability Soldering Bingo”).  Students marked their bingo cards when I mentioned one of the technical terms (they’re the same ones from the handout above, but arranged alphabetically).  The boxes had to be marked with either a definition or the section of the standard that introduced the term.  To win, you had to explain the meanings of 5 terms in a row.  The prize was a 10-minute break for the class, to be used at the winner’s discretion (because the intersession is compressed, we have a full day of class in two 3-hour blocks).

I spent about 90 min in the morning introducing these ideas, and another hour in the afternoon (I know, deadly.  Lots more to improve for next year… in hindsight, I should have been taking regular breaks for people to update their Key Concepts handout).  Lots of me talking, with occasional questions, short whole-group discussions, and videos.  We had our regular breaks, and two extra breaks on account of people winning bingo.

At the end of the presentation, with 90 minutes left in our afternoon block, I asked the group to return to the Key Concepts handout, update the information they had written that morning, and fill in any gaps.  This took most people another 45–60 minutes (there are 50 terms, remember).  There are some blank spaces at the end for any terms a student wants to remind themselves of.  They handed them in; I read them and wrote back.  These became our custom “dictionaries” for the rest of the course.

How I Assessed Their Comprehension

  • Written process control plans

Ungraded.  Every one of them was usable.

  • Individual conversations about their soldering and inspection

Graded.  All students have assessed their soldering at least once so far, using their process control plans as a rubric.

  • Multiple Choice Tests

Graded.  88% of scores were 70 or better, across the 5 tests.

  • Debates about interpretations of multiple choice questions

Ungraded, obviously, but fascinating, and a great source of clues about their comprehension.  The conversations we’ve had after quizzes have shown a remarkable degree of finesse.  The “why” questions I mentioned at the top came out in spades (environmental legislation, social consequences of industrial automation, economics, international relations, ethics, craftsmanship, etc.).  Students clearly related these to supporting evidence in the spec.

Did It Work?

Overall, I think it was helpful.  I’ve gotten questions about individual terms, but no generalized “I don’t get any of this” frustration.  Questions about the meanings of words generally came up in private conversations, and we looked back at their “dictionary” together to find clues or fill in blanks.  The group is using this terminology fluently and arguing about subtle interpretive points, which I think is pretty impressive considering how recently they learned it — not to mention the density of the text (see example at the top!).