Previously, in data analysis sessions since September:
Students were having trouble drawing any conclusions, noticing any patterns, or thinking about cause at all when they broke into groups to analyze data the class had generated. There was never enough time, it had always been too long since the measurements were taken, and they had too little background knowledge. They floundered and fussed, getting increasingly annoyed and disoriented, while I tried to make them think by sheer force of will, running around steering them away from insignificant details (like “all the voltages are even numbers”).
Lesson #1: Procedural Fluency
This semester started off the same. I asked them to characterise the I vs. V response of a lightbulb and of an LED, and to look for similarities and differences. As I wrote previously, most of them were up to their gills just trying to wrestle their measurement equipment into submission. They finally completed their measurements, but without any awareness of what was going on, what that graph meant, etc.
At my wits’ end, I had them do it again with two other models of diode. To me, this felt almost punitive — like handing someone an identical worksheet and telling them to start over. To try to make it a bit more palatable, I seized on their frustration about how “Mylène’s labs are so LONG” and told them we weren’t going to cover anything new — we were just going to do some speed practice, so I could show them some techniques for increasing speed without sacrificing accuracy.
I helped them strategize about how to set up a table for measurements (they were writing their measurements out in paragraphs… yikes). I also got much more directive than usual, and informed them that everyone was required to use two meters simultaneously (many were using a single meter, switching back and forth between measuring voltage and current… with the attendant need to unhook the circuit TWICE for every data point!!). There was big buy-in for this, as they immediately saw that they were going to get an entire data set in a single class. I saved a few minutes at the end of class for students to share their own time-saving ideas with their classmates.
What I didn’t realize was that they had internalized so little information about diodes that blue LEDs seemed like a whole different project than red LEDs. I was worried they would mutiny about being forced to redo something they’d already finished, but I was wrong. They welcomed, with relief, the opportunity to do something that was recognizable, with a format and a set of instructions that they had already worked the kinks out of. Moral of the story: it’s the background knowledge, stupid. (I can hear Jason Buell’s voice in my head all the time now.)
Lesson #2: Distributed Practice
I also realized that asking this group to sit down with some data and analyze the patterns in an hour is not going to happen. I figured it was mostly about having enough time (and not feeling pressured), so I started requiring them to keep track of “what did you notice? what did you wonder?” while they were measuring. After they were done measuring, I also required them to write some notes to themselves: explanations of anything in the lab that supported the model, and questions about anything that wasn’t supported by the model or that seemed weird (“When you find something funny, measure the amount of funny.” [Bob Pease of National Semiconductor, probably apocryphal]).
That meant they could take their time, tease out their thoughts, and write down whatever they noticed. When it was time to sit down for the data analysis session, they had already spent some time thinking about what was significant in their measurements. They had also documented it.
Lesson #3: Expect them to represent their own data
In the past, I’ve made a full record of the class’s data and given a copy to every student. My intention was that they would comb through the evidence in a small group — maybe splitting up the topics (“you look at all the red LEDs — do they all turn on at 1.7? I’ll check the blue ones”) — and everyone would be able to engage with the conversation, no matter whose data we were discussing. My other intention was that they would take better notes if they knew other students would read them. It worked last year … but this year I got extremely tidy notes, written out painstakingly slowly so the writing was legible… with measurements buried in paragraphs.
Last week, I asked everyone to get into small groups with people who were not their lab partner. They were not required to analyze the whole class’s data — only the data of the people in the small group, who would be expected to explain it to the others.
The students loved it because they were analyzing 4 data sets, not 9. So they were happy. I was happy too, because, from out of nowhere, the room exploded in a fury of scientific discourse. “Oh? I got a different number. How did you measure it?” “Does everybody have…?” “Will it always be…?” “Why wouldn’t it…?” “That’s what we’d expect from the model, because…“
I was floored. Since I didn’t have to run around putting out fires, I found my brain magically tuned in to their conversations — I filled an entire 8.5×11 sheet full of skillful argumentation and evidence-based reasoning that I overheard. Honestly, I didn’t hear a single teleological, unscientific, or stubbornly antagonistic comment. Most days I can’t do this at all — I’m too overwhelmed to hear anything but a buzzing cacophony, and they’re too tense to keep talking when I get close. They didn’t even stop talking when I wandered near their desks — they were all getting their foot in the door, making sure their data made the final cut.
It slowed down a bit when I reminded them that they had to have at least one possible physical cause for anything they proposed (i.e. “the materials and design of the diode cause it to not conduct backwards” is not a cause). But they picked it back up, with awesome ideas like
- Maybe the diode acts like a capacitor — it stores up a certain amount of energy
- Maybe the diode only takes whatever energy it needs to light up, and then it doesn’t take any more
- Maybe the lightbulb’s resistance went up because it’s a very narrow filament, but it has low resistance. So when all the current rushes in, there’s no room for more electrons, and that restricts current.
- Maybe a diode has a break inside, and it takes a certain amount of voltage to push the electrons through the gap. It’s like shooting electrons out of a cannon — they need a certain force to make it over a ravine.
- How come electrons in a silicon crystal “bond” and make a pair? I thought they orbit around the nucleus because electrons repel each other.
- If a leaving electron creates a positive ion, wouldn’t that attract the same electron that left?
These are not canonical, of course. But they’re causes! And questions! And they have electrons!! I was so excited. The students were having fun too — I can tell because when they’re having fun, they like to make fun of me (repeating my stock phrases, pretending to draw from a deck of cards to cold call someone in the audience, etc etc.)
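For contrast, the canonical picture — the Shockley diode equation, which the students hadn’t seen yet — does reproduce both of the “weird” observations at once: essentially zero reverse current, and a sharp apparent turn-on even though the equation contains no threshold term. A minimal sketch, with component values assumed for illustration rather than fitted to any class data:

```python
import math

def diode_current(v, i_s=1e-12, n=2.0, v_t=0.0257):
    """Shockley diode equation: I = I_s * (exp(V / (n*V_T)) - 1).

    i_s: saturation current (A), n: ideality factor, v_t: thermal
    voltage at room temperature. All values are assumed for
    illustration, not measured from any particular LED.
    """
    return i_s * (math.exp(v / (n * v_t)) - 1.0)

# Reverse bias: current pinned near -I_s, effectively zero.
print(f"I(-2.0 V) = {diode_current(-2.0):.2e} A")

# Forward bias: exponential growth looks like a sudden "turn-on"
# voltage, even though no threshold appears in the equation.
for v in (0.5, 1.0, 1.5):
    print(f"I({v} V) = {diode_current(v):.3e} A")
```

The reverse current never exceeds the (picoamp-scale) saturation current, while a half-volt step in forward bias multiplies the current by several orders of magnitude — which is exactly the pair of behaviors the students flagged as unexplained by an ohmic model.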
Moral of the story
1. During measurement, you must write down what you noticed, what you wondered/didn’t know.
2. After measurement, you must write down which parts of this the model can explain (students call this “comparing the data to the model.”) This causes students to actually pull out the model and read it. Awesome.
3. Anything that can’t be explained by the model? Articulate a question about it.
4. If that’s still not working well, and I’m still getting into a battle of wills with students who say that the model doesn’t explain anything about diodes, do the same lab again. Call it speed practice.
Then, when we share data and propose new ideas to the model, they’ve already spent some time thinking about what’s weird (no reverse current in a diode), what’s predicted surprisingly well by the model (forward current in a diode) and what’s predicted surprisingly badly (current in a lightbulb). When we sit down to analyze the data, they’re generating those ideas for the second or third time, not the first.
5. Stop making copies of everyone’s data — it allows one strong and/or bossy student to do all the analyzing. Require that the whiteboards include an example from every person’s data.
6. Watch while they jump in to contribute their own data, compare results and ideas about “why,” facilitate each other’s participation, summarize each other’s contributions, challenge, discuss, and pick apart their data according to the model.
7. Realize that since I’m less overwhelmed with needing to force them to contribute constructively, I too have much more cognitive capacity left over for listening to the extremely interesting conversations.
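The lightbulb mismatch in step 4 has a simple physical story, close to the one a student was groping toward with the “narrow filament” idea: the filament’s resistance climbs steeply as it heats up, so a constant-resistance prediction overshoots the measured current more and more as voltage rises. A toy sketch, with all numbers assumed for illustration (a real tungsten filament’s hot resistance can be roughly ten times its cold resistance):

```python
# Compare a fixed-resistance (ohmic) prediction against a crude
# heated-filament model. All resistance values are assumed for
# illustration, not taken from any actual lightbulb.

R_COLD = 2.0   # filament resistance at room temperature, ohms (assumed)
R_HOT = 20.0   # resistance at full operating brightness, ohms (assumed)
V_MAX = 6.0    # supply voltage at which the filament is fully hot (assumed)

def ohmic_current(v, r=R_COLD):
    """What a constant-resistance model predicts: I = V / R."""
    return v / r

def hot_filament_current(v):
    """Crude stand-in: resistance interpolates linearly from cold
    to hot as the applied voltage (and hence temperature) rises."""
    r = R_COLD + (R_HOT - R_COLD) * min(v / V_MAX, 1.0)
    return v / r

for v in (1.0, 3.0, 6.0):
    print(f"V={v}: ohmic predicts {ohmic_current(v):.2f} A, "
          f"heated filament gives {hot_filament_current(v):.2f} A")
```

The gap between the two predictions widens with voltage, which is the same qualitative pattern as “current in a lightbulb is predicted surprisingly badly” by the constant-resistance model the class started from.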