SBG superhero

I stole this graphic from Kelly O’Shea. If you haven’t already, click through and read her whole blog.

By last winter, the second year students were pretty frustrated.  They were angry enough about the workload to go to my department head about it.  The main bone of contention seemed to be that they had to demonstrate proficiency in things in order to pass (by reassessing until their skills met the criteria), unlike in some other classes where actual proficiency was only required if you cared about getting an A.  Another frequently used argument was, “you can get the same diploma for less work at [other campus.]” Finally, they were angry that my courses were making it difficult for them to get the word “honours” printed on their diploma.  *sigh*

It was hard for me to accept, especially since I know how much that proficiency benefits them when competing for and keeping their first job.  But, it meant I wasn’t doing the Standards-Based Grading sales pitch well enough.

Anyway, no amount of evidence-based teaching methods will work if the students are mutinous.  So this year, I was looking for ways to reduce the workload, to reduce the perception that the workload is unreasonable, and to re-establish trust and respect.  Here’s what I’ve got so far.

1. When applying for reassessment, students now only have to submit one example of something they did to improve, instead of two.  This may mean doing one question from the back of the book.  I suspect this will result in more students failing their reassessments, but that in itself may open a conversation.

2. I’ve added a spot on the quiz where students can tell me whether they are submitting it for evaluation, or just for practise.  If they submit it for practise, they don’t have to submit a practise problem with their reassessment application, since the quiz itself is their practise problem.  They could always do this before, but they weren’t using it as an option and just pressuring themselves to get everything right the first time.   Writing it on the quiz seems to make it more official, and means they have a visible reminder each and every time they write a quiz.  Maybe if it’s more top-of-mind, they’ll use it more often.

3. In the past, I’ve jokingly offered “timbit points” for every time someone sees the logic in a line of thinking they don’t share.  At the end of the semester, I always bring a box of timbits in to share on the last day.  In general, I’m against bribery, superficial gamification (what’s more gamified than schooling and grades??), and extrinsic motivation, but I was bending my own rules as a way to bring some levity to the class.  But I realized I was doing it wrong.  My students don’t care about timbits; they care about points.  My usual reaction to this is tight-lipped exasperation.  But my perspective was transformed when Michael Doyle suggested a better response: deflate the currency.

So now, when someone gives a well-thought-out “wrong” answer, or sees something good in an answer they disagree with, they get “critical thinking points”.  At the end of the semester, I promised to divide them by the number of students and add them straight onto everyone’s grade, assuming they completed the requirements to pass.  I’m giving these things out by the handful.  I hope everybody gets 100.  Maybe the students will start to realize how ridiculous the whole thing is; maybe they won’t.  They and I still have a record of which skills they’ve mastered, and it’s still impossible to pass if they’re not safe or not employable. Since their grades are utterly immaterial to absolutely anything, it just doesn’t matter.  And it makes all of us feel better.

In the meantime, the effect in class has been borderline magical.  They are falling over themselves exposing their mistakes and the logic behind them, and then thanking and congratulating each other for doing it — since it’s a collective fund, every contribution benefits everybody.  I’m loving it.

4. I’ve also been sticking much more rigidly to the scheduling of when we are in the classroom and when we are in the shop.  In the past, I’ve scheduled them flexibly so that we can take advantage of whatever emerges from student work.  If we needed classroom time, we’d take it, and vice versa.  But in a context where people are already feeling overwhelmed and anxious, one more source of uncertainty is not a gift.  The new system means we are sometimes in the shop at times when they’re not ready.  I’m dealing with this by cautiously re-introducing screencasts — but with a much stronger grip on reading comprehension techniques.  I’m also making the screencast information available as a PDF document and a print document.  On top of that, I’m adopting Andy Rundquist’s “back flip” technique: screencasts are created after class in order to answer lingering questions submitted by students.  I hope that those combined ideas will address the shortcomings that I think are inherent in the “flipped classroom.”  That one warrants a separate post — coming soon.

The feedback from the students is extremely positive.  It’s early yet to know how these interventions affect learning, but so far the students just seem pleased that I’m willing to hear and respond to their concerns, and to try something different.  I’m seeing a lot of hope and goodwill, which in themselves are likely to make learning (not to mention teaching) a bit easier.  To be continued.

Last week in class, I showed some student examples of authentic, non-canonical thinking.  I asked the class to identify what they saw as good in those examples.  Here’s what they said:

“It’s honest.

It talks about electrons and energy.

It talks about physical cause.

It’s about the real world.

They noticed patterns.

They used analogies and metaphors.

They broke the ideas into parts.

They asked questions — clarifying and precision.

They were trying to find the limits of when things were true.

They said they didn’t know.

They proposed a hypothesis.

They said what seemed weird.”

At a glance, it seems people wrote more than usual that day about what they think might be happening in their circuits.  I’ll post more when I read them… but I’m hopeful.

My last post was about encouraging my students to re-evaluate what they think is certain.  I’m trying to help them break the habit of arguing from authority, and encourage them to notice their own thinking… and even to go so far as exposing that thinking to the class!  That’s going to be scary, and it depends on creating a supportive climate.

I responded to a comment on that post, in part: “I do realize that I’m pulling the rug out from under their trust in their own perception of reality, and that’s an unpleasant experience no matter what. Sometimes I think this is actually a spiritual crisis rather than a scientific one.”  To be fair, I’m careful not to suggest that their perception is invalid; only that it is important to notice the evidence that underlies it.  But that means considering the possibility that there isn’t any, or isn’t enough.  In the conversations that follow, the students talk about wondering whether certainty exists at all, and whether anything exists at all, and what knowing means in the first place.  That leads to what it means to “be right”… and then what it means to “do right.”

My best guess is that they have tangled up “right and wrong test answers” with “right and wrong moral behaviour” — being a “good person” means being a good student… usually a compliant one.

So, I’m provoking a moral, or maybe a spiritual, crisis — or maybe exposing an underlying crisis that was there all along.  What do I do about it?  How do I help students enter into that fear without being immobilized or injured by it?  They don’t know what to do when the rigid rules are removed, and I don’t know what to do when they get scared.  What do we do when we don’t know what to do?

Our classroom conversations range over ontology, epistemology, ethics, and, yes, faith. I realize I’m treading on thin ice here; if you think opening a conversation about faith and spirituality in my classroom (or on this blog) is a mistake, I hope you’ll tell me.  But I don’t know how to talk about science without also talking about why it’s not faith, to talk about truth and integrity without talking about what it means to do what’s “right”, why all of these might contribute to your life but one can’t be treated as the other.  And it’s a line of conversation that the students dig into avidly, almost desperately. Putting this stuff on the table seems to offer the best possibilities for building trust, resilience, and critical thinking.

So when the students open up about their fear and anger around what “right and wrong” can mean, I go there (with care and some trepidation).  I’m careful not to talk about particular sects or creeds — but to invite them to think about what they think of as morally right and wrong, and why models of atomic structure don’t fit into that framework.

There is occasionally some overlap though.

A historical figure I’ve learned a lot from wrote in her journal about re-evaluating an especially weighty authority…

And then he went on … “Christ saith this, and the apostles say this; but what canst thou say?” …  This opened me so, that it cut me to the heart; and then I saw clearly we were all wrong. So I sat down … and cried bitterly… “We are all thieves; we are all thieves; we have taken the [ideas] in words, and know nothing of them in ourselves.”

Since this belongs to a particular faith community, I don’t bring it into the classroom.  I think about it a lot though; and it’s the spirit I hope students will bring to their re-evaluation of the high school physics they defend so dearly.

If I expect them to respect the “wrong” (bad?  EVIL??) thinking of their classmates, it’s crucially important that they feel respected.  If I want them to stop arguing from authority, I have to be meticulous about how I use mine. One technique I’m going to try tomorrow is sharing with the class some of the “cool moves” I noticed on the most recent quiz.

Despite my angst about this issue, I’m actually thrilled by the curious, authentic, and humble thinking that’s happening all over the place.  So tomorrow I’ll show some of these (anonymous) examples of non-canonical ideas and explain what I think is good about them.  I’ll especially make sure to seek out a few from the students who are the main arguers from authority.


Keep calm because I already know

Creating a classroom culture of inquiry is getting better and better every September in most ways. It’s especially working well to reassure the students with little previous physics experience, to excite the students with previous unpleasant experiences with physics, to challenge the students who found previous physics classes boring or stifling, and to empower students who’ve been marginalized by schooling in general. But one thing I’m still struggling with is responding well to the students who have been taught to uncritically regurgitate correct answers — and who’ve embraced it.

How do I get curious about their ideas?  My conflict mediation coach suggests finding out what need that meets, what they got from that experience that they’re not getting elsewhere.  I confess that I’m afraid to find out.  I’m also afraid of the effect they have on the other students.  Their dismissive insistence that other people’s theories are “wrong” can quickly undo weeks of carefully cultivating a spirit of exploring and evaluating the evidence ourselves; their pat answers to other people’s questions make it seem like it’s stupid to be curious at all.

I have a bunch of options here… one is an activity called “Thinking Like a Technician” where I introduce the idea that “believing” is different from provisionally accepting the theory best supported by the current evidence.  I show the Wikipedia page for atomic theory to draw out the idea that there are many models of the atom, that all of them are a little bit wrong, and that our job is to choose which one we need for which situations, rather than to figure out which one is right.  That seems to help a bit, and give us some points of reference to refer back to.

I show a video with Malcolm Longair and Michio Kaku explaining that atoms are made of mostly nothingness.  But I think it makes it worse.  The students who are excited get more excited; the ones who feel like I’m threatening the authority of the high school physics teachers they idolize get even angrier.  For the rest of the class, it’s wonderful — but for this subset, it’s uncomfortably close to Elicit-Confront-Resolve.  They experience it as a form of “expose-and-shame“, and unsurprisingly retaliate.  If they can’t find some idea of mine to expose and shame, they’ll turn on the other students.

Something I’m trying to improve: How do I help students re-evaluate things that seem solid?  It’s not just that they respond with defensiveness; they also tend to see the whole exercise of inquiry (or, as some people call it, “science”) as a waste of time.  What could make it worth re-examining the evidence when you’re that sure?

 

My definition of “inquiry” as an educational method: it’s the students’ job to inquire into the material, and while they do that, it’s my job to inquire into their thinking.

So yes, the goal is really “inquiry-based learning”.  I’ve written lots before about what the students do.  But this post is about what I do. I have to inquire at least as much as the students do.

I’ve written that before, more than once… but do you think I can find it on my own blog?  Nope.  Also, I stole it originally, probably from Brian Frank.  Do you think I can find it on his blog?  *sigh*  If anyone finds it, in either place, let me know, would ya?

What’s new about my ability to inquire into my students’ thinking is that I’m treating it more like a qualitative research project.  Someday I’ll go take a qualitative methods course and actually know what I’m talking about (I’m taking suggestions for methods texts, courses you liked, or profs you respect)… but until then, I’m muddling through better than usual.

Activities That Help Me Inquire Into Student Thinking

Playdough Circuit

Published by Science Buddies

We spend the first week doing things that are designed for them to play with their current ideas, and for me to find out about those ideas.  In the past I set out piles of AA batteries, light bulbs, sockets, and switches.  I’d ask students to build a circuit that worked, one that looked like it should but didn’t, and a third one of any description.  Students drew their circuit on paper and wrote down what they noticed, as well as what they wondered (props to Brian again for the wording of the prompt, which helps break down the fear induced by writing the “wrong” thing in a lab report “observation” section).  The noticing and wondering helps me learn a lot about their ideas.

This year I added a day before light bulbs where they made circuits out of playdough.  It was silly, messy, and fun.  It also yielded lots of new info about their thinking about electrons, voltage, current, charge, etc., which I asked them to record on this handout.


Record-Keeping

Whatever they write down ends up in a spreadsheet that looks like this:

2015 Intake ideas so far | Name | Date | Context | V | R | I | P | C | Energy | Potential
voltage is potential difference: amount of potential energy between points | XXXXXX | 09-Sep-15 | Squishy Circuits | x x x
Insulators stop energy from passing through | XXXXXX | 09-Sep-15 | Squishy Circuits | x
Conductors allow the transfer of energy | XXXXXX | 09-Sep-15 | Squishy Circuits | x


I just keep adding tags on the right to keep track of whatever topic I need to keep track of.  That way I can sort by topic, by date, or by student.  It also helps me see which activities yielded what kind of curiosity.
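To make the sorting concrete, here is a minimal sketch (in Python, not the author's actual spreadsheet) of the same record-keeping scheme.  The student names, dates, and especially which tags go with which idea are illustrative guesses based on the table above:

```python
# A sketch of the tagging spreadsheet described above: each recorded
# idea carries a student, date, context, and a set of topic tags,
# so the log can be sorted or filtered by topic, by date, or by student.
# The tag assignments below are guesses -- the original "x" columns
# aren't fully recoverable from the post.
from datetime import date

ideas = [
    {"idea": "voltage is potential difference: amount of potential "
             "energy between points",
     "student": "XXXXXX", "date": date(2015, 9, 9),
     "context": "Squishy Circuits", "tags": {"V", "Energy", "Potential"}},
    {"idea": "Insulators stop energy from passing through",
     "student": "XXXXXX", "date": date(2015, 9, 9),
     "context": "Squishy Circuits", "tags": {"Energy"}},
    {"idea": "Conductors allow the transfer of energy",
     "student": "XXXXXX", "date": date(2015, 9, 9),
     "context": "Squishy Circuits", "tags": {"Energy"}},
]

def by_tag(records, tag):
    """All recorded ideas touching a given topic tag."""
    return [r for r in records if tag in r["tags"]]

def by_student(records, student):
    """All recorded ideas from one student."""
    return [r for r in records if r["student"] == student]

print(len(by_tag(ideas, "Energy")))  # how many ideas mention energy so far
```

Because each record carries a free-form set of tags, adding a new topic later is just a matter of tagging new rows; nothing about the structure has to change, which mirrors "just keep adding tags on the right."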

My Favourite Ideas So Far

What holds matter together?

Are electrons what power actually is?

Batteries in a row must connect to each other like how magnets connect together to attract each other (2 negatives connected doesn’t work)

Closing the switch should double the power supply, but there was no noticeable difference. Why?

When negative side of battery reaches positive side of other battery, shouldn’t it be a complete circuit?

Put the switch on the other side of the bulb.  Does it matter?

Why did the 2 dim lights light at all, when the path of least resistance was through the 1 light bulb path?  In my “double the wires” circuit, they didn’t light at all.

Why don’t any of the bulbs turn on?  I would have thought that at least the first bulb would faintly glow.

Resistance is how much current is lost in the current

What separates Watts from Volts?

If I Inquire Into My Own Thinking…

What’s the pattern here about which ideas are exciting to me?  Well, quite a few of them are challenges to common misconceptions.  Despite my resistance, it seems I’ve still got a bit of a case of misconception listening.

The other pattern is that they all point either to questioning cause, or improving precision.  Those are discipline-specific skills, part of the “implicit curriculum” that people in my field often think of as unlearnable “aptitudes” instead of skills.  So there’s a practise of inclusion underlying my choices — making these skills explicit benefits everyone but especially the people with little previous exposure to technical fields.  Cause and precision are also things that I personally find satisfying and beautiful.  No coincidence about the overlap — I chose my field for a reason.  I’ll have to be careful to encourage curiosity wherever I find it, not just in the students who ask the kinds of questions I like best.

Last week, I presented a 90-minute workshop on Assessment Survival Skills during a week-long course on Assessing and Evaluating.  Nineteen people attended the workshop.  Sixteen were from the School of Trades and Technology (or related fields in other institutions).  There were lively small-group discussions about applying the techniques we discussed.

Main Ideas

  1. Awesome lesson plans can help students learn, but so can merely decent lesson plans given by well-rested, patient teachers
  2. If grading takes too long, try techniques where students correct the mistakes or write feedback to themselves
  3. If they don’t use feedback that you provide, teach students to write feedback, for themselves or each other
  4. If students have trouble working independently in shops/labs, try demonstrating the skill live, creating partially filled note-taking sheets, or using an inspection rubric
  5. If you need more or better activities and assignments quickly, try techniques where students choose, modify, or create questions based on a reference book, test bank, etc.
  6. If students are not fully following instructions, try handing out a completed sample assignment, demonstrating the skill in person, inspection reports, or correction assignments

When I asked for more techniques, the idea of challenging students to create questions that “stump the teacher” or “stump your classmates” came up twice.  Another suggestion was having students get feedback from employers and industry representatives.

Participants’ Questions

At the beginning of the workshop, participants identified these issues as most pressing.

(scanned list of participants’ questions)

Based on that, I focused mostly on helping students do their own corrections/feedback (#3), and how to generate practice problems quickly (#5).  Interestingly, those were the two ideas least likely to receive a rating of 5/5 on the feedback sheets — but the most often reported as “new ideas”.  I think I did the right thing by skipping the techniques for helping students follow instructions (#6), since that was the idea people were most likely to describe as one they “already use regularly.” Luckily, the techniques I focused on are very similar to the techniques for addressing all the concerns, except for a few very particular techniques about reducing student dependence on the instructor in the shop/lab (#4), which I discussed separately.  I received complete feedback sheets from 18 participants, and 16 of them identified at least one idea as both new and useful, so I’ll take that as a win.  Also, I got invited to Tanzania!

Participants talked a lot about what it’s like to have students who all have different skills, abilities, and levels of experience.  Another hot topic was how to deal with large amounts of fairly dry theory.  We talked a lot about techniques that help students assess their skills and choose what content they need to work on, so that students at all levels can challenge and scaffold themselves.  We also talked about helping students explore and choose what format they want to use to do that, as a way of increasing engagement with otherwise dry material.  I didn’t use the term, but I was curious to find out in what ways Universal Design for Learning might be the answer to questions and frustrations that instructors already have.  If I ever get the chance, as many participants requested, to expand the workshop, I think that’s the natural next step.

Feedback About the Workshop

Overall feedback was mostly positive. Examples (and numbers of respondents reporting something similar):

“Should be a required course”

“I liked the way you polled the class to find out what points to focus on,” “tailored,” “customized” (4)

“Well structured,” “Interactive” (7)

“Should be longer” (11)

“Most useful hour and a half so far” (4)

Feedback About Handout

“If someone tries to take this from me, there’s gonna be a fight!”

Feedback About Me

“Trade related information I can relate to” (4)

“High energy,” “fun,” “engaging,” “interesting” (5)

“You were yourself, didn’t feel scripted,” “Loved your style,” “Passionate” (3)

“That’s the tradesperson coming out!”

This morning, I’m presenting a workshop on Assessment and Evaluation Survival Skills.  The themes are

  • How to help students learn
  • While giving fair and accurate grades
  • Without losing your mind.

 

Stay tuned for an update on the questions and techniques that emerge.

Participant handout (DOC, PDF)

Feedback form template (DOC, PDF)

 

 

Painting of woman looking into hand mirror with egg on her face

The latest post at Educating Grace is about breaking down classroom cultures of “fishing for or steering people towards the right answer, treating wrong answers as dangerous, only valuing people who give right answers.”  My comment got so long that Blogger wouldn’t accept it — so I’m posting here instead.

Grace starts by posting a short video clip of a PD session with math teachers, focusing on a moment where a math teacher tries to come up with a non-standard algorithm but ends up getting the wrong answer.  You should go watch it now.

I actually found it hard to watch.  I felt uncomfortable with the responding teacher’s growing embarrassment, as well as with the vocal performance of her embarrassment.  In the moment, I interpreted the stage-whispers she shared with a seat-mate as a way of letting the rest of the room know that she knew, of course she knew why it was wrong.  Which goes back to Grace’s point — we are in a culture where it would be shameful not to know, where mistakes require some gesture of face-saving.  If it was uncomfortable for me to watch, it makes me think of how squirmy it must make students…

1.  I wanted to spend less time unpacking the idea when it was first mentioned, not more. Maybe because she loses face more for every minute she continues to make the mistake?  But maybe also because asking someone to repeat their point is (in the generic classroom of my imagination) often a cue that the teacher wants you to say something else.

So I was imagining myself writing down her process as soon as she said it, and collecting more.  Sometimes I have had success undermining the cult of correctness by putting some distance between the speaker and the strategy.  After there are 5 or 6 strategies on the board, especially if they don’t all match, I can go back and ask students to think about the pros and cons of each one.

2. Another strategy that sometimes helps me is getting people to pool their answers in small groups, and report back as a group.  This doesn’t solve the problem — they will still tend to correct each other, argue, and be mortified if their solution is different from their group-mates’ — but it means the loss of face happens in front of fewer people, where it might be more manageable.

Sometimes I explicitly help students practise recording all the strategies from their group and reporting all of them — I encourage them to discuss the differences without trying to convince the others.  Their default strategy for listening is often “decide whether it’s right or wrong”, so just telling them to stop doing that doesn’t work as well as giving them something else to do: “try to figure out why a reasonable person might think that.”

3. Another thing I don’t do as often as I would like is asking people to record all the strategies that they think work, and then strategies that look plausible but don’t work.  Recording *all* of them on the board, asking people not to say which ones are which, helps break down the assumption that all things written on boards are automatically true.  This is somewhat inspired by Kelly O’Shea’s “Mistake Game”.

When we are looking through strategies that don’t work, I go back to the class with “why might a reasonable person think this.”  With teachers, we might be able to deflect attention away from ourselves by asking, “This is a tempting strategy that a student could easily use.  Why might a student think this?  If they did, what could help them sort it out?”

A related approach is, “what question is this the right answer to?”  Where one commenter on the original post found something good about the strategy’s algebra, I’m finding something good about the strategy’s heuristics.  I’m not thinking about what you should actually multiply the denominator by.  I’m thinking, “that would be a good strategy if we were maintaining the same speed and trying to figure out how far we got in 1.5 hours (27 miles).” In this question we’re maintaining the same distance, and asking how fast, not how far…. but it’s still an example of using the previous problem to solve a new one.

In the debrief, I found myself wanting to talk about how easy it is to answer a different question than we meant to, especially if we’re trying to do things in a new way.  This must happen to students all the time — they likely have some experience with speed and distance, and some comfortable ways of thinking about them.  We’re asking them to think about familiar things in unfamiliar ways, and that’s going to be disorienting.  It points to the idea that we have to be careful in our assessments — just because someone gives that kind of answer, it doesn’t mean they don’t understand speed and distance.  In fact, it might be an indicator of a new layer of connectedness in our thinking — similar to what Brian Frank refers to as “U-shaped development.”

The fact that the responding teacher was deliberately trying to come up with a non-standard algorithm shows intellectual courage and autonomy, traits I want to encourage in my students.  What helped her develop that courage?  How could we help our students develop it?  I’d be curious to hear the answers from the teachers in the PD session.

XKCD Comic: The erratic feedback from a randomly-varying wireless signal can make you crazy.

I’m thinking about how to make assessments even lower stakes, especially quizzes.  Currently, any quiz can be re-attempted at any point in the semester, with no penalty in marks.  For a student re-attempting a quiz, I require a corrected copy of the original attempt and two practise problems in order to apply for reassessment.  (FYI, mastery can also be demonstrated in any alternate format in lieu of a quiz, but students rarely choose that option.)

The upside of requiring practise problems is eliminating the brute-force approach where students just keep randomly trying quizzes thinking they will eventually show mastery (this doesn’t work, but it wastes a lot of time).  It also introduces some self-assessment into the process.  We practise how to write good-quality feedback, including trying to figure out what caused them to make the mistake.

The downside is that the workload in our program is really unreasonable (dear employers of electronics technicians, if you are reading this, most  hard-working beginners cannot go from zero to meeting your standards in two years.  Please contact me to discuss).  So, students are really upset about having to do two practise problems.  I try to sell it as “customized homework” — since I no longer assign homework practise problems, they are effectively exempting themselves from any part of the “homework” in areas where they have already demonstrated proficiency.  The students don’t buy it though.  They put huge pressure on themselves to get things right the first time, so they won’t have to do any practise.  That, of course, sours our classroom culture and makes it harder for them to think well.

I’m considering a couple of options.  One is, when they write a quiz, to ask them whether they are submitting it to be evaluated or just for feedback.  Again, it promotes self-assessment: am I ready?  Am I confident?  Is this what mastery looks and feels like?

If they’re submitting for feedback, I won’t enter it into the gradebook, and they don’t have to submit practise problems when they try it next (but if they didn’t succeed that time, it would be back to practising).

Another option is simply to chuck the practise problem requirement.  I could ask for a corrected quiz and good quality diagnostic feedback (written by themselves to themselves) instead.  It would be a shame, since the practise really does benefit them, but I’m wondering if it’s worth it.

All suggestions welcome!

I heart zero

Here are some conversations that come up every year.

1. Zero Current

Student: “I tried to measure current, but I couldn’t get a reading.”

Me: “So the display was blank?”

Student: “No, it just didn’t show anything.”

(Note: Display showed 0.00)

2. Zero Resistance

Student: “We can’t solve this problem, because an insulator has no resistance.”

Me: “So it has zero ohms?”

Student: “No, it’s too high to measure.”

3. Zero Resistance, In a Different Way

Student: “In this circuit, X = 10, but we write R = 0 because the real ohms are unknown.”

(Note: The real ohms are not unknown.  The students made capacitors out of household materials last week, so they have previously explored that the plates have approximately 0 Ω and the dielectric is considered an open.)

4. Zero Resistance Yet Another Way

Student: “I wrote zero ohms in my table for the resistance of the battery since there’s no way to measure it.”

What I Wonder

  • Are students thinking about zero as an indicator that means “error” or “you’re using the measuring tool wrong”?  A bathroom scale might show zero if you weren’t standing on it.  A gas gauge shows zero when the car isn’t running.
  • When students say “it has none,” like in example 2, what is it that there is none of? They might mean “it has no known value,” which might be true, as opposed to “it has no resistance.”
  • Is this related to a need for more concreteness?  For example, would it help if we looked up the actual resistance of common types of insulation, or measured it with a megger?  That way we’d have a number to refer to.
  • #3 really stumps me. Is this a way of using “unknown” because they’re thinking of the dielectric as an insulator that is considered “open”, so that #3 is just a special case of #2?  Or is it unknown because the plates are considered to have 0 resistance and the dielectric is considered open, so we “don’t know” the resistance because it’s both at the same time?  The particular student who said that one finds it especially hard to express his reasoning and so couldn’t elaborate when I tried to find out where he was coming from.
  • Why does this come up so often for resistance, and sometimes for current, but I can’t think of a single example for voltage?  I suspect it’s because both resistance and current feel concrete and like real phenomena that they could visualize, so they’re more able to experiment with their meanings.  I think they’re avoiding voltage altogether (first off, it’s about energy, which is weird in the first place, and then it’s a difference of energies, which makes it even less concrete because it’s not really the amount of anything — just the difference between two amounts, and then on top of that we never get to find out what the actual energies are, only the difference between them — which makes it even more abstract and hard to think about).
  • Since this comes up over and over about measurement, is it related to seeing the meter as an opaque, incomprehensible device that might just lie to you sometimes?  If so, this might be a kind of intellectual humility, acknowledging that they don’t fully understand how the meter works.  That’s still frustrating to me though, because we spend time at the beginning of the year exploring how the meter works — so they actually do have the information to explain what inside the meter could show a 0A reading.  Maybe those initial explanations about meters aren’t concrete enough — perhaps we should build one.  Sometimes students assume explanations are metaphors when actually they’re literal causes.
  • Is it related to treating automated devices in general as “too complicated for normal people to understand”?  If that’s what I’m reading into the situation, it explains why I have weirdly disproportionate irritation and frustration — I’m angry about this as a social phenomenon of elitism and disempowerment, and I assess the success of my teaching partly on the degree to which I succeed in subverting it… both of which are obviously not my students’ fault.

Other Thoughts

One possibility is that they’re actually proposing an idea similar to the database meaning of “null” — something like unknown, or undefined, or “we haven’t checked yet.”

I keep suspecting that this is about a need for more symbols.  Do we need a symbol for “we don’t know”?  It should definitely not be phi, and not the null symbol — it needs to look really different from zero.  Question mark maybe?

If students are not used to school-world tasks where the best answer is “that’s not known yet” or “that’s not measurable with our equipment”, they may be in the habit of filling in the blank.  If that’s the case, having a place-holder symbol might help.

This year, I’ve really started emphasizing the idea that zero, in a measurement, really means “too low to measure”.  I’ve also experimented with guiding them to decipher the precision of their meters by asking them to record “0.00 mA” as “< 5 uA”, or whatever is appropriate for their particular meter.  It helps them extend their conceptual fluency with rounding (since I am basically asking them to “unround”); it helps us talk about resolution; and it can help in our conversation about accuracy and error bars.  Similarly, “open” really means “resistance is too high to measure” (or, relatedly, too high to matter) — so we find out what their particular meter can measure and record it as “> X MOhms”.
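The “unrounding” idea can be sketched as a little helper function.  Everything here is illustrative: the 0.01 mA display step (and hence the 5 uA bound) is an assumption about one hypothetical meter, not a general spec.

```python
# Illustrative sketch: a meter that rounds its display to steps of
# 0.01 mA shows "0.00" for any current below half a step, i.e. below
# 0.005 mA (5 uA).  The step size is an assumption, not a real spec.

def record_reading(displayed_mA, step_mA=0.01):
    """Turn a displayed value into an honest statement about the true value."""
    half_step_uA = (step_mA / 2) * 1000  # half the last digit, in uA
    if displayed_mA == 0.0:
        # "Zero" on the display only means "too low to resolve".
        return f"< {half_step_uA:g} uA"
    # Any nonzero reading is really a range of +/- half the display step.
    return f"{displayed_mA:g} mA +/- {half_step_uA:g} uA"
```

So a display of 0.00 mA gets recorded as “< 5 uA”, and 0.12 mA as “0.12 mA +/- 5 uA”.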

The downside is that they start to want to use those numbers for something.  They have many ways of thinking about the inequality signs, and one of them is simply to make up a number that corresponds to their idea of “significantly bigger”.  For example, when solving a problem, if they’re curious about whether electrons are actually flowing through air, they may use Ohm’s law and plug in 2.5 MOhms for the resistance of air.  At first I rolled with it, because it was part of a relevant, significant, and causal line of thinking.  The trouble was that I then didn’t know how to respond when they started assuming that 2.5 MOhms was the actual resistance of air (any amount of air, incidentally…), and my suggestion that air might also be 2.0001 MOhms was met with resistance.  (Sorry, couldn’t resist.)  (Ok, I’ll stop…)
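To see concretely why the made-up 2.5 MOhms is trouble, here is a back-of-the-envelope comparison.  The 12 V supply and both resistance figures are invented for illustration; the only thing a measurement of “> 2 MOhms” actually licenses is an upper bound on the current.

```python
V = 12.0               # hypothetical supply voltage, volts
R_invented = 2.5e6     # the made-up "resistance of air", ohms
R_equally_ok = 25e6    # an equally plausible invention, ten times larger

# Ohm's law, I = V / R, happily produces an answer for either guess...
I_invented = V / R_invented      # 4.8 uA
I_equally_ok = V / R_equally_ok  # 0.48 uA

# ...and the two "answers" disagree by a factor of ten.  All that
# "> 2 MOhms" actually tells us is an upper bound on the current:
I_bound = V / 2e6  # I < 6 uA
```

The point isn’t that one guess is better than the other; it’s that an inequality in, an inequality out — the bound is the only honest result.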

I’m afraid that this is making it hard for them to troubleshoot.  Zero current, in particular, is an extremely informative number — it means the circuit is open somewhere.  That piece of information can solve your problem, if you trust that your meter is telling you a true and useful thing. But if you throw away that piece of information as nonsense, it both reduces your confidence in your measurements, and prevents you from solving the problem.
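The troubleshooting logic above can be written out as a tiny decision rule.  The readings and messages are invented for illustration; a real reading would carry the meter’s resolution bound, but the branching is the reasoning I want students to trust.

```python
# Sketch of the reasoning "zero current + live source => open circuit".
# Any finite resistance would let *some* current flow (I = V / R), so a
# trustworthy 0 A reading with voltage present points to an open.

def diagnose(source_V, measured_A):
    if source_V == 0:
        return "no source voltage; check the supply first"
    if measured_A == 0:
        return "open circuit somewhere in the loop"
    return "current is flowing; the loop is closed"
```

The zero reading isn’t nonsense to discard — it’s the branch that localizes the fault.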

Some Responses I Have Used

“Yes, your meter is showing 0.00 because there is 0.00 A of current flowing through it.”

“Don’t discriminate against zero — it isn’t nothing, it’s something important.  You’ll hurt its feelings!”

Not helpful, I admit!  If inquiry-based learning means that “students inquire into the discipline while I inquire into their thinking”*, neither of those is happening here.

Some Ideas For Next Year

  • Everyone takes apart their meter and measures the current, voltage, and resistance of things like the current-sense resistor, the fuse, the leads…
  • Insist on more consistent use of “less than 5 uA” or “greater than 2MOhms” so that we can practise reasoning with inequalities
  • “Is it possible that there is actually 0 current flowing?  Why or why not?”
  • Other ideas?

*I stole this definition of inquiry-based learning from Brian Frank, on a blog post that I have never found again… point me to the link, someone!
