
Early Warning Signs of Fascism

A local media outlet recently wrote:

“Why the constant, often blatant lying? For one thing, it functioned as a means of fully dominating subordinates, who would have to cast aside all their integrity to repeat outrageous falsehoods and would then be bound to the leader by shame and complicity. “The great analysts of truth and language in politics” — writes McGill University political philosophy professor Jacob T. Levy — including “George Orwell, Hannah Arendt, Vaclav Havel — can help us recognize this kind of lie for what it is…. Saying something obviously untrue, and making your subordinates repeat it with a straight face in their own voice, is a particularly startling display of power over them. It’s something that was endemic to totalitarianism.”

How often does this happen in our classrooms?  How often do we require students to memorize and repeat things they actually think are nonsense?  

  • “Heavy things fall at the same speed as light things.” (Sure, whatever.)
  • “An object in motion will stay in motion forever unless something stops it.” (That’s ridiculous.  Everyone knows that everything stops eventually.  Even planets’ orbits degrade.)
  • “When you burn propane, water comes out.” (Pul-lease.)
  • The answer to “in January of the year 2000, I was one more than eleven times as old as my son William, while in January of 2009, I was seven more than three times as old as him” is somehow not “why do you not know the age of your own kid?”

Real conversation I had with a class a few years ago:

Me: what do you think so far about how weight affects the speed that things fall?

Students (intoning): “Everything falls at the same speed.”

Me: So, do you think that’s weird?

Students: No.

Me: But, this book… I can feel the heaviness in my hand.  And this pencil, I can barely feel it at all.  It feels like the book is pulling harder downward on my hand than the pencil is.  Why wouldn’t that affect the speed of the fall?

Student: “It’s not actually pulling harder.  It just feels that way, but that’s weight, not mass.”

Me: (weeps quietly)

Please don’t lecture me about the physics.  I’m aware.  Please also don’t lecture me about the terrible fake-Socratic-teaching I’m doing in that example dialogue.  I’m aware of that too.  I’m just saying that students often perceive these to contradict their lived experience, and research shows that outside of classrooms, even those who said the right things on the test usually go right back to thinking what they thought before.

And no, I’m not comparing the role of teachers to the role of Presidents or Prime Ministers.  I do realize they’re different.

Should I Conclude Any of These Things?

  1. Students’ ability to fail to retain or synthesize things that don’t make sense to them is actually a healthful and critically needed form of resistance.
  2. When teachers complain about students “just memorizing what they need for the test and forgetting it after, without trying to really digest the material,” what we are complaining about is their fascism-prevention mechanism.
  3. Teachers have the opportunity to be the “warm up,” the “opening act” — the small-scale practice ground where young minds practice repeating things they don’t believe, thinking they can safely forget them later.
  4. Teachers have the opportunity to be the “inoculation” — the small-scale practice ground where young minds can practice “honoring their dissatisfaction” in a way that, if they get confident with it, might have a chance at saving their integrity, their souls, and their democracy.

Extension Problem

Applying this train of thought to the conventional ways of doing corporate diversity training is left as an exercise for the reader.


I heart zero

Here are some conversations that come up every year.

1. Zero Current

Student: “I tried to measure current, but I couldn’t get a reading.”

Me: “So the display was blank?”

Student: “No, it just didn’t show anything.”

(Note: Display showed 0.00)

2. Zero Resistance

Student: “We can’t solve this problem, because an insulator has no resistance.”

Me: “So it has zero ohms?”

Student: “No, it’s too high to measure.”

3. Zero Resistance, In a Different Way

Student: “In this circuit, X = 10, but we write R = 0 because the real ohms are unknown.”

(Note: The real ohms are not unknown.  The students made capacitors out of household materials last week, so they have previously explored that the plates have approximately 0 ohms and the dielectric is considered open.)

4. Zero Resistance Yet Another Way

Student: “I wrote zero ohms in my table for the resistance of the battery since there’s no way to measure it.”

What I Wonder

  • Are students thinking about zero as an indicator that means “error” or “you’re using the measuring tool wrong”?  A bathroom scale might show zero if you weren’t standing on it.  A gas gauge shows zero when the car isn’t running.
  • When students say “it has none,” as in example 2, what is it that there is none of?  They might mean “it has no known value”, which might be true, as opposed to “it has no resistance.”
  • Is this related to a need for more concreteness?  For example, would it help if we looked up the actual resistance of common types of insulation, or measured it with a megger?  That way we’d have a number to refer to.
  • #3 really stumps me. Is this a way of using “unknown” because they’re thinking of the dielectric as an insulator that is considered “open”, so that #3 is just a special case of #2?  Or is it unknown because the plates are considered to have 0 resistance and the dielectric is considered open, so we “don’t know” the resistance because it’s both at the same time?  The particular student who said that one finds it especially hard to express his reasoning and so couldn’t elaborate when I tried to find out where he was coming from.
  • Why does this come up so often for resistance, and sometimes for current, while I can’t think of a single example for voltage?  I suspect it’s because both resistance and current feel concrete, like real phenomena that students could visualize, so they’re more able to experiment with their meaning.  I think they’re avoiding voltage altogether: first, it’s about energy, which is weird to begin with; then, it’s a difference of energies, which makes it less concrete because it’s not really the amount of anything, just the difference between two amounts; and on top of that, we never get to find out what the actual energies are, only the difference between them.  No wonder it’s so abstract and hard to think about.
  • Since this comes up over and over about measurement, is it related to seeing the meter as an opaque, incomprehensible device that might just lie to you sometimes?  If so, this might be a kind of intellectual humility, acknowledging that they don’t fully understand how the meter works.  That’s still frustrating to me though, because we spend time at the beginning of the year exploring how the meter works — so they actually do have the information to explain what inside the meter could show a 0A reading.  Maybe those initial explanations about meters aren’t concrete enough — perhaps we should build one.  Sometimes students assume explanations are metaphors when actually they’re literal causes.
  • Is it related to treating automated devices in general as “too complicated for normal people to understand”?  If that’s what I’m reading into the situation, it explains why I have weirdly disproportionate irritation and frustration — I’m angry about this as a social phenomenon of elitism and disempowerment, and I assess the success of my teaching partly on the degree to which I succeed in subverting it… both of which are obviously not my students’ fault.

Other Thoughts

One possibility is that they’re actually proposing an idea similar to the database meaning of “null” — something like unknown, or undefined, or “we haven’t checked yet.”
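That database analogy has a direct counterpart in programming, where “zero” and “no value” are kept strictly apart.  A minimal Python illustration (the variable names are mine, chosen to match the circuit examples above):

```python
# Zero and "unknown" are different answers, and code keeps them apart.
measured_current = 0.0   # we measured, and the reading really was zero
unmeasured = None        # we haven't measured yet -- the "null" sense

assert measured_current == 0.0        # zero is a real, known value
assert unmeasured is None             # None carries no magnitude at all
assert measured_current is not None   # so a zero reading is NOT "unknown"
```

Python’s None plays roughly the role of the placeholder symbol discussed below: it records “we haven’t checked yet” without smuggling in a magnitude.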

I keep suspecting that this is about a need for more symbols.  Do we need a symbol for “we don’t know”?  It should definitely not be phi, and not the null symbol — it needs to look really different from zero.  Question mark maybe?

If students are not used to school-world tasks where the best answer is “that’s not known yet” or “that’s not measurable with our equipment”, they may be in the habit of filling in the blank.  If that’s the case, having a place-holder symbol might help.

This year, I’ve really started emphasizing the idea that zero, in a measurement, really means “too low to measure”.  I’ve also experimented with guiding them to decipher the precision of their meters by asking them to record “0.00 mA” as “< 5 uA”, or whatever is appropriate for their particular meter.  It helps them extend their conceptual fluency with rounding (since I am basically asking them to “unround”); it helps us talk about resolution, and it can help in our conversation about accuracy and error bars.  Similarly, “open” really means “resistance is too high to measure” (or relatedly, too high to matter) — so we find out what their particular meter can measure and record it as “> X MOhms”.
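One way to make the “unrounding” concrete: a rounded display only pins the true value inside half a step of the meter’s last digit.  A minimal sketch in Python (the function name and the 0.01 mA step are my assumptions for illustration, not a standard):

```python
def reading_to_bound(display, step):
    """Interpret a rounded meter display as a bound on the true value.

    A meter whose last digit moves in increments of `step` rounds the
    true value, so a display of 0.00 only means the true value is less
    than half a step.  Units are whatever the meter displays.
    """
    half = step / 2
    if display == 0:
        return f"< {half}"
    return f"{display} ± {half}"

# A 0.00 mA display on a meter with 0.01 mA steps means "< 0.005 mA",
# i.e. "< 5 uA" -- the "unrounding" described above.
print(reading_to_bound(0.00, 0.01))  # -> < 0.005
```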

The downside there is that they start to want to use those numbers for something.  They have many ways of thinking about the inequality signs, and one of them is to simply make up a number that corresponds to their idea of “significantly bigger”.  For example, when solving a problem, if they’re curious about whether electrons are actually flowing through air, they may use Ohm’s law and plug in 2.5 MOhms for the resistance of air.  At first I rolled with it, because it was part of a relevant, significant, and causal line of thinking.  The trouble was that I then didn’t know how to respond when they started assuming that 2.5 MOhms was the actual resistance of air (any amount of air, incidentally…), and my suggestion that air might also be 2.0001 MOhms was met with resistance. (Sorry, couldn’t resist.) (Ok, I’ll stop…)
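For what it’s worth, the inequality can be carried all the way through Ohm’s law without inventing a number like 2.5 MOhms: a lower bound on resistance gives only an upper bound on current.  A sketch with values I’ve assumed for illustration (12 V supply, 2 MOhms as the meter’s top of range):

```python
# If all we know is R > 2 MOhm, Ohm's law gives a bound on I, not a value:
# I = V / R  <  V / (2 MOhm).
V = 12.0            # supply voltage (assumed for illustration)
R_min = 2e6         # the meter's top of range: R is at least this many ohms

I_max = V / R_min   # the most current that could possibly be flowing
print(f"I < {I_max * 1e6:.0f} uA")  # -> I < 6 uA
```

The answer stays honest: it tells you the current through the air is below 6 uA, without pretending to know the resistance of air.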

I’m afraid that this is making it hard for them to troubleshoot.  Zero current, in particular, is an extremely informative number — it means the circuit is open somewhere.  That piece of information can solve your problem, if you trust that your meter is telling you a true and useful thing. But if you throw away that piece of information as nonsense, it both reduces your confidence in your measurements, and prevents you from solving the problem.

Some Responses I Have Used

“Yes, your meter is showing 0.00 because there is 0.00 A of current flowing through it.”

“Don’t discriminate against zero — it isn’t nothing, it’s something important.  You’ll hurt its feelings!”

Not helpful, I admit!  If inquiry-based learning means that “students inquire into the discipline while I inquire into their thinking”*, neither of those is happening here.

Some Ideas For Next Year

  • Everyone takes apart their meter and measures the current, voltage, and resistance of things like the current-sense resistor, the fuse, the leads…
  • Insist on more consistent use of “less than 5 uA” or “greater than 2 MOhms” so that we can practise reasoning with inequalities
  • “Is it possible that there is actually 0 current flowing?  Why or why not?”
  • Other ideas?

*I stole this definition of inquiry-based learning from Brian Frank, on a blog post that I have never found again… point me to the link, someone!

Some interesting comments on my recent post about causal thinking have got my wheels turning.  They put me in mind of the conversation at Overthinking My Teaching about whether “repeated addition” is the best way to approach teaching exponents.  In that post, Christopher Danielson points out the helpfulness of shifting from “Why is Approach X wrong?” or even “Which approach is correct?” toward “What is gained and lost when using Approach X?”

In that light, I’m thinking back on my post and the comments.  For example:


I talk about the difference between “who/what you are” (the definition of you) and “what caused you” (a meeting of sperm and egg).  In the systems of belief that my students tend to have, people are not thought to “just happen” or “cause themselves.”  It can help open the conversation.  However, even when I do this, they are surprisingly unlikely to transfer that concept to atomic particles.


“Purpose is a REAL facet in all of nature because everything has a natural function e.g., the role of mitochondria in eukaryotic cells is ATP production, or that the nature of negatively charged electrons is to attract and repel + and – charged particles respectively, etc.”


But I think it’s the same mistake to presume that they really *mean* that the electron has desires and wants, which is a slippery slope to thinking they *can’t* access or feel the need to explore the deeper causal relationships.

I’m noticing that there are ideas I expect students to extend from humans to particles (forces can act on us), and ideas I expect them to find not-extensible (desire).  These examples are the easy ones; “purpose” is harder to place clearly in one category or the other, and “cause” probably belongs in both categories but means something different in each.  I need to think more clearly about which ones are which and why, and how to help students develop their own skills for distinguishing.

I’m trying to stop assuming that when students talk about electrons’ “desires,” that they are referring to a deeper story; I also need to avoid assuming that they are not, or that they don’t want to/aren’t drawn to.

I’m on a personal “fast” of discussing electrons’ purposes and desires, at least while I’m in earshot of my students.  It’s hard to break those habits, exactly because they are so helpful.  However, it has the useful result that all the ideas about purpose and desires that are getting thrown around in class come from the students.  The students seem more willing to question them than when the ideas come from me.  Unfortunately they are having a really hard time understanding each other’s metaphors (even though the metaphors are not particularly far-fetched, by my reckoning), and I’m having a really hard time facilitating the conversation to help them see each other’s point of view.  But that still seems better than before, when the metaphors were not getting questioned at all, and maybe not even noticed as metaphors.

My students have recently discovered the convention of describing silicon diodes as having a forward voltage of 0.7 V.  They know that this is not always true — or even usually true, in their experience.  The way they reconciled the difference made for an interesting conversation about abstraction — the verb, not the noun.

After some constructive class discussion about possible approaches, they decided to use the diode’s “turn-on voltage” in predictions.  That’s the smallest voltage at which measurable current will flow — for a rectifier diode using our meters, it’s about 400 mV.  It’s also the voltage that, subtracted from the supply, gives the highest estimate for voltage across the other components and therefore the highest estimate of current.  They thought a high current was the “worst-case scenario” in terms of protecting the diode from damage.  When it turned out in the lab that this made their percent differences unusually high, they were willing to sacrifice accuracy for safety.
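Their worst-case logic can be written out numerically.  A sketch with values I’ve assumed for illustration (10 V supply, 1 kOhm series resistor; 0.4 V is the turn-on voltage described above, 0.7 V the textbook convention):

```python
def predicted_current(v_supply, v_diode, r_series):
    """Series resistor + diode: predict the current from the voltage
    left over after subtracting the assumed diode drop."""
    return (v_supply - v_diode) / r_series

V, R = 10.0, 1000.0
i_turn_on = predicted_current(V, 0.4, R)   # using the measured turn-on voltage
i_textbook = predicted_current(V, 0.7, R)  # using the 0.7 V convention

# The smaller assumed drop leaves more voltage across R, so it predicts
# the larger ("worst-case") current -- the students' safety margin.
assert i_turn_on > i_textbook
print(f"{i_turn_on * 1000:.1f} mA vs {i_textbook * 1000:.1f} mA")  # -> 9.6 mA vs 9.3 mA
```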

So why do authoritative sources say that all silicon diodes have a forward voltage of 0.7 V?  Except for the ones that say it is definitely always 0.6 V?

The students shared their confusion and no small amount of anger.  The problem wasn’t with having chosen some constant value; they got that you had to pick a value to work with when making predictions.  The problem wasn’t the need to abstract information out of the picture; they had discussed several reasonable approaches to that problem and chosen one based on their evidence and judgement.  Their problem was with sources that never mentioned that a choice had been made at all.

They were irritated, considering this at best a “mistake” and at worst a “lie.”  As I often do, I asked the students “why would a reasonable textbook author do this?”  Here are their answers:

It could be a typo.

It could be a shortcut for the author’s convenience.

Maybe they learned it that way, so they put it in their textbook that way.

Maybe the authors are so experienced that they forgot that they made an assumption.

When the students ran out of ideas, I contributed mine: that the author had done this deliberately to make things simpler for students.  They were stunned.  How could anyone think it would be easier to have a “fact” printed in the textbook that was clearly contradicted by their measurements?  How could anyone not realize that it made them doubt their skill, even their perception of reality?  They were describing feeling “gaslit.”

I confess that I was delighted.  It marks a shift in their thinking about science: away from judging reality according to how well it fits their predictions, toward judging predictions according to how well they model reality.  And yes, I called it the “second diode approximation,” and warned them that they would encounter the first and third approximations as well.

But mostly, I was sad about how consistently teaching materials do this. The fact that an abstraction has an official name is not a justification for introducing it first in a curriculum.  I am more and more sure that my students understand more when we start from complexity as we experience it, then move toward idealized concepts only if they help us get closer to a goal.

Brian Frank gives a bunch of examples and helpful exercises for current or aspiring teachers, including this quote:

The shortcuts, omissions, and ‘simplifications’, which are meant to reduce complexity are not conducive to understanding; they are specious, and they make genuine understanding extremely difficult. (Arons, “Teaching Introductory Physics”, pg. 24)

Will this always be true?  If not, how could I distinguish contexts in which it would help to go the other way?  What else can I do to “inoculate” students against these approaches when they inevitably encounter them?