You are currently browsing the category archive for the ‘Causality’ category.

The first-year students are solving series circuits and explaining what’s happening.  Most are able to connect their answers and thoughts to evidence we’ve gathered this semester.  But most are struggling with the questions about causality.

For each effect they describe mathematically, I ask them to explain what is physically capable of causing that effect. Or, they can choose to explain why the result seems like it can’t be happening. It doesn’t have to be canonical, but it must be internally consistent, not circular, and supported by our evidence. They are struggling most with explaining Kirchhoff’s Voltage Law. This is understandable — I don’t think I could explain it heuristically either. However, only one student took the opportunity to say why it doesn’t make sense.
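For readers who haven’t seen it recently, here is the bookkeeping KVL asks students to accept, as a minimal numerical sketch (the component values are made up for illustration):

```python
# Kirchhoff's Voltage Law in a series circuit: the battery's voltage
# equals the sum of the voltage drops across the components.
# Values below are invented for illustration.

battery_v = 5.0                # volts
resistances = [100.0, 150.0]   # two series resistors, in ohms

# In a series circuit, the same current flows through everything:
current = battery_v / sum(resistances)

# Ohm's Law gives each component's share of the voltage:
drops = [current * r for r in resistances]

print(drops)        # each resistor's drop (roughly 2 V and 3 V here)
print(sum(drops))   # KVL says this must equal battery_v
```

The arithmetic is easy; the causal question (*why* must the drops add up to exactly the battery’s voltage?) is the part the students are stuck on.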

We’ve done lots of practice writing cause statements.  They know what “begging the question” means.  I’ve modelled, and we’ve practised, the importance of saying “I don’t know” when that’s the most accurate thing we can say. Examples of student thinking are below.

I’m tempted to propose a taxonomy of acausal strategies.  Which examples of student thinking do you think fit where?  Would you add or remove categories?   Could you propose some pithy names for them?

  1. It does that because it’s designed to do that
  2. It does that because if it didn’t, this other important thing wouldn’t happen
  3. It does that because there’s a law that says it has to do that
  4. It does that because it does that (begging the question)
  5. It does that just because

My questions are:

“An electron has to use up all its energy that it gets from the battery.  This is caused because if all of the energy wasn’t used, the circuit wouldn’t give accurate results, or work properly.”

“When electrons pass through a component, that causes them to lose energy.  The electrons would have to be able to flow through the circuit in order to keep the current and battery functioning.”

“An electron has to use up all the energy it gets from the battery.  This is caused because if the voltage from the power source is 5V, the electrons have to use up all of their energy, in this case they use up all of it in the resistor (except for the little energy used in the switch).”

“The electrons always use up exactly the energy they gain in the battery because of conservation of energy.”

“It doesn’t make sense that if there’s only one component in the circuit, it always uses up exactly the battery’s voltage.  A higher resistor should be like a steeper hill — harder for the electrons to get past, and requiring more energy.”

How are these students thinking about causality?

What should I ask next?

“Electrical charge is caused due to the movement of electrons from atom to atom.”

“The appearance and properties of atoms are changed cause protons are added or removed from it.”

“Atoms are the basic building block of matter because all matter contains atoms.”

“Atoms are electrons, protons, and neutrons and are bound together by magnetic forces.”

“Electrons excess makes charge negative, while protons excess makes charge positive.  Why are these the charges?”

“Electrons cancel out protons because of the protons’ positive charge.”

“Electrons likely move so slow due to the difficulty of exerting force on them.”

“Electrons in motion cause excess energy called tails.”

“When electrons are further away it causes them to have higher energy levels.”

“The positive parts ‘want’ electrons because they are oppositely charged and so they are attracted to each other.”

“A photon absorbed by an electron causes it to escape from the atom.”

“What causes charge to never be created or destroyed?”






Some interesting comments on my recent post about causal thinking have got my wheels turning.  They put me in mind of the conversation at Overthinking My Teaching about whether “repeated addition” is the best way to approach teaching exponents. In that post, Christopher Danielson points out the helpfulness of shifting from “Why is Approach X wrong” or even “Which approach is correct” toward “What is gained and lost when using Approach X?”

In that light, I’m thinking back on my post and the comments.  For example:


I talk about the difference between “who/what you are” (the definition of you) and “what caused you” (a meeting of sperm and egg).  In the systems of belief that my students tend to have, people are not thought to “just happen” or “cause themselves.”  It can help open the conversation.  However, even when I do this, they are surprisingly unlikely to transfer that concept to atomic particles.


“Purpose is a REAL facet in all of nature because everything has a natural function e.g., the role of mitochondria in eukaryotic cells is ATP production, or that the nature of negatively charged electrons is to attract and repel + and – charged particles respectively, etc.”


But I think it’s the same mistake to presume that they really *mean* that the electron has desires and wants, which is a slippery slope to thinking they *can’t* access or feel the need to explore the deeper causal relationships.

I’m noticing that there are ideas I expect students to extend from humans to particles (forces can act on us), and ideas I expect them to find not-extensible (desire).  These examples are the easy ones; “purpose” is harder to place clearly in one category or the other, and “cause” probably belongs in both categories but means something different in each.  I need to think more clearly about which ones are which and why, and how to help students develop their own skills for distinguishing.

I’m trying to stop assuming that when students talk about electrons’ “desires,” they are referring to a deeper story; I also need to avoid assuming that they are not, or that they don’t want to/aren’t drawn to.

I’m on a personal “fast” of discussing electrons’ purposes and desires, at least while I’m in earshot of my students.  It’s hard to break those habits, exactly because they are so helpful.  However, it has the useful result that all the ideas about purpose and desires that are getting thrown around in class come from the students.  The students seem more willing to question them than when the ideas come from me.  Unfortunately they are having a really hard time understanding each other’s metaphors (even though the metaphors are not particularly far-fetched, by my reckoning), and I’m having a really hard time facilitating the conversation to help them see each other’s point of view.  But that still seems better than before, when the metaphors were not getting questioned at all, and maybe not even noticed as metaphors.

Michael Pershan kicked my butt recently with a post about why teachers tend to plateau in skill after their third year, connecting it to Cal Newport’s ideas such as “hard practice” (and, I would argue, “deep work”).

Michael distinguishes between practice and hard practice, and wonders whether blogging belongs on his priority list:

“Hard practice makes you better quickly. Practice lets you, essentially, plateau. …

Put it like this: do you feel like you’re a 1st year teacher when you blog? Does your brain hurt? Do you feel as if you’re lost, unsure how to proceed, confused?
If not, you’re not engaged in hard practice.”

Ooof.  On one hand, it made me face my desire to avoid hard practice; I feel like I’ve spent the last 8 months trying to decrease how much I feel like that.  I’ve tried to create classroom procedures that are more reusable and systematic, especially for labs, whiteboarding sessions, class discussions, and model presentations.

It’s a good idea to periodically take a hard look at that avoidance, and decide whether I’m happy with where I stand.  In this case, I am.  I don’t think the goal is to “feel like a first year teacher” 100% of the time; it’s not sustainable and not generative.  But it reminds me that I want to know which activities make me feel like that, and consciously choose some to seek out.

Michael makes this promise to himself:

It’s time to redouble my efforts. I’m half way through my third year, and this would be a great time for me to ease into a comfortable routine of expanding my repertoire without improving my skills.

I’m going to commit to finding things that are intellectually taxing that are central to my teaching.

It made me think about what my promises are to myself.

Be a Beginner

Do something every summer that I don’t know anything about and document the process.  Pay special attention to how I treat others when I am insecure, what I say to myself about my skills and abilities, and what exactly I do to fight back against the fixed-mindset that threatens to overwhelm me.  Use this to develop some insight into what exactly I am asking from my students, and to expand the techniques I can share with them for dealing with it.

Last summer I floored my downstairs.  The summer before that I learned to swim — you know, with an actual recognizable stroke.  In both cases, I am proud of what I accomplished.  In the process, I was amazed to notice how much concentration it took not to be a jerk to myself and others.

Learn More About Causal Thinking

I find myself being really sad about the ways my students think about causality.  On one hand, I think my recent dissections of the topic are a prime example of “misconceptions listening” — looking for the deficit.  I’m pretty sure my students have knowledge and intuition about cause that I can’t see, because I’m so focused on noticing what’s going wrong.  In other words, my way of noticing students’ misconceptions is itself a misconception.  I’d rather be listening to their ideas fully, doing a better job of figuring out what’s generative in their thinking.

What to do about this? If I believe that my students need to engage with their misconceptions and work through them, then that’s probably what I need too. There’s no point in my students squashing their misconceptions in favour of “right answers”; similarly, there’s no point in me squashing my sadness and replacing it with some half-hearted “correct pedagogy.”

Maybe I’m supposed to be whole-heartedly happy to “meet my students where they are,” but if I said I was, I’d be lying. (That phrase has been used so often to dismiss my anger at the educational malpractice my students have endured that I can’t even hear it without bristling).  I need to midwife myself through this narrow way of thinking by engaging with it.  Like my students, I expect to hold myself accountable to my observations, to good-quality reasoning, to the ontology of learning and thinking, and to whatever data and peer feedback I can get my hands on.

My students’ struggle with causality is the puzzle from which my desire for explanation emerged; it is the source of the perplexity that makes me unwilling to give up. I hope that pursuing it honestly will help me think better about what it’s like when I ask my students to do the same.

Interact with New Teachers

Talking with beginning teachers is better than almost anything else I’ve tried for forcing me to get honest about what I think and what I do.  There’s a new teacher in our program, and talking things through with him has been a big help in crystallizing my thoughts (mutually useful, I think).  I will continue doing this and documenting it.  I also put on a seminar on peer assessment for first-year teachers last summer; it was one of the more challenging lesson plans I’ve ever written.  If I have another chance to do this, I will.

Work for Systemic Change

I’m not interested in strictly personal solutions to systemic problems.  I won’t have fun, or meet my potential as a teacher, if I limit myself to improving me.  I want to help my institution and my community improve, and that means creating conditions and communities that foster change in collective ways.  For two years, I tried to do a bit of this via my campus PD committee; for various reasons, that avenue turned out not to lead in the directions I’m interested in going.  I’ve had more success pressing for awareness and implementation of the Workplace Violence Prevention regulations that are part of my local jurisdiction’s Occupational Health and Safety Act.

I’m not sure what the next project will be, but I attended an interesting seminar a few months ago about our organization’s plans for change.  I was intrigued by the conversations happening about improving our internal communication.  I’ve also had some interesting conversations recently with others who want to push past the “corporate diversity” model toward a less ahistorical model of social justice or cultural competence.  I’ll continue to explore those to find out which ones have some potential for constructive change.

Design for Breaks

I can’t do this all the time or I won’t stay in the classroom.  I know that now.  As of the beginning of January, I’ve reclaimed my Saturdays.  No work on Saturdays.  It makes the rest of my week slightly more stressful, but it’s worth it.  For the first few weeks, I spent the entire day alternately reading and napping.  Knowing that I have that to look forward to reminds me that the stakes aren’t as high as they sometimes seem.

I’m also planning to go on deferred leave for four months starting next January.  After that, I’ve made it a priority to find a way to work half-time.   The kind of “intellectually taxing” enrichment that I need, in order for teaching to be satisfying, takes more time than is reasonable on top of a full-time job.  I’m not willing to permanently sacrifice my ability to do community volunteer work, spend time with my loved ones, and get regular exercise. That’s more of a medium-term goal, but I’m working a few leads already.

Anyone have any suggestions about what I should do with 4 months of unscheduled time starting January 2014?

My students use the same assessment rubric for practically every new source of information we encounter, whether it’s something they read in a book, data they collected, or information I present directly.  It asks them to summarize, relate to their experience, ask questions, explain what the author claims is the cause, and give support using existing ideas from the model.  The current version looks like this (click through to zoom or download):

Assessment for Learning

There are two goals:

  • to assess the author’s reasoning, and help us decide whether to accept their proposal
  • to assess one’s own understanding

If you can’t fill it in, you probably didn’t understand it.  Maybe you weren’t reading carefully, maybe it’s so poorly reasoned or written that it’s not actually understandable, or maybe you don’t have the background knowledge to digest it.  All of these conditions are important to flag, and this tool helps us do that.

The title says “Rubric for Assessing Reasoning,” but we just call them “feedbacks.”

Recently, there has been a spate of feedbacks turned in with the cause and/or the “support from the model” section left blank or filled with vague truisms (“this is supported by lots of ideas about atoms,” or “I’m looking forward to learning more about what causes this.”)

I knew the students could do better — all of them have written strong statements about cause in the past (in chains of cause and effect 2-5 steps long).  I also allow students to write a question about cause, instead of a statement, if they can’t tell what the cause is, or if they think the author hasn’t included it.

So today, after I presented my second draft of some information about RMS measurements, I showed some typical examples of causal statements and supporting ideas.  I asked students to rate them according to their significance to the question at hand, then had some small group discussions.  I was interested (and occasionally surprised) by their criteria for what makes a good statement of cause, and what makes a good supporting idea.  Here’s the handout I used to scaffold the discussions.
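For context, the relationships at stake in that presentation can be checked numerically. This is a sketch, assuming a pure sine wave and a made-up peak value; it brute-forces the RMS value by averaging v(t)² over one cycle and compares it to the textbook formula:

```python
import math

# For a sine wave: Vpp = 2 * Vpk, and Vrms = Vpk / sqrt(2).
# Check the RMS relation by brute force: average v(t)^2 over one cycle,
# then take the square root.
v_pk = 10.0       # peak voltage (invented value)
n = 100_000       # samples over one cycle

samples = [v_pk * math.sin(2 * math.pi * k / n) for k in range(n)]
v_rms_numeric = math.sqrt(sum(v * v for v in samples) / n)

v_pp = 2 * v_pk                    # peak-to-peak
v_rms_formula = v_pk / math.sqrt(2)

print(v_pp)             # 20.0
print(v_rms_numeric)    # ≈ 7.071, matching the formula
print(v_rms_formula)    # ≈ 7.071
```

A good causal statement would have to say *why* the mean of the squares is the meaningful average here (equal heating in a resistor), not just restate the formula.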

The students’ results:

A statement of cause should …

  • Be relevant to the question
  • Help us understand the question or the answer
  • Not leave questions unanswered
  • Give lots of info
  • Relate to the model
  • Explain what physically makes something happen, or ask a question that would help you understand the physical cause
  • Help you distinguish between similar things (like the difference between Vpk, Vpp, Vrms)
  • Not beg the question (not state the same thing twice using different words)
  • Be concrete
  • Make the new ideas easier to accept
  • Use definitions

Well, I was looking for an excuse to talk about definitions — I think this is it!

Supporting ideas from the model should…

  • Help clarify how the electrons work
  • Help answer or clarify the question
  • Directly involve information to help relate ideas
  • Help us see what is going on
  • Give us reasoning so we can in turn have an explanation
  • Clarify misunderstandings
  • Allow you to generalize
  • Support the cause, specifically.
  • Be specific to the topic, not broad (like, “atoms are made of protons, electrons, and neutrons.”)
  • Not use a formula
  • It helps if you understand what’s going on, it makes it easier to find connections

The Last Word

Which ones would you emphasize? What would you add?

Overheard while the students discussed the difference between I vs. V characteristics of light bulbs and diodes.


Facilitating the process:

What else do we know?

Are we going to analyze predictions and measurements?  Or just measurements?

So forward voltage is one category, reverse is another?

So, what have we concluded so far?

Do we have to write down our data?

I’m going to keep writing down the data.

So basically what you had was…

Were you maybe reading it like…

So what should we put here?


Seeking Causes:

But it wouldn’t be through the LED.  The voltmeter was shorting out the LED.

So they’re about the same, what’s the reason for that?


Holding our thinking to the model:

So this is actually supporting our idea…

One thing I noticed was that as voltage increased, current increased

I thought it always had all the voltage right there.

The current is supposed to go up, according to predictions.


Seeking patterns

Was VR1 always 0?

So forward voltage is one category, reverse is another?

Do you have the same figures for positive and negative voltage? [Reply] Well, let’s compare.

So they’re about the same, what’s the reason for that?

I think there’s something wrong there.

So we can’t compare these to each other.

What I did was use Ohm’s Law, that you have to do that for each point individually.

I think the resistance will decrease because…

Diodes are crazy!

It probably works like a switch.
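The Ohm’s Law comment above (“you have to do that for each point individually”) is worth making concrete: a diode has no single resistance, so V/I has to be computed one measured point at a time. A sketch, with invented (V, I) pairs rather than real measurements:

```python
# For a nonlinear device like a diode, resistance isn't constant,
# so R = V / I must be computed point by point.
# The (V, I) pairs below are invented for illustration.
points = [(0.5, 0.0001), (0.6, 0.001), (0.7, 0.010)]  # (volts, amps)

for v, i in points:
    r = v / i  # Ohm's Law applied to one data point at a time
    print(f"V = {v} V, I = {i} A  ->  R = {r:.0f} ohms")

# The computed resistance falls as the current rises -- one reason
# a single "resistance of the diode" number is meaningless.
```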

Previously, in data analysis sessions since September:

Students were having trouble drawing any conclusions, noticing any patterns, or thinking about cause at all when they broke into groups to analyze data the class had generated.  There was never enough time, too long had passed since the measurements were taken, and they had too little background knowledge.  They floundered and fussed, getting increasingly annoyed and disoriented, while I tried to make them think by sheer force of will, running around steering them away from insignificant details (like, “all the voltages are even numbers”).

Lesson #1: Procedural Fluency

This semester started off the same.  I asked them to characterise the I vs. V response of a lightbulb and of an LED, to look for similarities and differences.  As I wrote previously, most of them were up to their gills just trying to wrestle their measurement equipment into submission.  They finally completed their measurements, but without any awareness of what was going on, what that graph meant, etc.

At my wits’ end, I had them do it again with two other models of diode.  To me, this felt almost punitive — like handing someone an identical worksheet and telling them to start over.  To try to make it a bit more palatable, I seized on their frustration about how “Mylène’s labs are so LONG” and told them we weren’t going to cover anything new — we were just going to do some speed practice, so I could show them some techniques for increasing speed without sacrificing accuracy.

I helped them strategize about how to set up a table for measurements (they were writing their measurements out in paragraphs… yikes).  I also got much more directive than usual, and informed them that everyone was required to use two meters simultaneously (many were using a single meter to switch back and forth between measuring voltage and current… with attendant need to unhook the circuit TWICE for every data point!!).  There was big buy-in for this, as they immediately saw that they were going to get an entire data set in a single class.  I saved a few minutes at the end of class for students to share their own time-saving ideas with their classmates.

What I didn’t realize was that they had internalized so little information about diodes that blue LEDs seemed like a whole different project than red LEDs.  I was worried they would mutiny about being forced to redo something they’d already finished, but I was wrong.  They welcomed, with relief, the opportunity to do something that was recognizable, with a format and a set of instructions that they had already worked the kinks out of.  Moral of the story: it’s the background knowledge, stupid.  (I can hear Jason Buell‘s voice in my head all the time now).

Lesson #2: Distributed Practice

I also realized that asking this group to sit down with some data and analyze the patterns in an hour is not going to happen.  I figured it was mostly about having enough time (and not feeling pressured), so I started requiring them to keep track of “what did you notice?  what did you wonder?” while they were measuring.  After they were done measuring, I also required them to write some notes to themselves: explanations of anything in the lab that supported the model, and questions about anything that wasn’t supported by the model or that seemed weird (“When you find something funny, measure the amount of funny.” [Bob Pease of National Semiconductor, probably apocryphal]).

That meant they could take their time, tease out their thoughts, and write down whatever they noticed.  When it was time to sit down in data analysis session, they had already spent some time thinking about what was significant in their measurements.  They had also documented it.

Lesson #3: Expect them to represent their own data

In the past, I’ve made a full record of the class’s data and given a copy to every student.  My intention was that they would comb through the evidence in a small group — maybe splitting up the topics (“you look at all the red LEDs — do they all turn on at 1.7?  I’ll check the blue ones”) — and everyone would be able to engage with the conversation, no matter whose data we were discussing.  My other intention was that they would take better notes if they knew other students would read them.  It worked last year … but this year I got extremely tidy notes, written out painstakingly slowly so the writing was legible… with measurements buried in paragraphs.

Last week, I asked everyone to get into small groups with people who were not their lab partner.  They were not required to analyze the whole class’s data — only the data of the people in the small group, who would be expected to explain it to the others.

The students loved it because they were analyzing 4 data sets, not 9.  So they were happy.  I was happy too, because, from out of nowhere, the room exploded in a fury of scientific discourse.  “Oh?  I got a different number.  How did you measure it?”  “Does everybody have…?” “Will it always be…?” “Why wouldn’t it…?” “That’s what we’d expect from the model, because…”

I was floored.  Since I didn’t have to run around putting out fires, I found my brain magically tuned in to their conversations — I filled an entire 8.5×11 sheet full of skillful argumentation and evidence-based reasoning that I overheard.  Honestly, I didn’t hear a single teleological, unscientific, or stubbornly antagonistic comment.  Most days I can’t do this at all — I’m too overwhelmed to hear anything but a buzzing cacophony, and they’re too tense to keep talking when I get close.  They didn’t even stop talking when I wandered near their desks — they were all getting their foot in the door, making sure their data made the final cut.

It slowed down a bit when I reminded them that they had to have at least one possible physical cause for anything they proposed (e.g. “the materials and design of the diode cause it to not conduct backwards” is not a cause).  But they picked it back up, with awesome ideas like

  • Maybe the diode acts like a capacitor — it stores up a certain amount of energy
  • Maybe the diode only takes whatever energy it needs to light up, and then it doesn’t take any more
  • Maybe the lightbulb’s resistance went up because it’s a very narrow filament, but it has low resistance.  So when all the current rushes in, there’s no room for more electrons, and that restricts current.
  • Maybe a diode has a break inside, and it takes a certain amount of voltage to push the electrons through the gap.  It’s like shooting electrons out of a cannon — they need a certain force to make it over a ravine.
  • How come electrons in a silicon crystal “bond” and make a pair?  I thought they orbit around the nucleus because electrons repel each other.
  • If a leaving electron creates a positive ion, wouldn’t that attract the same electron that left?

These are not canonical, of course.  But they’re causes!  And questions!   And they have electrons!!  I was so excited.  The students were having fun too — I can tell because when they’re having fun, they like to make fun of me (repeating my stock phrases, pretending to draw from a deck of cards to cold call someone in the audience, etc etc.)

Moral of the story

1. During measurement, you must write down what you noticed, what you wondered/didn’t know.

2. After measurement, you must write down which parts of this the model can explain (students call this “comparing the data to the model.”)  This causes students to actually pull out the model and read it.  Awesome.

3. Anything that can’t be explained by the model?  Articulate a question about it.

4.  If that’s still not working well, and I’m still getting into a battle of wills with students who say that the model doesn’t explain anything about diodes, do the same lab again.  Call it speed practice.

Then, when we share data and propose new ideas to the model, they’ve already spent some time thinking about what’s weird (no reverse current in a diode), what’s predicted surprisingly well by the model (forward current in a diode) and what’s predicted surprisingly badly (current in a lightbulb).  When we sit down to analyze the data, they’re generating those ideas for the second or third time, not the first.

5. Stop making copies of everyone’s data — it allows one strong and/or bossy student to do all the analyzing.  Require that the whiteboards include an example from every person’s data.

6. Watch while they jump in to contribute their own data, compare results and ideas about “why,” facilitate each other’s participation, summarize each other’s contributions, challenge, discuss, and pick apart their data according to the model.

7.  Realize that since I’m less overwhelmed with needing to force them to contribute constructively, I too have much more cognitive capacity left over for listening to the extremely interesting conversations.

How can I help students make causal thinking a habit?  I’ve written before about my struggles helping students “do cause” consistently, and distinguishing between “what made it happen” vs. “what made me think it would happen.”  Most recently, I wrote about how using a biological model of the growing brain might help develop the skills needed to talk about a physical model of atomic particles.

Sweng1948 commented that cause and definition become easy to distinguish when we talk about pregnancy, and seemed a little concerned that it would come off as flippant.  To me, it doesn’t — especially because I use that example all the time. Specifically, I talk about the difference between “who/what you are” (the definition of you) and “what caused you” (a meeting of sperm and egg).  In the systems of belief that my students tend to have, people are not thought to “just happen” or “cause themselves.”  It can help open the conversation.  However, even when I do this, they are surprisingly unlikely to transfer that concept to atomic particles.

Biology Vs. Physics

My students seem to regard cause differently in biology vs. physics.  They are likely to say that eating poorly causes malnutrition and eating well contributes to causing good health; they are less likely to say that the negative charge of electrons causes them to move apart, and more likely to say that electrons move apart because they’re electrons, and that’s what electrons do.

Further, once they conclude that moving two electrons apart causes their repulsion to weaken, they are unable to decide whether moving them closer together strengthens it (I have no idea what to do about this).  It’s also often opaque to students whether one electron is repelling the other, or the second one is repelling the first. This happens in various contexts: the other day, a student presented the idea that cooling a battery would lower its voltage.  Several students were frustrated because they had asked what would raise a battery’s voltage, not what would lower it, and were a bit aggressive in telling the presenting student that he had not answered their question.
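The “closer together” question has a definite answer the students could test their intuitions against: Coulomb’s law scales as 1/r², so halving the separation quadruples the repulsion. A numerical sketch with standard constants (the separations themselves are made-up values):

```python
# Coulomb's law: F = k * q1 * q2 / r^2.
# Moving two electrons apart weakens the repulsion;
# moving them closer together strengthens it, by the square of the ratio.
K = 8.988e9    # Coulomb constant, N*m^2/C^2
E = 1.602e-19  # elementary charge, C

def repulsion(r):
    """Force in newtons between two electrons separated by r metres."""
    return K * E * E / r**2

far = repulsion(2e-10)   # 0.2 nm apart (invented separation)
near = repulsion(1e-10)  # 0.1 nm apart

print(near / far)        # ≈ 4: halving the distance quadruples the force
```

And the law is symmetric: each electron exerts the same magnitude of force on the other, which speaks to the “which one is repelling which” confusion.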

That’s one of the reasons I was interested in using this “brain” model as a way to open the conversation about causality and models in general; they do cause better with biology.  I’ll have to figure out next year how to build a bridge between cells and atoms…

I’m not sure why it’s so difficult.  Here are a few stabs at it:

  1. Is it because they see causality as connected to intention — in other words, you are only causing things if you do them on purpose?
  2. Does their experience of their own conscious agency help them see how their choices are causes that have demonstrable effects — such that things that don’t have choices also seem not to cause things?
  3. Is it because living things are easier to see and relate to than electrons?
  4. Is it because they see cause as inextricably linked to desire?   Something like, “What caused me to buy a bag of candy is that I wanted it. So, electrons must move because they want to.”

I sometimes fool myself into thinking that my students have understood some underlying principle when they anthropomorphize particles and forces: “The electron wants to move toward the proton.” “Voltage is when an electron is ready to move to another atom.”  I assume that they are constructing a metaphor to symbolize what’s going on, or using a verbal shorthand.  Then I realize, many students don’t think of the electron’s “desire” as a metaphor, and can’t connect this to ideas about energy, charge, etc. Consider this my plea to K-12 teachers not to say that stuff, and when students bring it up, to engage with them about what exactly that means.  Desires are things we can use willpower to sublimate.  Forces, not so much.  That’s why it’s called force.

Something about cause leads to students treating particles (and, for that matter, compilers and microprocessors) as if they, like people, might act the way we expect, but they also might not.  I can’t tell whether it’s because there could be an opposing force, or “just because.”  If it’s the former, then there’s a kernel of intellectual humility here that I respect: a sort of submission to the possibility that there are forces we don’t understand, and our model will only work if there are no opposing forces we haven’t accounted for.  However, I often can’t find out whether they’re talking/thinking about science or faith, because the responses to my questions are often defensive, along the lines of “My physics teacher said it’s complicated.  The reason they didn’t teach it to us in high school is that it’s just too hard for anyone to learn, unless they’re a theoretical physicist.” (*sigh*. Hoping the growth-mindset ideas will help with this).

We Can’t Understand It Fully, So There’s No Point

Also, the “we don’t understand it fully” shrug seems to be anti-generative: it leads to an intellectual abdication.  It’s a defence against the idea that we should just go ahead and use our model to make predictions, then test the predictions to find the holes in the model.  Or maybe I’ve got it backwards — maybe the intellectual abdication causes the shrug.  I’m back to growth mindset again, but not about growing ourselves — growth mindset for the model too!  Fixed mindset says there’s no point making a prediction that might be wrong.  Only a growth mindset sees the value in testing a prediction with the intention of helping the model (and ourselves) get stronger.

I expect that the word “potential” is part of the problem here (as in, potential difference and potential energy) — to my students, “potential” means something that you need to make a decision about. They say that they will “potentially” go to the movies that night, which means they haven’t chosen yet.  By that logic, if you have a “potential difference”, that means there might be a difference, but there might not, too. Depending on what the electron decides.  Potential energy?  Maybe you’ve got (or will later have) energy, maybe you don’t.  What’s strong about this thinking is that they’re right that there’s something that “might or might not” happen (current, acceleration, etc.).  What’s frustrating is that I don’t know how to help them unpack the difference between a “force” and a “decision” in a way that actually helps.
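One way I might try to counter the “maybe-energy” reading (a hypothetical classroom sketch, not something from my class notes): potential energy is a definite, computable number the moment the configuration exists, whether or not anything ever “decides” to move.

```python
def gravitational_pe(mass_kg, height_m, g=9.8):
    # PE = m * g * h: a definite quantity as soon as the object is lifted,
    # regardless of whether it is ever dropped.
    return mass_kg * g * height_m

# A 2 kg book on a 1.5 m shelf stores 29.4 J whether or not it falls.
stored = gravitational_pe(2, 1.5)
```

The point of the sketch is that nothing in the calculation depends on a choice: the “might or might not” part is the *conversion* of that energy, not its existence.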

(And no, the connections to the uncertainty principle, the observer effect, the unpredictability of chaotic systems, and the challenges to causality posed by modern physics are not lost on me… but I’d rather my students work through “wrong” conclusions via confidence in reasoning, than come to some shadow of the “right” conclusions via an assumption of their own intellectual inadequacy.)

I went looking for a resource about “growth mindset” that I could use in class, because I am trying to convince my students that asking questions helps you get smarter (i.e. understand things better).  I appreciate Carol Dweck’s work on her website and her book, but I don’t find them

  • concise enough,
  • clear enough, or
  • at an appropriate reading level for my students.

What I found was Diana Hestwood and Linda Russell’s presentation about “How Your Brain Learns and Remembers.”  The authors give permission for non-profit use by individual teachers.  It’s not perfect (I edited out the heading that says “You are naturally smart” … apologies to the authors) and it’s not completely in tune with some of the neuroscience research I am hearing about lately, but it meets my criteria (above) and got the students thinking and talking.

Despite her warning that it’s not intended to stand on its own and that the teacher should lead a discussion, I’d rather poke my eyes out than stand in front of the group while reading full paragraphs off of slides. I found the full-sentence, full-paragraph “presentation” to work on its own just fine (CLARIFIED: I removed all the slides with yellow backgrounds, and ended at slide 48).  I printed it, gave it to the students, and asked them to turn in their responses to the questions embedded in it.  I’ll report back to them with some conversational feedback on their individual papers and some class time for people to raise their issues and questions — as usual, discussion after the students have tangled with the ideas a bit.

The students really went for it.  They turned in answers that were in their own words (a tough ask for this group) and full of inferences, as well as some personal revelations about their own (good and bad) learning experiences.  There were few questions (the presentation isn’t exactly intended to elicit them) but lots of positive buzz.  About half the class stayed late, into coffee break, so they could keep writing about their opinions of this way of thinking.  Several told me that “this was actually interesting!”  (*laugh*)  I also got one “I’m going to show this to my girlfriend” and one, not-quite-accusatory but clearly upset “I wish someone had told me this a long time ago.”  (*gulp*)

I found a lot to like in this presentation.  It’s a non-threatening presentation of some material that could easily become heavily technical and intimidating.  It’s short, and it’s got some humour.  It’s got TONS of points of comparison for circuits, electronic signal theory, even semiconductors (not a co-incidence, obviously).  Most importantly, it allows students to quickly develop causal thinking (e.g. practice causes synapses to widen).

Last year I found out in February that my students couldn’t consistently distinguish between a cause and a definition, and trying to promote that distinction while they were overloaded with circuit theory was just too much.  So this year I created a unit called “Thinking Like a Technician,” in which I introduced the thinking skills we would use in the context of everyday examples. Here’s the skill sheet — use the “full screen” button for a bigger and/or downloadable version.

It helped a bit, but meant that we spent a couple of weeks talking about roller coasters, cars, and musical instruments.  Next year, this is what we’ll use instead.  It’ll give us some shared vocabulary for talking about learning and improving — including why things that feel “easy” don’t always help, why things that feel “confusing” don’t mean you’re stupid, why “feeling” like you know it isn’t a good test of whether you can do it, and why I don’t accept “reviewing your notes” as one of the things you did to improve when you applied for reassessment.

But this will also give us a rich example of what a “model” is, why they are necessarily incomplete and at least a bit abstracted, and how they can help us make judgement calls.  Last year, I started talking about the “human brain model” around this time of the year (during a discussion of why “I’ll just remember the due date for that assignment” is not a strong inference).  That was the earliest I felt I could use the word “model” and have them know what I meant — they were familiar enough with the “circuits and electrons model” to understand what a model was and what it was for.  Next year I hope to use this tool to do it the other way around.

The past semester has been a tough slog with my first-year class.   I’m slowly figuring out what resources and approaches were missing.  Last year, I launched myself headfirst (and underprepared) into inquiry-based learning because most of the class members were overflowing with significant, relevant questions.

This year, the students are barely asking questions at all, and when they do, the questions are not very relevant — they don’t help us move forward toward predicting circuit behaviour, troubleshooting, or any of the other expressed goals we’ve discussed as a class. They’re mostly about electrical safety which, don’t get me wrong, is important, but talking about how people do and don’t get electrocuted has limited value in helping us understand amplifiers.  I felt like I juiced those questions as much as I could, but it only led to more questions about house wiring and car chassis.

If I’m serious about inquiry-based learning, I have to develop a set of tools that allow me to adapt to the group.  Right now I feel like my approach only works if the group is already fairly skilled at distinguishing between what we have evidence for and what we just feel like we’ve heard before, and at asking significant questions that move toward a specific goal.  In other words, I wasn’t teaching them to reason scientifically, I was filtering out those who already knew from those who didn’t.  Here are some of the things I need to be more prepared for.

Measurement technique

I have never had so much trouble getting students to use their meters correctly.  Here we are in second semester, and I still have students confidently using incorrect settings.  I’d be happier if they were unsure, or had questions, but no, many are not noticing that they have problems with this.  And I don’t mean being confused about whether you should measure 1.5V on the 20V or the 2000 mV setting… I mean measuring 0.1 Ohms on the 200 kOhm setting.
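To make the scale of that mistake concrete (a sketch assuming a generic 2000-count handheld meter, which is an assumption on my part — real meters vary): the smallest step the display can show is the range divided by the display count, so on the 200 kOhm range the meter literally cannot register 0.1 Ohms.

```python
def display_resolution(full_scale_ohms, counts=2000):
    # Smallest step a digital meter can display on a given range.
    # Assumes a generic 2000-count display; actual meters differ.
    return full_scale_ohms / counts

# On the 200 kOhm range, each display count is 100 Ohms,
# so a 0.1 Ohm resistance is indistinguishable from zero.
step = display_resolution(200_000)
```

That’s a four-orders-of-magnitude mismatch between the quantity and the instrument setting, which is why I’d rather see hesitation than confidence here.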

I switched this year to teaching them about current first, rather than resistance (like I did last year).  I’m loath to reconsider because current is the only one that lends itself to causal thinking and sense-making early in the year (try explaining resistance to someone who doesn’t know what current is… and “electric potential,” to someone who doesn’t know anything formal about energy or force or fields, is just hell).  Could this be part of why they’re struggling so much to use their meters correctly?  Is there something about the “current first” approach that bogs them down with cognitive load at a stage when they just need some repetitive practice?  I’m curious to check out the CASTLE curriculum, maybe over the summer, to try to figure some of this out.

I created a circuit-recording template last fall that I thought was such a great idea… it had a checklist at the top to help the students notice if they’d forgotten anything.  Guess what?  They started measuring without thinking about the meaning of the measurements — measuring as if it was just something to be checked off a list!  No observations.  No questions. No surprise at unusual or unintuitive numbers.  Damn.  The checklist is gone and never coming back — next year I’ll make sure we only measure things that the students have found a reason to measure.

Last term, I waited far too long to give the quiz on measurement technique.  I knew they weren’t ready, and I kept thinking that if we spent more time practicing measuring (while exploring the questions we had painstakingly eked out), that it would get better.  Finally, we were so far behind that I gave the quiz anyway.  The entire class failed it (not a catastrophe, given the reassessment policy), and the most common comment when we reviewed the quiz was “why didn’t you tell us this before??”  Uh.  Right.  Quiz early, quiz often.

Guess what the teacher wants

The degree of “teacher-pleasing” being attempted is disheartening.  Students are almost always uncomfortable making mistakes, using the word “maybe” in situations where it is genuinely the most accurate way to express the strength of our data, or re-evaluating what they think of as “facts.”  But this group is unusual.  There’s a high rate of students anxiously making up preposterous answers rather than saying “I don’t know.”

I tend toward a pretty aggressive questioning style — the kind of “what causes that, why does that happen” bluntness I would use with colleagues to bat ideas around.  I’ve changed my verbal prompt to “what might cause that?” and “what could possibly be happening?” in the hopes that it would help students discern whether they are certain or not, and also help them transition toward communicating the tentativeness of ideas for which we have little evidence.  Obviously, I take care to draw out the reasoning and evidence in support of ideas, regardless of whether they’re canonical or not, and conversely make sure we discuss evidence against all of our ideas, including the “right” ones. I try to honour students’ questions by tracking them and letting them choose from among the class’s questions when deciding what to investigate next.  But valuing their questions and thinking is clearly not enough.

I gave a test question last semester that asked students to evaluate some “student” reasoning.  It used the word “maybe” in a completely appropriate way, and that’s what I heard outraged responses about from half the class.  They thought the reasoning was poor (and also reported that it was badly written!) because of it.  Again, we practiced explicitly, but sometimes I feel like I’m undermining their faith in “right answer” reasoning without helping them replace it with something better…

On the odd occasion when I ask someone a question and they say “I don’t know,” I make a point of not putting them on the spot, but of gathering info/evidence/ideas from other students for the first student to choose from, or breaking the class into small groups and asking them to discuss.  I try to make sure that the person who said “I don’t know” has as few negative consequences as possible.  Yet the person who says it inevitably looks crestfallen.

Talking in class

The frequency of students speaking up in class is at an all-time low.  I wonder if this has been influenced by my random cold-calling — they figure I’ll call on them eventually so there’s no sense putting their hand up to make a comment or ask a question?  The thing is, they don’t ask those questions when I call on them — just answer the question I ask.

At the same time, the frequency of whispered side conversations is at an all-time high, whether the speaker with the floor is me or another student.  I think I’m unusually sensitive to this — I find it completely distracting, and can barely maintain my train of thought if students are whispering to each other.  Maybe that’s partly my hearing, which is fairly acute — I can actually hear their whole conversation, even if they’re whispering at the back of the room (keep in mind that there are only 17 people and the room is pretty small).  So my standard response to this is one warning during class (followed by a quiet, private conversation after class) — if it happens again, they’re leaving the room.  Is this part of why they’re afraid to talk out loud — because I crack down on the talking under their breath?  I’m open to other ways of responding but out of ideas at the moment.


Even the strongest students are still having trouble explaining causes of physical effects.  They know I won’t accept a formula as a cause, but they can’t explain why, and when I ask someone to explain a cause, they will consistently give a formula anyway (figuring that an answer is always better than no answer, I guess).  Next approaches: asking them to write down the cause, then discuss in groups.

Scientific Discourse

As Jason articulates clearly, I think that my students need more help motivating and strengthening their scientific discourse.  He summarizes a promising-sounding approach called Guided Reciprocal Questioning as follows:

  1. Learn about something.
  2. Provide generic question frames.
  3. Students generate questions individually.
  4. Students discuss the questions in their groups.
  5. Share out.

I do something similar to #1-3, but I’m ready to try #4-5, with appropriate “discussion frames”, to see if I can help the students hold each other accountable to their knowledge.  Right now, they barely propose questions or answers, but when they do, the class seems to accept them, even if they contradict something else we just talked about.

Also, Janet Abercrombie wrote recently in the comments about a Question Formulation Technique that I’d like to look into some more.

Conclusion: It works anyway

The whole experience was kind of heart-breaking.  But the conversations with students kept convincing me that I had to do it anyway.  I don’t know how many students took the time to say to me, “whoa, it seems like you actually want us to understand this stuff.”  The look of astonishment really said it all.  The bottom line is, this group is a much better test of the robustness of my methods than last year’s group could be.