You are currently browsing the category archive for the ‘Critical Thinking’ category.

Early Warning Signs of Fascism

A local media outlet recently wrote

“Why the constant, often blatant lying? For one thing, it functioned as a means of fully dominating subordinates, who would have to cast aside all their integrity to repeat outrageous falsehoods and would then be bound to the leader by shame and complicity. “The great analysts of truth and language in politics” — writes McGill University political philosophy professor Jacob T. Levy — including “George Orwell, Hannah Arendt, Vaclav Havel — can help us recognize this kind of lie for what it is…. Saying something obviously untrue, and making your subordinates repeat it with a straight face in their own voice, is a particularly startling display of power over them. It’s something that was endemic to totalitarianism.”

How often does this happen in our classrooms?  How often do we require students to memorize and repeat things they actually think are nonsense?  

  • “Heavy things fall at the same speed as light things.” (Sure, whatever.)
  • “An object in motion will stay in motion forever unless something stops it.” (That’s ridiculous.  Everyone knows that everything stops eventually.  Even planets’ orbits degrade.)
  • “When you burn propane, water comes out.” (Pul-lease.)
  • The answer to “In January of the year 2000, I was one more than eleven times as old as my son William, while in January of 2009, I was seven more than three times as old as him” is somehow not “Why do you not know the age of your own kid?”
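(For what it’s worth, the riddle does have a consistent answer.  A brute-force check — variable names are my own, not from the problem — finds it:)

```python
# Brute-force the age riddle above (illustrative variable names).
# Jan 2000: father = 11 * son + 1
# Jan 2009: father + 9 = 3 * (son + 9) + 7
for son in range(1, 60):
    father = 11 * son + 1
    if father + 9 == 3 * (son + 9) + 7:
        print(father, son)  # prints "34 3": father 34, son 3 in January 2000
```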

Real conversation I had with a class a few years ago:

Me: what do you think so far about how weight affects the speed that things fall?

Students (intoning): “Everything falls at the same speed.”

Me: So, do you think that’s weird?

Students: No.

Me: But, this book… I can feel the heaviness in my hand.  And this pencil, I can barely feel it at all.  It feels like the book is pulling harder downward on my hand than the pencil is.  Why wouldn’t that affect the speed of the fall?

Student: “It’s not actually pulling harder.  It just feels that way, but that’s weight, not mass.”

Me: (weeps quietly)

Please don’t lecture me about the physics.  I’m aware.  Please also don’t lecture me about the terrible fake-Socratic-teaching I’m doing in that example dialogue.  I’m aware of that too.  I’m just saying that students often perceive these to contradict their lived experience, and research shows that outside of classrooms, even those who said the right things on the test usually go right back to thinking what they thought before.

And no, I’m not comparing the role of teachers to the role of Presidents or Prime Ministers.  I do realize they’re different.

Should I Conclude Any of These Things?

  1. Students’ knack for failing to retain or synthesize things that don’t make sense to them is actually a healthy and critically needed form of resistance.
  2. When teachers complain about students “just memorizing what they need for the test and forgetting it after, without trying to really digest the material,” what we are complaining about is their fascism-prevention mechanism.
  3. Teachers have the opportunity to be the “warm up,” the “opening act” — the small-scale practice ground where young minds practice repeating things they don’t believe, thinking they can safely forget them later.
  4. Teachers have the opportunity to be the “inoculation” — the small-scale practice ground where young minds can practice “honoring their dissatisfaction” in a way that, if they get confident with it, might have a chance at saving their integrity, their souls, and their democracy.

Extension Problem

Applying this train of thought to the conventional ways of doing corporate diversity training is left as an exercise for the reader.


SBG superhero

I stole this graphic from Kelly O’Shea. If you haven’t already, click through and read her whole blog.

By last winter, the second year students were pretty frustrated.  They were angry enough about the workload to go to my department head about it.  The main bone of contention seemed to be that they had to demonstrate proficiency in things in order to pass (by reassessing until their skills met the criteria), unlike in some other classes where actual proficiency was only required if you cared about getting an A.  Another frequently used argument was, “you can get the same diploma for less work at [other campus.]” Finally, they were angry that my courses were making it difficult for them to get the word “honours” printed on their diploma.  *sigh*

It was hard for me to accept, especially since I know how much that proficiency benefits them when competing for and keeping their first job.  But, it meant I wasn’t doing the Standards-Based Grading sales pitch well enough.

Anyway, no amount of evidence-based teaching methods will work if the students are mutinous.  So this year, I was looking for ways to reduce the workload, to reduce the perception that the workload is unreasonable, and to re-establish trust and respect.  Here’s what I’ve got so far.

1. When applying for reassessment, students now only have to submit one example of something they did to improve, instead of two.  This may mean doing one question from the back of the book.  I suspect this will result in more students failing their reassessments, but that in itself may open a conversation.

2. I’ve added a spot on the quiz where students can tell me whether they are submitting it for evaluation, or just for practise.  If they submit it for practise, they don’t have to submit a practise problem with their reassessment application, since the quiz itself is their practise problem.  They could always do this before, but they weren’t using it as an option and just pressuring themselves to get everything right the first time.   Writing it on the quiz seems to make it more official, and means they have a visible reminder each and every time they write a quiz.  Maybe if it’s more top-of-mind, they’ll use it more often.

3. In the past, I’ve jokingly offered “timbit points” for every time someone sees the logic in a line of thinking they don’t share.  At the end of the semester, I always bring a box of timbits in to share on the last day.  In general, I’m against bribery, superficial gamification (what’s more gamified than schooling and grades??), and extrinsic motivation, but I was bending my own rules as a way to bring some levity to the class.  But I realized I was doing it wrong.  My students don’t care about timbits; they care about points.  My usual reaction to this is tight-lipped exasperation.  But my perspective was transformed when Michael Doyle suggested a better response: deflate the currency.

So now, when someone gives a well-thought-out “wrong” answer, or sees something good in an answer they disagree with, they get “critical thinking points”.  At the end of the semester, I promised to divide them by the number of students and add them straight onto everyone’s grade, assuming they completed the requirements to pass.  I’m giving these things out by the handful.  I hope everybody gets 100.  Maybe the students will start to realize how ridiculous the whole thing is; maybe they won’t.  They and I still have a record of which skills they’ve mastered, and it’s still impossible to pass if they’re not safe or not employable.  Since their grades are utterly immaterial to absolutely anything, it just doesn’t matter.  And it makes all of us feel better.

In the meantime, the effect in class has been borderline magical.  They are falling over themselves exposing their mistakes and the logic behind them, and then thanking and congratulating each other for doing it — since it’s a collective fund, every contribution benefits everybody.  I’m loving it.

4. I’ve also been sticking much more rigidly to the schedule of when we are in the classroom and when we are in the shop.  In the past, I’ve scheduled them flexibly so that we could take advantage of whatever emerged from student work.  If we needed classroom time, we’d take it, and vice versa.  But in a context where people are already feeling overwhelmed and anxious, one more source of uncertainty is not a gift.  The new system means we are sometimes in the shop before they’re ready.  I’m dealing with this by cautiously re-introducing screencasts — but with a much stronger grip on reading comprehension techniques.  I’m also making the screencast information available as a PDF document and a print document.  On top of that, I’m adopting Andy Rundquist’s “back flip” technique: screencasts are created after class in order to answer lingering questions submitted by students.  I hope that those combined ideas will address the shortcomings that I think are inherent in the “flipped classroom.”  That one warrants a separate post — coming soon.

The feedback from the students is extremely positive.  It’s early yet to know how these interventions affect learning, but so far the students just seem pleased that I’m willing to hear and respond to their concerns, and to try something different.  I’m seeing a lot of hope and goodwill, which in themselves are likely to make learning (not to mention teaching) a bit easier.  To be continued.

My last post was about encouraging my students to re-evaluate what they think is certain.  I’m trying to help them break the habit of arguing from authority, and encourage them to notice their own thinking… and even to go so far as exposing that thinking to the class!  That’s going to be scary, and it depends on creating a supportive climate.

I responded to a comment on that post, in part: “I do realize that I’m pulling the rug out from under their trust in their own perception of reality, and that’s an unpleasant experience no matter what. Sometimes I think this is actually a spiritual crisis rather than a scientific one.”  To be fair, I’m careful not to suggest that their perception is invalid; only that it is important to notice the evidence that underlies it.  But that means considering the possibility that there isn’t any, or isn’t enough.  In the conversations that follow, the students talk about wondering whether certainty exists at all, and whether anything exists at all, and what knowing means in the first place.  That leads to what it means to “be right”… and then what it means to “do right.”

My best guess is that they have tangled up “right and wrong test answers” with “right and wrong moral behaviour” — being a “good person” means being a good student… usually a compliant one.

So, I’m provoking a moral, or maybe a spiritual, crisis — or maybe exposing an underlying crisis that was there all along.  What do I do about it?  How do I help students enter into that fear without being immobilized or injured by it?  They don’t know what to do when the rigid rules are removed, and I don’t know what to do when they get scared.  What do we do when we don’t know what to do?

Our classroom conversations range over ontology, epistemology, ethics, and, yes, faith. I realize I’m treading on thin ice here; if you think opening a conversation about faith and spirituality in my classroom (or on this blog) is a mistake, I hope you’ll tell me.  But I don’t know how to talk about science without also talking about why it’s not faith, to talk about truth and integrity without talking about what it means to do what’s “right”, why all of these might contribute to your life but one can’t be treated as the other.  And it’s a line of conversation that the students dig into avidly, almost desperately. Putting this stuff on the table seems to offer the best possibilities for building trust, resilience, and critical thinking.

So when the students open up about their fear and anger around what “right and wrong” can mean, I go there (with care and some trepidation).  I’m careful not to talk about particular sects or creeds — but to invite them to think about what they think of as morally right and wrong, and why models of atomic structure don’t fit into that category.

There is occasionally some overlap though.

A historical figure I’ve learned a lot from wrote in her journal about re-evaluating an especially weighty authority…

And then he went on, “‘Christ saith this, and the apostles say this; but what canst thou say?’” …  This opened me so, that it cut me to the heart; and then I saw clearly we were all wrong. So I sat down … and cried bitterly… “We are all thieves; we are all thieves; we have taken the [ideas] in words, and know nothing of them in ourselves.”

Since this belongs to a particular faith community, I don’t bring it into the classroom.  I think about it a lot though; and it’s the spirit I hope students will bring to their re-evaluation of the high school physics they defend so dearly.

If I expect them to respect the “wrong” (bad?  EVIL??) thinking of their classmates, it’s crucially important that they feel respected.  If I want them to stop arguing from authority, I have to be meticulous about how I use mine. One technique I’m going to try tomorrow is sharing with the class some of the “cool moves” I noticed on the most recent quiz.

Despite my angst about this issue, I’m actually thrilled by the curious, authentic, and humble thinking that’s happening all over the place.  So tomorrow I’ll show some of these (anonymous) examples of non-canonical ideas and explain what I think is good about them.  I’ll especially make sure to seek out a few from the students who are the main arguers from authority.


I’ve done a better job of launching our inquiry into electricity than I did last year.  The key was talking about atoms (which leads to thoughts of electrons), not electricity (which leads to thoughts of how to give someone else an electric shock from an electric fence, lightning, and stories students have heard about death by electrocution).

The task was simple: “Go learn something about electrons, about atoms, and about electrical charge.  For each topic, use at least one quote from the textbook, one online source, and one of your choice.  Record them on our standard evidence sheets — you’ll need 9 in total.  You have two hours.  Go.”

I’ve used the results of that 2-hour period to generate all kinds of activities, including

  • group discussions
  • whiteboarding sessions
  • skills for note-taking
  • what to do when your evidence conflicts
  • how to decide whether to accept a new idea

We practiced all the basic critical thinking skills I hope to use throughout the semester:

  • summarizing
  • asking questions about something even before you fully understand it
  • identifying cause and effect
  • getting used to saying “I don’t know”
  • connecting in-school-knowledge to outside-school experiences
  • distinguishing one’s own ideas from a teacher’s or an author’s

I’m really excited about the things the students have gotten curious about so far.

“When an electron jumps from one atom to the next, why does that cause an electric current instead of a chemical reaction?”

“When an electron becomes a free electron, where does it go?  Does it always attach to another atom?  Does it hang out in space?  Can it just stay free forever?”

“What makes electrons negative?  Could we change them to positive?”

“Are protons the same in iron as they are in oxygen?  How is it possible that protons, if they are all the same, just by having more or fewer of them, make the difference between iron and oxygen?”

“If we run out of an element, say lithium, is there a way to make more?”

“Why does the light come on right away if it takes so long for electrons to move down the wire?”

“What’s happening when you turn off the lights?  Where do the electrons go?  Why do they stop moving?”

“What’s happening when you turn on the light?  Something has to happen to push that electron.  Is there a new electron in the system?”

“With protons repelling each other and being attracted to electrons, what keeps the nucleus from falling apart?”

“What happens if you somehow hold protons and electrons apart?”

“Would there be no gravity in that empty space in the atom?  I like how physics are the same when comparing a tiny atom and a giant universe.”

I’m experimenting with ideas from Nancy Kline’s Time To Think.  She discusses listening with undivided attention and respect as a condition for helping people think well.  She asks listeners to keep their eyes on the speaker, using face and body to show respect for the speaker’s thinking.

In class today, I discussed the difference between critiquing the ideas and critiquing the person — that we aren’t here to agree thoughtlessly with everything anyone says, but to discuss (and possibly disagree with) ideas while respecting people as thinkers.

I asked students to show me, with their body and face, what it looks like if you do and do not respect someone.  Here’s what they did.

How to Show Disrespect and Inattention

  • Chat to each other
  • Take out your phone
  • Put your head down on desk
  • Face palm (or worse… DOUBLE face palm!)
  • Hide your eyes or look away

How to Show Respect and Full Attention

  • Eyes on speaker
  • Take notes
  • Smile
  • Ask questions
  • Add comments
  • Back and forth conversation, and (perhaps surprisingly)
  • Use friendly humour

I challenged us to use these techniques to convey our attention and respect as students presented their research.  So far conversations are lively: lots of questions, people are chiming in with supporting evidence, and wondering aloud.  They also joked and let their imagination run a bit with metaphors and analogies.  Sometimes the students asked me to summarize or synthesize if their lines of thought appeared to conflict, but mostly my role was to draw attention to positive moves like using diagrams or physically acting out electrical phenomena with their bodies, and to close the questions so that all groups would have time to present.

Improve Next Time

When someone asks a question that goes beyond the source, presenters often start presenting a new idea that seems plausible as if it were supported by their research.  How do I help the presenter and the listeners distinguish the presenter’s wondering or remembering from the source’s information?

How are these students thinking about causality?

What should I ask next?

“Electrical charge is caused due to the movement of electrons from atom to atom.”

“The appearance and properties of atoms are changed cause protons are added or removed from it.”

“Atoms are the basic building block of matter because all matter contains atoms.”

“Atoms are electrons, protons, and neutrons and are bound together by magnetic forces.”

“Electrons excess makes charge negative, while protons excess makes charge positive.  Why are these the charges?”

“Electrons cancel out protons because of the protons’ positive charge.”

“Electrons likely move so slow due to the difficulty of exerting force on them.”

“Electrons in motion cause excess energy called tails.”

“When electrons are further away it causes them to have higher energy levels.”

“The positive parts ‘want’ electrons because they are oppositely charged and so they are attracted to each other.”

“A photon absorbed by an electron causes it to escape from the atom.”

“What causes charge to never be created or destroyed?”

As the year winds down, I’m starting to pull out some specific ideas that I want to work on over the summer/next year.  The one that presses on me the most is “readiness.”  In other words,

  • What is absolutely non-negotiable that my students should be able to do or understand when they graduate?
  • How do I make sure they get the greatest opportunity to learn those things?
  • How do I make sure no one graduates without those things?  And most frustratingly,
  • How do I reconcile the student-directedness of inquiry learning with the requirements of my diploma?

Some people might disagree that some of these points are worth worrying about.  If you don’t teach in a trade school, these questions may be irrelevant or downright harmful.  K-12 education should not be a trade school.  Universities do not necessarily need to be trade schools (although arguably, the professional schools like medicine and law really are, and ought to be).  However, I DO teach in a trade school, so these are the questions that matter to me.

Training that intends to help you get a job is only one kind of learning, but it is a valid and important kind of learning for those who choose it.  It requires as much rigour and critical thinking as anything else, which becomes clear when we consider the faith we place in the electronics technicians who service elevators and aircraft. If my students are inadequately prepared in their basic skills, they (or someone else, or many other people) may be injured or die. Therefore, I will have no truck with the intellectual gentrification that treats “vocational” as a dirty word. Whether students are prepared for their jobs is a question of the highest importance to me.

In that light, my questions about job-readiness have reached the point of obsession.  To be a technician is to inquire.  It is to search, to question, to notice inconsistencies, to distinguish between conditions that can and cannot possibly be the cause of particular faults.  However, teaching my students to inquire means they must inquire.  I can’t force that to happen at a particular speed (although I can cut it short, or offer fewer opportunities, etc.).  At the same time, I have given my word that if they give me two years of their time, they will have skills X, Y, and Z that are required to be ready for their jobs.  I haven’t found the balance yet.

I’ll probably write more about this as I try to figure it out.  In the meantime, Grant Wiggins is writing about a recent study that found a dramatic difference between high-school teachers’ assessment of students’ college readiness, and college profs’ assessment of the same thing.  Wiggins directs an interesting challenge to teachers: accurately assess whether students are ready for what’s next, by calibrating our judgement against the judgement of “whatever’s next.”  In other words, high school teachers should be able to predict what fraction of their students are adequately prepared for college, and that number should agree reasonably well with the number given by college profs who are asked the same question.  In my case, I should be able to predict how well prepared my students are for their jobs, and my assessment should match reasonably the judgement of their first employer.

In many ways I’m lucky: we have a Program Advisory Group made up of employer representatives who meet to let us know what they need. My colleagues and I have all worked between 15 and 25 years in our field. I send all my students on 5-week unpaid work terms.  During and after the work terms, I meet with the student and the employer, and get a chance to calibrate my judgement.  There’s no question that this is a coarse metric; the reviews are influenced by how well the student is suited to the culture of a particular employer, and their level of readiness in the telecom field might be much higher than if they worked on motor controls.  Sometimes employers’ expectations are unreasonably high (like expecting electronics techs to also be mechanics).  There are some things employers may or may not expect that I am adamant about (for example, that students have the confidence and skill to respond to sexist or racist comments).    But overall, it’s a really useful experience.

Still, I continue to wonder about the accuracy of my judgement.  I also wonder about how to open this conversation with my colleagues.  It seems like something it would be useful to work on together.  Or would it?  The comments on Wiggins’ post are almost as interesting as the post itself.

It seems relevant that most commenters are responding to the problem of students’ preparedness for college, while Wiggins is writing about a separate problem: teachers’ unfounded level of confidence about students’ preparedness for college.

The question isn’t “why aren’t students prepared for college?”  It’s also not “are college profs’ expectations reasonable?”  It’s “why are we so mistaken about what college instructors expect?”

My students, too, often miss this kind of subtle distinction.  It seems that our students aren’t the only ones who suffer from difficulty with close reading (especially when stressed and overwhelmed).

Wiggins calls on teachers to be more accurate in our assessment, and to calibrate our assessment of college-readiness against actual college requirements. I think these are fair expectations.  Unfortunately, assessment of students’ college-readiness (or job-readiness) is at least partly an assessment of ourselves and our teaching.

A similar problem is reported about college instructors.  The study was conducted by the Foundation for Critical Thinking with both education faculty and subject-matter faculty who instruct teacher candidates. They write that many profs are certain that their students are leaving with critical thinking skills, but that most of those same profs could not clearly explain what they meant by critical thinking, or give concrete examples of how they taught it.

Self-assessment is surprisingly intractable; it can be uncomfortable and can elicit self-doubt and anxiety.  My students, when I expect them to assess their work against specific criteria, exhibit all the same anger, defensiveness, and desire to change the subject as seen in the comments.  Most of them literally can’t do it at first.  It takes several drafts and lots of trust that they will not be “punished” for admitting to imperfection.  Carol Dweck’s work on “growth mindset” comes to mind here… is our collective fear of admitting that we have room to grow a consequence of “fixed mindset”?  If so, what is contributing to it? In that light, the punitive aspects of NCLB (in the US) or similar systemic teacher blaming, isolation, and lack of integrated professional development may in fact be contributing to the mis-assessment reported in the study, simply by creating lots of fear and few “sandboxes” of opportunity for development and low-risk failure.  As for the question of whether education schools are providing enough access to those experiences, it’s worth taking a look at David Labaree’s “The Trouble with Ed School.”

One way to increase our resilience during self-assessment is to do it with the support of a trusted community — something many teachers don’t have.  For those of us who don’t, let’s brainstorm about how we can get it, or what else might help.  Inaccurate self-assessment is understandable but not something we can afford to give up trying to improve.

I’m interested in commenter I Hodge’s point about the survey questions.  The reading comprehension question allowed teachers to respond that “about half,” “more than half,” or “all, or nearly all” of their students had an adequate level of reading comprehension.  In contrast, the college-readiness question seems to have required a teacher to select whether their students were “well,” “very well,” “poorly,” or “very poorly” prepared.  This question has no reasonable answer, even if teachers are only considering the fraction of students who actually do make it to college.  I wonder why they posed those two questions so differently?

Last but not least, I was surprised that some people blamed college admissions departments for the admission of underprepared students.  Maybe it’s different in the US, but my experience here in Canada is that admission is based on having graduated high school, or having gotten a particular score in certain high school courses.  Whether under-prepared students got those scores because teachers under-estimated the level of preparation needed for college, or because of rigid standards or standardized tests or other systemic problems, that’s not something colleges can fix, other than by administering an entrance test.  Maybe that’s more common than I know, but neither the school at which I teach nor the well-reputed university that I (briefly) attended had one.  Maybe using a high school diploma as the entrance exam for college/university puts conflicting requirements on the K-12 system?  I really don’t know the answer to this.

Wiggins recommends regularly bringing together high-school and college faculty to discuss these issues.  I know I’d be all for it.  There is surely some skill-sharing that could go back and forth, as well as discussions of what would help students succeed in college.  Are we ready for this?

When we start investigating a new topic or component, I often ask students to make inferences or ask questions by applying our existing model to the new idea.  For example, after introducing an inductor as a length of coiled wire and taking some measurements, I expect students to infer that the inductor has very little voltage across it because wires typically have low resistance.  However, for every new topic, some students will assume that their current knowledge doesn’t relate to the new idea at all.  Although the model is full of ideas about voltage and current and resistance and wires, “the model doesn’t have anything in it about inductors.”
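The inference I’m hoping for can be sketched with Ohm’s law.  The numbers below are illustrative assumptions of mine, not measurements from the course:

```python
# Treating the inductor as ordinary wire (DC steady state), Ohm's law
# predicts a tiny voltage drop across it.  Values are assumed for illustration.
R_coil = 0.2          # ohms: resistance of the coiled wire (assumed)
I = 0.05              # amps: circuit current (assumed)
V_coil = I * R_coil   # Ohm's law: V = I * R
print(round(V_coil, 3))  # prints 0.01 -- negligible next to a multi-volt supply
```

The point isn’t the particular numbers; it’s that the existing model (low wire resistance implies low voltage drop) already makes a prediction about the new component.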

There are a few catchphrases that damage my calm, and this is one of them.  I was discussing it with my partner’s daughter, who’s a senior in high school, and often able to provide insight into my students’ thinking.  I was complaining that students seem to treat the model (of circuit behaviour knowledge we’ve acquired so far) like their baby, fiercely defending it against all “threats,” and that I was trying to convince them to have some distance, to allow for the possibility that we might have to change the model based on new information, and not to take it so personally.  She had a better idea: that they should indeed continue to treat the model like a baby — a baby who will grow and change and isn’t achieving its maximum potential with helicopter parents hovering around preventing it from trying anything new.

The next time I heard the offending phrase, I was ready with “How do you expect a baby model to grow up into a big strong model, unless you feed it lots of nutritious new experiences?”

It worked.  The students laughed and relaxed a bit.  They also started extending their existing knowledge.  And I relaxed too — secure in the knowledge that I was ready for the next opportunity to talk about “growth mindset for the model.”

My students use the same assessment rubric for practically every new source of information we encounter, whether it’s something they read in a book, data they collected, or information I present directly.  It asks them to summarize, relate to their experience, ask questions, explain what the author claims is the cause, and give support using existing ideas from the model.  The current version looks like this (click through to zoom or download):

Assessment for Learning

There are two goals:

  • to assess the author’s reasoning, and help us decide whether to accept their proposal
  • to assess one’s own understanding

If you can’t fill it in, you probably didn’t understand it.  Maybe you weren’t reading carefully, maybe it’s so poorly reasoned or written that it’s not actually understandable, or maybe you don’t have the background knowledge to digest it.  All of these conditions are important to flag, and this tool helps us do that.

The title says “Rubric for Assessing Reasoning,” but we just call them “feedbacks.”

Recently, there has been a spate of feedbacks turned in with the cause and/or the “support from the model” section left blank or filled with vague truisms (“this is supported by lots of ideas about atoms,” or “I’m looking forward to learning more about what causes this”).

I knew the students could do better — all of them have written strong statements about cause in the past (in chains of cause and effect 2-5 steps long).  I also allow students to write a question about cause, instead of a statement, if they can’t tell what the cause is, or if they think the author hasn’t included it.

So today, after I presented my second draft of some information about RMS measurements, I showed some typical examples of causal statements and supporting ideas.  I asked students to rate them according to their significance to the question at hand, then had some small group discussions.  I was intrigued (and occasionally surprised) by their criteria for what makes a good statement of cause, and what makes a good supporting idea.  Here’s the handout I used to scaffold the discussions.

The students’ results:

A statement of cause should …

  • Be relevant to the question
  • Help us understand the question or the answer
  • Not leave questions unanswered
  • Give lots of info
  • Relate to the model
  • Explain what physically makes something happen, or ask a question that would help you understand the physical cause
  • Help you distinguish between similar things (like the difference between Vpk, Vpp, Vrms)
  • Not beg the question (not state the same thing twice using different words)
  • Be concrete
  • Make the new ideas easier to accept
  • Use definitions

Well, I was looking for an excuse to talk about definitions — I think this is it!
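Since the Vpk/Vpp/Vrms distinction is exactly the kind of “similar things” that definitions should separate, here is a minimal sketch of those definitions for a pure sine wave (the peak value below is an arbitrary illustration, not a number from the class):

```python
import math

# Hypothetical peak voltage for illustration only.
v_pk = 10.0                  # peak: the largest instantaneous value
v_pp = 2 * v_pk              # peak-to-peak: from the top of the wave to the bottom
v_rms = v_pk / math.sqrt(2)  # RMS: the DC voltage with the same heating effect

print(v_pp, round(v_rms, 2))  # 20.0 7.07
```

The divide-by-√2 relationship only holds for sine waves; for other waveforms, RMS has to be computed from its definition (the square root of the mean of the squared signal).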

Supporting ideas from the model should…

  • Help clarify how the electrons work
  • Help answer or clarify the question
  • Directly involve information to help relate ideas
  • Help us see what is going on
  • Give us reasoning so we can in turn have an explanation
  • Clarify misunderstandings
  • Allow you to generalize
  • Support the cause, specifically.
  • Be specific to the topic, not broad (like, “atoms are made of protons, electrons, and neutrons.”)
  • Not use a formula
  • It helps if you understand what’s going on, it makes it easier to find connections

The Last Word

Which ones would you emphasize? What would you add?

How can I help students make causal thinking a habit?  I’ve written before about my struggles helping students “do cause” consistently, and distinguishing between “what made it happen” vs. “what made me think it would happen.”  Most recently, I wrote about how using a biological model of the growing brain might help develop the skills needed to talk about a physical model of atomic particles.

Sweng1948 commented that cause and definition become easy to distinguish when we talk about pregnancy, and seemed a little concerned that it would come off as flippant.  To me, it doesn’t — especially because I use that example all the time. Specifically, I talk about the difference between “who/what you are” (the definition of you) and “what caused you” (a meeting of sperm and egg).  In the systems of belief that my students tend to have, people are not thought to “just happen” or “cause themselves.”  It can help open the conversation.  However, even when I do this, they are surprisingly unlikely to transfer that concept to atomic particles.

Biology Vs. Physics

My students seem to regard cause differently in biology vs. physics.  They are likely to say that eating poorly causes malnutrition and eating well contributes to causing good health; they are less likely to say that the negative charge of electrons causes them to move apart, and more likely to say that electrons move apart because they’re electrons, and that’s what electrons do.

Further, once they conclude that moving two electrons apart causes their repulsion to weaken, they are unable to decide whether moving them closer together strengthens it (I have no idea what to do about this).  It’s also often opaque to students whether one electron is repelling the other, or the second one is repelling the first. This happens in various contexts: the other day, a student presented the idea that cooling a battery would lower its voltage.  Several students were frustrated because they had asked what would raise a battery’s voltage, not what would lower it, and were a bit aggressive in telling the presenting student that he had not answered their question.
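For reference, the physics the students are wrestling with is Coulomb’s law, F = kq₁q₂/r².  A small sketch (standard constants, but hypothetical separations) shows both points at once: moving the electrons closer together does strengthen the repulsion, and the force on each electron is the same size, so “which one is repelling which” is a symmetric question:

```python
K = 8.99e9            # Coulomb constant, N*m^2/C^2
E_CHARGE = 1.602e-19  # elementary charge, C

def coulomb_force(r):
    """Magnitude (N) of the force EACH electron feels at separation r (metres).

    By Newton's third law, both electrons feel this same magnitude.
    """
    return K * E_CHARGE**2 / r**2

far = coulomb_force(2e-10)   # two electrons 0.2 nm apart (hypothetical)
near = coulomb_force(1e-10)  # the same electrons at half the separation
print(near / far)  # ≈ 4.0: halving the distance quadruples the repulsion
```

The inverse-square form makes the “closer means stronger” conclusion mechanical rather than a judgment call, which is part of what a statement of cause grounded in the model buys you.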

That’s one of the reasons I was interested in using this “brain” model as a way to open the conversation about causality and models in general; they do cause better with biology.  I’ll have to figure out next year how to build a bridge between cells and atoms…

I’m not sure why it’s so difficult.  Here are a few stabs at it:

  1. Is it because they see causality as connected to intention — in other words, you are only causing things if you do them on purpose?
  2. Does their experience of their own conscious agency help them see how their choices are causes that have demonstrable effects — such that things that don’t have choices also seem not to cause things?
  3. Is it because living things are easier to see and relate to than electrons?
  4. Is it because they see cause as inextricably linked to desire?   Something like, “What caused me to buy a bag of candy is that I wanted it. So, electrons must move because they want to.”

I sometimes fool myself into thinking that my students have understood some underlying principle when they anthropomorphize particles and forces: “The electron wants to move toward the proton.” “Voltage is when an electron is ready to move to another atom.”  I assume that they are constructing a metaphor to symbolize what’s going on, or using a verbal shorthand.  Then I realize, many students don’t think of the electron’s “desire” as a metaphor, and can’t connect this to ideas about energy, charge, etc. Consider this my plea to K-12 teachers not to say that stuff, and when students bring it up, to engage with them about what exactly that means.  Desires are things we can use willpower to sublimate.  Forces, not so much.  That’s why it’s called force.

Something about cause leads to students treating particles (and, for that matter, compilers and microprocessors) as if they, like people, might act the way we expect, but they also might not.  I can’t tell whether it’s because there could be an opposing force, or “just because.”  If it’s the former, then there’s a kernel of intellectual humility here that I respect: a sort of submission to the possibility that there are forces we don’t understand, and our model will only work if there are no opposing forces we haven’t accounted for.  However, I often can’t find out whether they’re talking/thinking about science or faith, because the responses to my questions are often defensive, along the lines of “My physics teacher said it’s complicated.  The reason they didn’t teach it to us in high school is that it’s just too hard for anyone to learn, unless they’re a theoretical physicist.” (*sigh*. Hoping the growth-mindset ideas will help with this).

We Can’t Understand It Fully, So There’s No Point

Also, the “we don’t understand it fully” shrug seems to be anti-generative: it leads to an intellectual abdication.  It’s a defence against the idea that we should just go ahead and use our model to make predictions, then test the predictions to find the holes in the model.  Or maybe I’ve got it backwards — maybe the intellectual abdication causes the shrug.  I’m back to growth mindset again, but not about growing ourselves — growth mindset for the model too!  Fixed mindset says there’s no point making a prediction that might be wrong.  Only a growth mindset sees the value in testing a prediction with the intention of helping the model (and ourselves) get stronger.

I expect that the word “potential” is part of the problem here (as in, potential difference and potential energy) — to my students, “potential” means something that you need to make a decision about. They say that they will “potentially” go to the movies that night, which means they haven’t chosen yet.  By that logic, if you have a “potential difference”, that means there might be a difference, but there might not, too. Depending on what the electron decides.  Potential energy?  Maybe you’ve got (or will later have) energy, maybe you don’t.  What’s strong about this thinking is that they’re right that there’s something that “might or might not” happen (current, acceleration, etc.).  What’s frustrating is that I don’t know how to help them unpack the difference between a “force” and a “decision” in a way that actually helps.
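One wedge into the “maybe there’s a difference, maybe there isn’t” reading: a potential difference is a definite, measurable number — energy per unit charge — whether or not anything moves.  A tiny numeric sketch (illustrative values, not from the post):

```python
E_CHARGE = 1.602e-19  # elementary charge, C

def energy_gained(charge_C, potential_difference_V):
    """Energy (J) delivered to a charge crossing a potential difference: W = qV."""
    return charge_C * potential_difference_V

# One electron crossing the terminals of a 1.5 V cell.
w = energy_gained(E_CHARGE, 1.5)
print(w)  # ≈ 2.4e-19 J, whether or not any current happens to flow
```

The “might or might not” part is the current, not the potential difference — the 1.5 V is there, doing nothing, until a path exists.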

(And no, the connections to the uncertainty principle, the observer effect, the unpredictability of chaotic systems, and the challenges to causality posed by modern physics are not lost on me… but I’d rather my students work through “wrong” conclusions via confidence in reasoning, than come to some shadow of the “right” conclusions via an assumption of their own intellectual inadequacy.)
