
I wrote last month about new approaches I'm using to find out what students think, keep track of who thinks what, and let the curriculum be guided by student curiosity. When Dan Meyer reblogged it recently, an interesting conversation started in the comments on that site.  The question seems to be, "how is this different from common practices?"  It sparked my thinking, so I thought I'd continue here.  If you're a new reader, welcome.

Just Formative Assessment?

It may be helpful to know that I’m teaching a community college course on the basics of electricity.  The students come in brimming with questions, assumptions, and ideas about how electricity works in their lives — phone chargers, car batteries, electric fences, solar panels.  And all new knowledge gets judged immediately in that court of everyday life. What I’m trying to do better is to discover students’ pre-existing ideas and questions, especially the ones I wouldn’t have anticipated.

I agree that there is a way in which this is nothing new; in a way, it’s the definition of formative assessment.

Many formative assessments inquire into students’ thinking as a T/F question: did they get it, yes or no?  Others ask the question as if it’s multiple choice: are their ideas about motion Aristotelian, Newtonian, or something else? (See Hestenes’ work leading to the Force Concept Inventory).  Some assessments focus on misconceptions: which of these mistaken ways of thinking are causing their problems?  Typically there is some instruction or exercise or activity, and then we try to find out what they got out of it.  Or maybe it’s a pre-assessment, and we use the information to address and correct misconceptions.

I'm trying to shift to essay questions:  Not "Do they think correctly" but "What do they think?"  I'm trying to shift it to a different domain: not "what do they think about how this topic was just taught in this class" but "what have they ever thought about this topic, in all the parts of their lives, and how can we weave them together?"  I also hope to ask it for a different reason: not just, "which parts of their ideas are correct" but also "which parts of their pre-existing ideas are most likely to lead to insight or perplexity?"

As Dan points out, there is a “part 2”: This isn’t just about shifting what I do (keep a spreadsheet where I record student ideas and questions, tagged by topic and activity they were working on when they asked it).  It’s also about shifting my self-assessment.  The best activities aren’t just the ones that help students solve problems; the best assessments yield the most honest student thinking.

Which of the activities in your curriculum would you rank highest on that scale?

What do you think makes them work?

Pros: Student Honesty and Motivation

This year, I've got a better handle not only on who holds which ideas, but also on which ideas are half-digested (applied inconsistently or in self-contradictory ways), and on what the students are curious about.

The flashlights you shake to charge — do they work like how friction can transfer electrons from a cat’s fur to a glass rod?

What happens if you try to charge a battery but the volts are lower than the battery you’re trying to charge?

Batteries don't get heavier when you charge them — that is evidence that electrons don't weigh much.

For example, if I were looking for a way into the superposition theorem, I couldn't ask for better than this.

Cons: Fear and Conflict

I've written extensively about the fear, anger, conflict, and defensiveness that come to the surface when I encourage students to build a practice of constant re-evaluation rather than certainty.   What are your suggestions for helping students re-evaluate something when they're sure they already know it? What are your suggestions for helping students notice when common-sense preconceptions and new ideas aren't talking to each other?

Bonus points: what are your suggestions for helping teachers re-evaluate something when we're sure we already know it?  What about for helping teachers notice when our common-sense preconceptions and new ideas aren't talking to each other?

Why Am I Obsessed With This?

This is the fear that keeps me awake at night:

The students in the first example had learned in class not to discuss certain aspects of their own ideas or models. In particular, they had learned not to talk about “What things are like?” …

The students in my second and third examples had learned that their ideas were worthless (and confusing to think about).

The problem with (some) guided inquiry like this is the illusion of learning. Instructors doing these kinds of “check outs” can convince themselves that students are building powerful scientific models, but really students are just learning not to share any ideas that might be wrong, not to have conversations that they aren’t supposed to have, and to hide interesting questions and insights that are outside the bounds of the “guided curriculum”.

At the end of the day, if students are learning to avoid taking intellectual risks around the instructor, that instructor doesn’t stand a chance of helping those students learn.

(Read the whole thing from Brian Frank)

Which kinds of assessments do you think discourage students from taking intellectual risks around the instructor?  My gut feeling is that anything along the lines of “elicit-confront-resolve” is a major contributor, but I hope that having more data to look at will help me confirm this.

Pros: I Get Honest and Motivated Too

To be clear, I’m not suggesting that no one else has ever done this.  It’s common to ask students “how were you thinking it through”, such as when discussing a mistake they made on a test.

I don't want to just do it, though.  I want to do it better than I did last year.  I want to systematically keep track of student ideas and, together with the students, use those ideas to co-create the curriculum. Even the wrong ideas.  Especially the wrong ideas.  I want them to see what's good in their well-thought-out, evidence-based "wrong" answers, and what's weak about poorly-thought-out, unsubstantiated "right" answers.  I want them to do the same for the ideas of their classmates, especially the ideas they don't share.

It means that sometimes we go learn about a different topic.  If they're generating curiosity and insight about parallel circuits, I'm not going to force them to shift to series circuits.  Forcing the switch wastes momentum (not to mention goodwill… or what you might call "engagement" or "motivation").  They know what the goal of the course is; they've paid good money and invested their time in reaching that goal.  Together we come up with a plan for what makes sense to learn about next, so that we keep moving toward the goal.

Want to help me improve?  Here's the help I could really use.   If you were one of the people whose first reaction to my original post was "I already know that" — either "I already know that to be true," or "I already know that to be false"… what would have helped you respond with curiosity and perplexity, adding your idea as a valuable one of many?  And if you did respond that way, what made it work?

Creating a classroom culture of inquiry is getting better and better every September in most ways. It's especially working well to reassure the students with little previous physics experience, to excite the students with previous unpleasant experiences with physics, to challenge the students who found previous physics classes boring or stifling, and to empower students who've been marginalized by schooling in general. But one thing I'm still struggling with is responding well to the students who have been taught to uncritically regurgitate correct answers — and who've embraced it.

How do I get curious about their ideas?  My conflict mediation coach suggests finding out what need that meets, what they got from that experience that they’re not getting elsewhere.  I confess that I’m afraid to find out.  I’m also afraid of the effect they have on the other students.  Their dismissive insistence that other people’s theories are “wrong” can quickly undo weeks of carefully cultivating a spirit of exploring and evaluating the evidence ourselves; their pat answers to other people’s questions make it seem like it’s stupid to be curious at all.

I have a bunch of options here… one is an activity called "Thinking Like a Technician" where I introduce the idea that "believing" is different from provisionally accepting the theory best supported by the current evidence.  I show the Wikipedia page for atomic theory to draw out the idea that there are many models of the atom, that all of them are a little bit wrong, and that our job is to choose which one we need for which situation, rather than to figure out which one is right.  That seems to help a bit, and gives us some points of reference to come back to later.

I show a video of Malcolm Longair and Michio Kaku explaining that atoms are made mostly of nothingness.  But I think it makes things worse.  The students who are excited get more excited; the ones who feel like I'm threatening the authority of the high school physics teachers they idolize get even angrier.  For the rest of the class, it's wonderful — but for this subset, it's uncomfortably close to Elicit-Confront-Resolve.  They experience it as a form of "expose-and-shame", and unsurprisingly retaliate.  If they can't find some idea of mine to expose and shame, they'll turn on the other students.

Something I’m trying to improve: How do I help students re-evaluate things that seem solid?  It’s not just that they respond with defensiveness; they also tend to see the whole exercise of inquiry (or, as some people call it, “science”) as a waste of time.  What could make it worth re-examining the evidence when you’re that sure?


My definition of “inquiry” as an educational method: it’s the students’ job to inquire into the material, and while they do that, it’s my job to inquire into their thinking.

So yes, the goal is really “inquiry-based learning”.  I’ve written lots before about what the students do.  But this post is about what I do. I have to inquire at least as much as the students do.

I’ve written that before, more than once… but do you think I can find it on my own blog?  Nope.  Also, I stole it originally, probably from Brian Frank.  Do you think I can find it on his blog?  *sigh*  If anyone finds it, in either place, let me know, would ya?

What’s new about my ability to inquire into my students’ thinking is that I’m treating it more like a qualitative research project.  Someday I’ll go take a qualitative methods course and actually know what I’m talking about (I’m taking suggestions for methods texts, courses you liked, or profs you respect)… but until then, I’m muddling through better than usual.

Activities That Help Me Inquire Into Student Thinking

Playdough circuit (published by Science Buddies)

We spend the first week on activities designed to let students play with their current ideas, and to let me find out what those ideas are.  In the past I set out piles of AA batteries, light bulbs, sockets, and switches.  I'd ask students to build a circuit that worked, one that looked like it should work but didn't, and a third one of any description.  Students drew their circuits on paper and wrote down what they noticed, as well as what they wondered (props to Brian again for the wording of the prompt, which helps break down the fear induced by writing the "wrong" thing in a lab report "observation" section).  The noticing and wondering helps me learn a lot about their ideas.

This year I added a day before light bulbs where they made circuits out of playdough.  It was silly, messy, and fun.  It also yielded lots of new info about their thinking about electrons, voltage, current, charge, etc., which I asked them to record on this handout.


Record-Keeping

Whatever they write down ends up in a spreadsheet that looks like this:

2015 Intake ideas so far | Name | Date | Context | V | R | I | P | C | Energy | Potential
voltage is potential difference (amount of potential energy between points) | XXXXXX | 09-Sep-15 | Squishy Circuits | x | | | | | x | x
Insulators stop energy from passing through | XXXXXX | 09-Sep-15 | Squishy Circuits | | | | | | x |
Conductors allow the transfer of energy | XXXXXX | 09-Sep-15 | Squishy Circuits | | | | | | x |


I just keep adding tag columns on the right for whatever topics I need to track.  That way I can sort by topic, by date, or by student.  It also helps me see which activities yielded which kinds of curiosity.
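If you'd rather script the log than spreadsheet it, here's a minimal sketch of the same structure in Python with pandas (the column and tag names are just the ones I happen to use; nothing canonical about them):

```python
# A minimal sketch of the idea log as a script instead of a spreadsheet.
# Column and tag names are illustrative, not canonical.
import pandas as pd

ideas = pd.DataFrame([
    {"idea": "voltage is potential difference between points",
     "name": "XXXXXX", "date": "2015-09-09",
     "context": "Squishy Circuits", "tags": ["V", "Energy", "Potential"]},
    {"idea": "insulators stop energy from passing through",
     "name": "XXXXXX", "date": "2015-09-09",
     "context": "Squishy Circuits", "tags": ["Energy"]},
])

# Sort by student and date -- same as sorting the spreadsheet columns.
by_student = ideas.sort_values(["name", "date"])

# Filter to one topic tag to see which activities yielded which curiosity.
energy_ideas = ideas[ideas["tags"].apply(lambda tags: "Energy" in tags)]
print(energy_ideas[["idea", "context", "date"]])
```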

My Favourite Ideas So Far

What holds matter together?

Are electrons what power actually is?

Batteries in a row must connect to each other like how magnets connect together to attract each other (2 negatives connected doesn’t work)

Closing the switch should double the power supply, but there was no noticeable difference. Why?

When negative side of battery reaches positive side of other battery, shouldn’t it be a complete circuit?

Put the switch on the other side of the bulb.  Does it matter?

Why did the 2 dim lights light at all, when the path of least resistance was through the 1 light bulb path?  In my “double the wires” circuit, they didn’t light at all.

Why don’t any of the bulbs turn on?  I would have thought that at least the first bulb would faintly glow.

Resistance is how much current is lost in the current

What separates Watts from Volts?

If I Inquire Into My Own Thinking…

What’s the pattern here about which ideas are exciting to me?  Well, quite a few of them are challenges to common misconceptions.  Despite my resistance, it seems I’ve still got a bit of a case of misconception listening.

The other pattern is that they all point either to questioning cause, or improving precision.  Those are discipline-specific skills, part of the "implicit curriculum" that people in my field often think of as unlearnable "aptitudes" instead of skills.  So there's a practice of inclusion underlying my choices — making these skills explicit benefits everyone but especially the people with little previous exposure to technical fields.  Cause and precision are also things that I personally find satisfying and beautiful.  No coincidence about the overlap — I chose my field for a reason.  I'll have to be careful to encourage curiosity wherever I find it, not just in the students who ask the kinds of questions I like best.

I’ve done a better job of launching our inquiry into electricity than I did last year.  The key was talking about atoms (which leads to thoughts of electrons), not electricity (which leads to thoughts of how to give someone else an electric shock from an electric fence, lightning, and stories students have heard about death by electrocution).

The task was simple: “Go learn something about electrons, about atoms, and about electrical charge.  For each topic, use at least one quote from the textbook, one online source, and one of your choice.  Record them on our standard evidence sheets — you’ll need 9 in total.  You have two hours.  Go.”

I’ve used the results of that 2-hour period to generate all kinds of activities, including

  • group discussions
  • whiteboarding sessions
  • skills for note-taking
  • what to do when your evidence conflicts
  • how to decide whether to accept a new idea

We practiced all the basic critical thinking skills I hope to use throughout the semester:

  • summarizing
  • asking questions about something even before you fully understand it
  • identifying cause and effect
  • getting used to saying “I don’t know”
  • connecting in-school-knowledge to outside-school experiences
  • distinguishing one’s own ideas from a teacher’s or an author’s

I’m really excited about the things the students have gotten curious about so far.

“When an electron jumps from one atom to the next, why does that cause an electric current instead of a chemical reaction?”

“When an electron becomes a free electron, where does it go?  Does it always attach to another atom?  Does it hang out in space?  Can it just stay free forever?”

“What makes electrons negative?  Could we change them to positive?”

“Are protons the same in iron as they are in oxygen?  How is it possible that protons, if they are all the same, just by having more or fewer of them, make the difference between iron and oxygen?”

“If we run out of an element, say lithium, is there a way to make more?”

“Why does the light come on right away if it takes so long for electrons to move down the wire?”

“What’s happening when you turn off the lights?  Where do the electrons go?  Why do they stop moving?”

“What’s happening when you turn on the light?  Something has to happen to push that electron.  Is there a new electron in the system?”

“With protons repelling each other and being attracted to electrons, what keeps the nucleus from falling apart?”

“What happens if you somehow hold protons and electrons apart?”

“Would there be no gravity in that empty space in the atom?  I like how physics are the same when comparing a tiny atom and a giant universe.”

When we start investigating a new topic or component, I often ask students to make inferences or ask questions by applying our existing model to the new idea.  For example, after introducing an inductor as a length of coiled wire and taking some measurements, I expect students to infer that the inductor has very little voltage across it because wires typically have low resistance.  However, for every new topic, some students will assume that their current knowledge doesn’t relate to the new idea at all.  Although the model is full of ideas about voltage and current and resistance and wires, “the model doesn’t have anything in it about inductors.”
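For concreteness, the inference I'm hoping for is nothing fancier than Ohm's law applied to a small resistance. A quick sketch with made-up numbers (the resistance and current values are just illustrative assumptions):

```python
# The inference I'm fishing for, with made-up numbers: a coiled wire is
# still just wire, so its DC resistance is small, and Ohm's law predicts
# a correspondingly small voltage across it.
coil_resistance = 0.5     # ohms -- assumed, typical-ish for a small coil
circuit_current = 0.020   # amps -- assumed measured value (20 mA)

v_coil = circuit_current * coil_resistance   # V = I * R
print(f"Predicted voltage across the coil: {v_coil * 1000:.0f} mV")
# ~10 mV: tiny next to the supply voltage, which is exactly the point.
```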

There are a few catchphrases that damage my calm, and this is one of them.  I was discussing it with my partner’s daughter, who’s a senior in high school, and often able to provide insight into my students’ thinking.  I was complaining that students seem to treat the model (of circuit behaviour knowledge we’ve acquired so far) like their baby, fiercely defending it against all “threats,” and that I was trying to convince them to have some distance, to allow for the possibility that we might have to change the model based on new information, and not to take it so personally.  She had a better idea: that they should indeed continue to treat the model like a baby — a baby who will grow and change and isn’t achieving its maximum potential with helicopter parents hovering around preventing it from trying anything new.

The next time I heard the offending phrase, I was ready with "How do you expect a baby model to grow up into a big strong model, unless you feed it lots of nutritious new experiences?"

It worked.  The students laughed and relaxed a bit.  They also started extending their existing knowledge.  And I relaxed too — secure in the knowledge that I was ready for the next opportunity to talk about “growth mindset for the model.”

This year I’ve really struggled to get conversation going in class.  I needed some new ways to kick-start the questioning, counter-example-ing, restating, and exploring implications that fuel inquiry-based science.  I suspected students were silent because they were afraid that their peers and/or I would find out what they didn’t know.  I needed a more anonymous way for them to ask questions and offer up ideas.

About that time, I read Mark Guzdial's post about Peer Instruction in Computer Science.  While exploring the resources he recommends, I found this compelling and very short PI teacher cheat sheet. I was already curious because Andy Rundquist and Joss Ives were blogging about interesting ways to use PI, even with small groups.  I hadn't looked into it because, until this year, I'd never been so unsuccessful in fostering discussion.

The cheat-sheet’s clarity and my desperation to increase in-class participation made me think about it differently.  I realized I could adapt some of the techniques, and it worked — I’ve had a several-hundred-percent increase in students asking questions, proposing ideas, and taking part in scientific discourse among themselves.    Caveat: what I’m doing does not follow the research model proposed by PI’s proponents.  It just steals some of their most-easily adopted ideas.

What is Peer Instruction (PI)?

If you’re not familiar with it, the basic idea is that students get the “lecture” before class (via readings, screencasts, etc), then spend class time voting on questions, discussing in small groups, and voting again as their understanding changes.  Wikipedia has a reasonably clear and concise entry on PI, explaining the relationship between Peer Instruction, the “flipped classroom”, and Just-In-Time Teaching.

Why It’s Not Exactly PI

My home-made voting flashcards

  • I don’t have clickers, and don’t have any desire for them.  If needed, I use home-made voting cards instead.  Andy explains how effective that can be.
  • I prefer to use open-ended problems, sometimes even problems the students can’t solve with their current knowledge, rather than multiple-choice questions.  That’s partly because I don’t have time to craft good-quality MC items, partly because I want to make full use of the freedom I have to follow students’ noses about what questions and potential answers are worth investigating.
  • Update (Feb 19): I almost forgot to mention, my classroom is not flipped.  In other words, I don’t rely on before-class readings, screencasts, etc.

What About It is PI-Like?

  1. I start with a question for students to tackle individually.  Instead of multiple-choice, it could be a circuit to analyze, or I might ask them to propose a possible cause for a phenomenon we’ve observed.
  2. I give a limited amount of time for this (maybe 2-3 minutes), and will cut it even shorter if 80% of students finish before the maximum time.
  3. I monitor the answers students come up with individually.  Sometimes I ask for a vote using the flashcards.  Other times I just circulate and look at their papers.
  4. I don’t discuss the answers at that point.  I give them a consistent prompt: “In a moment, not right now but in a moment, you’re going to discuss in groups of 4.  Come to agreement on whatever you can, and formulate questions about whatever you can’t agree on.  You have X minutes.  Go.”
  5. I circulate and listen to conversations, so I can prepare for the kinds of group discussion, direct instruction, or extension questions that might be helpful.
  6. When we’re 30 seconds from the end, or when the conversation starts to die down, I announce “30 more seconds to agree or come up with questions.”
  7. Then, I ask each group to report back.  Usually I collect all the questions first, so that Group B doesn't feel silenced if their question is answered by Group A's consensus. Occasionally I ask for a flashcard vote at this point; more often, I collect answers from each group verbally. I write them on the board — roughly fulfilling the function of "showing the graph" of the clicker results.
  8. If the answers are consistent across the group and nothing needs to be clarified, I might move on to an extension question.  If something does need clarification, I might do some direct instruction.  Either way, I encourage students to engage with the whole group at this point.

Then we’re ready to move on — maybe with another round, maybe with an extension question (the cheat-sheet gives some good multi-purpose prompts, like “What question would make Alternate Answer correct?”).  I’m also a fan of “why would a reasonable person give Alternate Answer?”

Why I Like It

It doesn't require a ton of preparation.  I usually plan the questions I'll use (sometimes based on their pre-class reading, which, in my world, is actually in-class reading…).  But anytime during class that I feel like throwing a question out to the group, I can do it off the cuff if I need to.

During the group discussion phase (Step 4), questions and ideas start flowing and scientific discourse flourishes.  Right in this moment, they’re dying to know what their neighbour got, and enjoy trying to convince each other.  I don’t think I buy the idea that these techniques help because students learn better from each other — frankly, they’re at least as likely to pseudoteach each other as I am.  I suspect that the benefit comes not so much from what they hear from others but from what they formulate for themselves.   I wish students felt comfortable calling that stuff out in a whole group discussion (with 17 of us in the room, it can be done), but they don’t.  So.  I go with what works.

No one outside the small group has to know who asked which questions.  The complete anonymity of clickers isn’t preserved, but that doesn’t seem to be a problem so far.

Notes For Improvement

There are some prompts on the cheat sheet that I could be using a lot more often — especially replacing "What questions do you have?" or "What did you agree on?" with "What did your group talk about?" or "If your group changed its mind, what did you discuss?"

There’s also a helpful “Things Not To Do (that seemed like a good idea at the time)” page that includes my favourite blooper — continuing to talk about the problem after I’ve posed the question.

If I were to add something to the "What Not To Do" list, it would be "shifting/pacing while asking the question and immediately afterwards."  I really need to practice holding still while giving students a task, and then continuing to hold still until they start the task.   My pacing distracts them and slows down how quickly they shift attention to their task; and if I start wandering the room immediately, it creates the impression that they don't have to start working until I get near enough to see their paper.

Overheard while the students discussed the I-V characteristics of light bulbs versus diodes.


Facilitating the process:

What else do we know?

Are we going to analyze predictions and measurements?  Or just measurements?

So forward voltage is one category, reverse is another?

So, what have we concluded so far?

Do we have to write down our data?

I’m going to keep writing down the data.

So basically what you had was…

Were you maybe reading it like…

So what should we put here?


Seeking Causes:

But it wouldn’t be through the LED.  The voltmeter was shorting out the LED.

So they’re about the same, what’s the reason for that?


Holding our thinking to the model:

So this is actually supporting our idea…

One thing I noticed was that as voltage increased, current increased

I thought it always had all the voltage right there.

The current is supposed to go up, according to predictions.


Seeking patterns:

Was VR1 always 0?

So forward voltage is one category, reverse is another?

Do you have the same figures for positive and negative voltage? [Reply] Well, let’s compare.

So they’re about the same, what’s the reason for that?

I think there’s something wrong there.

So we can’t compare these to each other.

What I did was use Ohm’s Law, that you have to do that for each point individually.

I think the resistance will decrease because…

Diodes are crazy!

It probably works like a switch.

The past semester has been a tough slog with my first-year class.   I’m slowly figuring out what resources and approaches were missing.  Last year, I launched myself headfirst (and underprepared) into inquiry-based learning because most of the class members were overflowing with significant, relevant questions.

This year, the students are barely asking questions at all, and when they do, the questions are not very relevant — they don’t help us move forward toward predicting circuit behaviour, troubleshooting, or any of the other expressed goals we’ve discussed as a class. They’re mostly about electrical safety which, don’t get me wrong, is important, but talking about how people do and don’t get electrocuted has limited value in helping us understand amplifiers.  I felt like I juiced those questions as much as I could, but it only led to more questions about house wiring and car chassis.

If I'm serious about inquiry-based learning, I have to develop a set of tools that allow me to adapt to the group.  Right now I feel like my approach only works if the group is already fairly skilled at distinguishing between what we have evidence for and what we just feel like we've heard before, and at asking significant questions that move toward a specific goal.  In other words, I wasn't teaching them to reason scientifically; I was filtering those who already knew how from those who didn't.  Here are some of the things I need to be more prepared for.

Measurement technique

I have never had so much trouble getting students to use their meters correctly.  Here we are in second semester, and I still have students confidently using incorrect settings.  I'd be happier if they were unsure, or had questions, but no, many are not even noticing that they have a problem.  And I don't mean being confused about whether to measure 1.5 V on the 20 V or the 2000 mV setting… I mean measuring 0.1 Ω on the 200 kΩ setting.
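To put numbers on why that reading is meaningless: a range's resolution is roughly full scale divided by the meter's display counts. A minimal sketch, assuming a common 2000-count meter (the count is my assumption; the principle holds for any display):

```python
# Why 0.1 ohms measured on the 200 kOhm setting tells you nothing.
# Assuming a basic 2000-count DMM, resolution = full scale / counts.
full_scale = 200_000   # ohms: the 200 kOhm range
counts = 2000          # assumed display counts for a basic meter

resolution = full_scale / counts
print(f"Smallest step on this range: {resolution:.0f} ohms")
# 100-ohm steps: a 0.1-ohm resistance displays as 0.0 -- pure noise.
```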

I switched this year to teaching them about current first, rather than resistance (like I did last year).  I'm loath to reconsider, because current is the only one that lends itself to causal thinking and sense-making early in the year (try explaining resistance to someone who doesn't know what current is… and "electric potential," to someone who doesn't know anything formal about energy or force or fields, is just hell).  Could this be part of why they're struggling so much to use their meters correctly?  Is there something about the "current first" approach that bogs them down with cognitive load at a stage when they just need some repetitive practice?  I'm curious to check out the CASTLE curriculum, maybe over the summer, to try to figure some of this out.

I created a circuit-recording template last fall that I thought was such a great idea… it had a checklist at the top to help the students notice if they'd forgotten anything.  Guess what?  They started measuring without thinking about the meaning of the measurements — measuring as if it were just something to be checked off a list!  No observations.  No questions. No surprise at unusual or unintuitive numbers.  Damn.  The checklist is gone and never coming back — next year I'll make sure we only measure things that the students have found a reason to measure.

Last term, I waited far too long to give the quiz on measurement technique.  I knew they weren't ready, and I kept thinking that if we spent more time practicing measuring (while exploring the questions we had painstakingly eked out), it would get better.  Finally, we were so far behind that I gave the quiz anyway.  The entire class failed it (not a catastrophe, given the reassessment policy), and the most common comment when we reviewed the quiz was "why didn't you tell us this before??"  Uh.  Right.  Quiz early, quiz often.

Guess what the teacher wants

The degree of "teacher-pleasing" being attempted is disheartening.  Students are almost always uncomfortable making mistakes, using the word "maybe" in situations where it is genuinely the most accurate way to express the strength of our data, or re-evaluating what they think of as "facts."  But this group is unusual: there's a high rate of students anxiously making up preposterous answers rather than saying "I don't know."

I tend toward a pretty aggressive questioning style — the kind of “what causes that, why does that happen” bluntness I would use with colleagues to bat ideas around.  I’ve changed my verbal prompt to “what might cause that?” and “what could possibly be happening” in the hopes that it would help students discern whether they are certain or not, and also help them transition toward communicating the tentativeness of ideas for which we have little evidence.  Obviously, I take care to draw out the reasoning and evidence in support of ideas, regardless of whether they’re canonical or not, and conversely make sure we discuss evidence against all of our ideas, including the “right” ones. I try to honour students’ questions by tracking them and letting them choose from among the class’s questions when deciding what to investigate next.  But valuing their questions and thinking is clearly not enough.

I gave a test question last semester that asked students to evaluate some "student" reasoning.  It used the word "maybe" in a completely appropriate way, and that's what half the class responded to with outrage.  They thought the reasoning was poor (and also reported that it was badly written!) because of it.  Again, we had practiced this explicitly, but sometimes I feel like I'm undermining their faith in "right answer" reasoning without helping them replace it with something better…

On the odd occasion when I ask someone a question and they say “I don’t know,” I make a point of not putting them on the spot, but of gathering info/evidence/ideas from other students for the first student to choose from, or breaking the class into small groups and asking them to discuss.  I try to make sure that the person who said “I don’t know” has as few negative consequences as possible.  Yet the person who says it inevitably looks crestfallen.

Talking in class

The frequency of students speaking up in class is at an all-time low.  I wonder if this has been influenced by my random cold-calling — they figure I’ll call on them eventually so there’s no sense putting their hand up to make a comment or ask a question?  The thing is, they don’t ask those questions when I call on them — just answer the question I ask.

At the same time, the frequency of whispered side conversations is at an all-time high, whether the speaker with the floor is me or another student.  I think I’m unusually sensitive to this — I find it completely distracting, and can barely maintain my train of thought if students are whispering to each other.  Maybe that’s partly my hearing, which is fairly acute — I can actually hear their whole conversation, even if they’re whispering at the back of the room (keep in mind that there are only 17 people and the room is pretty small).  So my standard response to this is one warning during class (followed by a quiet, private conversation after class) — if it happens again, they’re leaving the room.  Is this part of why they’re afraid to talk out loud — because I crack down on the talking under their breath?  I’m open to other ways of responding but out of ideas at the moment.

Causality

Even the strongest students are still having trouble explaining the causes of physical effects.  They know I won't accept a formula as a cause, but they can't explain why, and when I ask someone to explain a cause, they will consistently give a formula anyway (figuring that an answer is always better than no answer, I guess).  Next approaches: asking them to write down the cause, then discuss in groups.

Scientific Discourse

As Jason articulates clearly, I think that my students need more help motivating and strengthening their scientific discourse.  He summarizes a promising-sounding approach called Guided Reciprocal Questioning as follows:

  1. Learn about something.
  2. Provide generic question frames.
  3. Students generate questions individually.
  4. Students discuss the questions in their groups.
  5. Share out.

I do something similar to #1-3, but I'm ready to try #4-5, with appropriate "discussion frames", to see if I can help the students hold each other accountable to their knowledge.  Right now, they barely propose questions or answers, but when they do, the class seems to accept them, even if they contradict something else we just talked about.

Also, Janet Abercrombie wrote recently in the comments about a Question Formulation Technique that I’d like to look into some more.

Conclusion: It works anyway

The whole experience was kind of heart-breaking.  But the conversations with students kept convincing me that I had to do it anyway.  I don’t know how many students took the time to say to me, “whoa, it seems like you actually want us to understand this stuff.”  The look of astonishment really said it all.  The bottom line is, this group is a much better test of the robustness of my methods than last year’s group could be.


Last year, I accidentally fell into an inquiry-driven style of teaching.  This year, I set out to do it on purpose.  Like Brian Frank’s example of students who do worse on motion problems after learning kinematics equations, my performance went down.  Unlike in that example, though, inquiry is a sense-making tool for the teacher, not just the students, so I’m doing more sense-making, not less.  The upshot: my awareness has increased while my performance has decreased.  (The proper spelling is A-N-X-I-E-T-Y).

Things that improved

I added a unit called “Thinking Like a Technician,” where students practice summarizing, clarifying, and identifying physical causes, using everyday examples.  When we got to atomic theory, they were less freaked out by the kind of sense-making I was asking them to do.

I started using a whiteboard template, based on Jason’s writing about Claim-Evidence Reasoning.  Like Jason, I introduced it to students as “Evidence-Claim-Reasoning.”  The increased organization of whiteboards makes things flow more smoothly for whiteboard authors when the discussion happens a few days after the analysis.  The standard layout lowers the cognitive load for students in the audience, since they know what to expect and look for.

The major tactical error I made

Last year I started with magnets and right away focused on students' ideas about how atoms cause magnetic phenomena.  That means that our first area of inquiry was atoms.  This year, I thought I was being smart and started by digging into what students wondered about electricity.  BIG MISTAKE.  Students wonder a lot about electricity — mostly about how you can get electrocuted, or how to give someone else a shock. It was fascinating reading for me, but they have absolutely no tools for making sense of the answers to their questions.  The conventional wisdom about "electrons always seek ground" and "electricity always takes the path of least resistance" doesn't help.  Since they start with neither foundational knowledge about electrons nor measurement technique with a multimeter, their attempts to either research or measure their way toward coherent ideas were random and pretty fruitless. As usual, Brian sums it up — I had backed us into a corner where "this makes no sense and right now we have no tools for making sense of it."

We are finally recovering (about 6 weeks later… *sigh*).  Some useful things got accomplished in the meantime — noticing and measuring the discrepancies between meters, and figuring out some things about batteries along the way (which will help in the next unit).  Note for next time: start with atoms.  Atoms are in concrete things like chairs and sweaters — starting there avoids the jumble of ideas called "electricity" (power/charge/energy/voltage/current/potential/etc.).  Also, give the quiz on meter technique earlier; when I finally gave it, it strengthened understandings that would have been helpful a month ago.

Last thoughts

When engaging in a new strategy (whether for students or me), make sure it has some form of sense-making built-in.

Also, make sure the rest of life is not chaotic and stressful while doing these experiments.  The existential angst can be a bit much.

I’ve been frustrated lately by my lack of focus and difficulty getting things done.  After accidentally venting on my public blog (rather than the private one I intended to use … *sigh*) I realized there were a few factors at play that could shed some light on my students’ experiences.

1 — High stakes can reduce performance.

The beginning of the year feels high-stakes to me because it’s the time when students are forming their first impressions, the time when expectations get set and rapport gets built.   I’m not saying that those things can’t change over the course of the year.  But I think it’s a lot easier to set an initial expectation than to correct it later, especially about my wacky grading system, my insistence that students “not believe their teachers,” and so on.
There are a bunch of fixes for this.  One is to trust that my intro and orientation activities (videos, Marshmallow Challenge, name game, Teacher Skill Sheet, etc.) set good groundwork for productive classroom culture.  These activities are well-defined — I can print out last year’s agenda and have a decent first week, which should lower the stakes on my successive lesson plans.  Another is to document more carefully what I’ve done, so that next year, when I’m going batty with all the beginning of the year logistics, I don’t add lesson planning to the cognitive load.

How this applies to my students: There are lots of situations that they see as high-stakes and in which they underperform (or just procrastinate their way out of).  Tests, scholarship applications, job applications.  Tests are now pretty low-stakes, but it would be great to do the same for job applications, interviews, etc. — maybe by staging a series of “early drafts.”

2 — Success can cause fear of failure

I’m really proud of what my inquiry class accomplished last year.  The same ideas about evaluating claims and making well-supported inferences run through not just the content but the process.  The classroom culture was better than I could have expected.  I want to do the same thing this year.  The only problem is that it caught me so off guard last year that very little is documented (certainly no daily lesson plans or learning activities for the first couple of months — just jot notes of my impressions or student comments).  It’s immobilizing to imagine doing it again without instructions — what if they fail to buy in to the entire inquiry approach?

It feels like there’s a narrow range of introductions that make everything work out, and if I miss it, I’ll have to go back to lecturing.  Hey, stop that laughing!  I know, I rail against my students’ unwillingness to do things without instructions.  In my defense, there is a small difference: they can reassess their lab skill over and over within a few days.  Whatever I do with my class, it affects their trust in me in ways that cannot be fully undone, and I don’t get to reassess that particular moment until next year.

Fix: document my learning activities thoroughly this year.  Next year I might modify them or toss them out, but at least they’ll be there for those days when I just need to repeat something.

How this relates to my students: I’m not sure what to do here besides what I’m already doing: each assessment attempt is low-stakes, and there’s a wide range of possible good answers for almost everything.  The feeling of having fluked into something can really mess with your head (even if, in my case, I think luck was a small element, dwarfed by hard work and obsessive preparation).

3 — No net.

It feels like there's no net because the "peer-reviewed research community" setup I'm using depends heavily on the goodwill of the students.  If any significant chunk decided to zone out, the system would not work.  If there isn't a critical mass of students writing up papers and giving feedback, then there simply is no course.  If I had a group where absolutely no one was willing to make a good-faith effort, then I suppose I could lecture and assign problem sets (yes, I kept them from my first year).  The reality is that that's unlikely to happen.  My students tend to be highly motivated, with a wide age range (the oldest easily double the age of the youngest).  They appreciate being trusted to think.

Fix: no fix needed.  Especially in a group of 17 (as I have this year).

I wonder what kinds of things in my students' lives feel like there's no net.

