This just in from dy/dan: Jo Boaler (Stanford prof, author of What’s Math Got to Do With It and inspiration for Dan Meyer’s “pseudocontext” series) is offering a free online course for “teachers and other helpers of math learners.” The course is called “How To Learn Math.”
“The course is a short intervention designed to change students’ relationships with math. I have taught this intervention successfully in the past (in classrooms); it caused students to re-engage successfully with math, taking a new approach to the subject and their learning. In the 2013-2014 school year the course will be offered to learners of math but in July of 2013 I will release a version of the course designed for teachers and other helpers of math learners, such as parents…” [emphasis is original]
I’ve been disheartened this year to realize how limited my toolset is for convincing students to broaden their thinking about the meaning of math. Every year, I tangle with students’ ingrained humiliation in the face of their mistakes and sense of worthlessness with respect to mathematical reasoning. I model, give carefully crafted feedback, and try to create low-stakes ways for them to practice analyzing mistakes, understanding why math in physics gives us only “evidence in support of a model” — not “the right answer”, and noticing the necessity for switching representations. This is not working nearly as well as it needs to for students to make the progress they need and that I believe they are capable of.
I hope this course will give me some new ideas to think about and try, so I’ve signed up. I’m especially interested in the ways Boaler is linking these ideas to Carol Dweck’s ideas about “mindset,” and proposing concrete ideas for helping students develop a growth mindset.
Anyone else interested?
What scientifically-honest questions can I ask my students to tangle with, based on their current ideas and expectations?
This question is at the heart of a lot of my classroom’s success and also anxiety. When I ask good questions, students are more likely to evaluate evidence thoroughly, seek contradictions, resolve those contradictions, hold each other and themselves accountable to what we know so far, and generate significant new questions for our next round of research.
A poorly-chosen question reveals itself when students don’t have enough information or skill to make sense of the information they find, or can’t think of ways to find information at all, or don’t care about the answer, or can’t see how it’s related to their goal of becoming an electronics tech.
I’m intrigued by what’s going on in this video, a clip of a TED talk featuring Bobby McFerrin (of “Don’t Worry Be Happy” fame, but also a brilliant performer of many genres). For maximum benefit, try singing along.
The question McFerrin asks himself seems to be, “what musically honest question can I ask this audience?”
The question he poses to the audience is, “What’s the next note?”
This question works because he was able to
- anticipate the ideas participants are likely to have about the topic (the pentatonic scale is surprisingly cross-cultural)
- anticipate which ideas are difficult to learn, and which ones are not (he avoids certain scale degrees and uses a tune that’s going to be structurally familiar to an American audience)
- choose a question that’s simple enough for people to make sense of using the tools they already have
- make the task interesting (and the big picture “audible”) by doing the more complicated work himself.
I’m getting much better at anticipating common initial ideas and eliciting my students’ thinking. I’m still not great at choosing the question, or choosing the right moments to suggest the question.
I’m not sure what I make of this. In the video, the participants are not exactly learning something new. They are realizing something they didn’t realize that they already knew. This doesn’t give me much insight into tackling the topics that are difficult to learn. But I keep thinking about it.
As the year winds down, I’m starting to pull out some specific ideas that I want to work on over the summer/next year. The one that presses on me the most is “readiness.” In other words,
- What is absolutely non-negotiable that my students should be able to do or understand when they graduate?
- How do I make sure they get the greatest opportunity to learn those things?
- How do I make sure no one graduates without those things? And most frustratingly,
- How do I reconcile the student-directedness of inquiry learning with the requirements of my diploma?
Some people might disagree that some of these points are worth worrying about. If you don’t teach in a trade school, these questions may be irrelevant or downright harmful. K-12 education should not be a trade school. Universities do not necessarily need to be trade schools (although arguably, the professional schools like medicine and law really are, and ought to be). However, I DO teach in a trade school, so these are the questions that matter to me.
Training that intends to help you get a job is only one kind of learning, but it is a valid and important kind of learning for those who choose it. It requires as much rigour and critical thinking as anything else, which becomes clear when we consider the faith we place in the electronics technicians who service elevators and aircraft. If my students are inadequately prepared in their basic skills, they (or someone else, or many other people) may be injured or die. Therefore, I will have no truck with the intellectual gentrification that thinks “vocational” is a dirty word. Whether students are prepared for their jobs is a question of the highest importance to me.
In that light, my questions about job-readiness have reached the point of obsession. Being a technician is to inquire. It is to search, to question, to notice inconsistencies, to distinguish between conditions that can and cannot possibly be the cause of particular faults. However, teaching my students to inquire means they must inquire. I can’t force it to happen at a particular speed (although I can cut it short, or offer fewer opportunities, etc.). At the same time, I have given my word that if they give me two years of their time, they will have skills X, Y, and Z that are required to be ready for their jobs. I haven’t found the balance yet.
I’ll probably write more about this as I try to figure it out. In the meantime, Grant Wiggins is writing about a recent study that found a dramatic difference between high-school teachers’ assessment of students’ college readiness, and college profs’ assessment of the same thing. Wiggins directs an interesting challenge to teachers: accurately assess whether students are ready for what’s next, by calibrating our judgement against the judgement of “whatever’s next.” In other words, high school teachers should be able to predict what fraction of their students are adequately prepared for college, and that number should agree reasonably well with the number given by college profs who are asked the same question. In my case, I should be able to predict how well prepared my students are for their jobs, and my assessment should match reasonably the judgement of their first employer.
In many ways I’m lucky: we have a Program Advisory Group made up of employer representatives who meet to let us know what they need. My colleagues and I have all worked between 15 and 25 years in our field. I send all my students on 5-week unpaid work terms. During and after the work terms, I meet with the student and the employer, and get a chance to calibrate my judgement. There’s no question that this is a coarse metric; the reviews are influenced by how well the student is suited to the culture of a particular employer, and their level of readiness in the telecom field might be much higher than if they worked on motor controls. Sometimes employers’ expectations are unreasonably high (like expecting electronics techs to also be mechanics). There are some things employers may or may not expect that I am adamant about (for example, that students have the confidence and skill to respond to sexist or racist comments). But overall, it’s a really useful experience.
Still, I continue to wonder about the accuracy of my judgement. I also wonder about how to open this conversation with my colleagues. It seems like something it would be useful to work on together. Or would it? The comments on Wiggins’ post are almost as interesting as the post itself.
It seems relevant that most commenters are responding to the problem of students’ preparedness for college, while Wiggins is writing about a separate problem: teachers’ unfounded level of confidence about students’ preparedness for college.
The question isn’t, “why aren’t students prepared for college?” It’s also not, “are college profs’ expectations reasonable?” It’s, “why are we so mistaken about what college instructors expect?”
My students, too, often miss this kind of subtle distinction. It seems that our students aren’t the only ones who suffer from difficulty with close reading (especially when stressed and overwhelmed).
Wiggins calls on teachers to be more accurate in our assessment, and to calibrate our assessment of college-readiness against actual college requirements. I think these are fair expectations. Unfortunately, assessment of students’ college-readiness (or job-readiness) is at least partly an assessment of ourselves and our teaching.
A similar problem is reported about college instructors. The study was conducted by the Foundation for Critical Thinking with both education faculty and subject-matter faculty who instruct teacher candidates. They write that many profs are certain that their students are leaving with critical thinking skills, but that most of those same profs could not clearly explain what they meant by critical thinking, or give concrete examples of how they taught it.
Self-assessment is surprisingly intractable; it can be uncomfortable and can elicit self-doubt and anxiety. My students, when I expect them to assess their work against specific criteria, exhibit all the same anger, defensiveness, and desire to change the subject as seen in the comments. Most of them literally can’t do it at first. It takes several drafts and lots of trust that they will not be “punished” for admitting to imperfection. Carol Dweck’s work on “growth mindset” comes to mind here… is our collective fear of admitting that we have room to grow a consequence of “fixed mindset”? If so, what is contributing to it? In that light, the punitive aspects of NCLB (in the US) or similar systemic teacher blaming, isolation, and lack of integrated professional development may in fact be contributing to the mis-assessment reported in the study, simply by creating lots of fear and few “sandboxes” of opportunity for development and low-risk failure. As for the question of whether education schools are providing enough access to those experiences, it’s worth taking a look at David Labaree’s “The Trouble with Ed School.”
One way to increase our resilience during self-assessment is to do it with the support of a trusted community — something many teachers don’t have. For those of us who don’t, let’s brainstorm about how we can get it, or what else might help. Inaccurate self-assessment is understandable but not something we can afford to give up trying to improve.
I’m interested in commenter I Hodge’s point about the survey questions. The reading comprehension question allowed teachers to respond that “about half,” “more than half,” or “all, or nearly all” of their students had an adequate level of reading comprehension. In contrast, the college-readiness question seems to have required a teacher to select whether their students were “well,” “very well,” “poorly,” or “very poorly” prepared. This question has no reasonable answer, even if teachers are only considering the fraction of students who actually do make it to college. I wonder why they posed those two questions so differently?
Last but not least, I was surprised that some people blamed college admissions departments for the admission of underprepared students. Maybe it’s different in the US, but my experience here in Canada is that admission is based on having graduated high school, or having gotten a particular score in certain high school courses. Whether under-prepared students got those scores because teachers under-estimated the level of preparation needed for college, or because of rigid standards or standardized tests or other systemic problems, I don’t see how colleges can fix that, other than by administering an entrance test. Maybe that’s more common than I know, but neither the school at which I teach nor the well-reputed university that I (briefly) attended had one. Maybe using a high school diploma as the entrance exam for college/university puts conflicting requirements on the K-12 system? I really don’t know the answer to this.
Wiggins recommends regularly bringing together high-school and college faculty to discuss these issues. I know I’d be all for it. There is surely some skill-sharing that could go back and forth, as well as discussions of what would help students succeed in college. Are we ready for this?
Some interesting comments on my recent post about causal thinking have got my wheels turning. It puts me in mind of the conversation at Overthinking My Teaching about whether “repeated addition” is the best way to approach teaching exponents. In that post, Christopher Danielson points out the helpfulness of shifting from “Why is Approach X wrong” or even “Which approach is correct” toward “What is gained and lost when using Approach X?“
In that light, I’m thinking back on my post and the comments. For example:
I talk about the difference between “who/what you are” (the definition of you) and “what caused you” (a meeting of sperm and egg). In the systems of belief that my students tend to have, people are not thought to “just happen” or “cause themselves.” It can help open the conversation. However, even when I do this, they are surprisingly unlikely to transfer that concept to atomic particles.
“Purpose is a REAL facet in all of nature because everything has a natural function e.g., the role of mitochondria in eukaryotic cells is ATP production, or that the nature of negatively charged electrons is to attract and repel + and – charged particles respectively, etc.”
But I think it’s the same mistake to presume that they really *mean* that the electron has desires and wants, which is a slippery slope to thinking they *can’t* access or feel the need to explore the deeper causal relationships.
I’m noticing that there are ideas I expect students to extend from humans to particles (forces can act on us), and ideas I expect them to find not-extensible (desire). These examples are the easy ones; “purpose” is harder to place clearly in one category or the other, and “cause” probably belongs in both categories but means something different in each. I need to think more clearly about which ones are which and why, and how to help students develop their own skills for distinguishing.
I’m trying to stop assuming that when students talk about electrons’ “desires,” they are referring to a deeper story; I also need to avoid assuming that they are not, or that they don’t want to (or aren’t drawn to) explore one.
I’m on a personal “fast” of discussing electrons’ purposes and desires, at least while I’m in earshot of my students. It’s hard to break those habits, exactly because they are so helpful. However, it has the useful result that all the ideas about purpose and desires that are getting thrown around in class come from the students. The students seem more willing to question them than when the ideas come from me. Unfortunately they are having a really hard time understanding each other’s metaphors (even though the metaphors are not particularly far-fetched, by my reckoning), and I’m having a really hard time facilitating the conversation to help them see each other’s point of view. But that still seems better than before, when the metaphors were not getting questioned at all, and maybe not even noticed as metaphors.
Michael Pershan kicked my butt recently with a post about why teachers tend to plateau in skill after their third year, connecting it to Cal Newport’s ideas such as “hard practice” (and, I would argue, “deep work”).
Michael distinguishes between practice and hard practice, and wonders whether blogging belongs on his priority list:
“Hard practice makes you better quickly. Practice lets you, essentially, plateau. …Put it like this: do you feel like you’re a 1st year teacher when you blog? Does your brain hurt? Do you feel as if you’re lost, unsure how to proceed, confused? If not, you’re not engaged in hard practice.”
Ooof. On one hand, it made me face my desire to avoid hard practice; I feel like I’ve spent the last 8 months trying to decrease how much I feel like that. I’ve tried to create classroom procedures that are more reusable and systematic, especially for labs, whiteboarding sessions, class discussions, and model presentations.
It’s a good idea to periodically take a hard look at that avoidance, and decide whether I’m happy with where I stand. In this case, I am. I don’t think the goal is to “feel like a first year teacher” 100% of the time; it’s not sustainable and not generative. But it reminds me that I want to know which activities make me feel like that, and consciously choose some to seek out.
Michael makes this promise to himself:
It’s time to redouble my efforts. I’m half way through my third year, and this would be a great time for me to ease into a comfortable routine of expanding my repertoire without improving my skills.
I’m going to commit to finding things that are intellectually taxing that are central to my teaching.
It made me think about what my promises are to myself.
Be a Beginner
Do something every summer that I don’t know anything about and document the process. Pay special attention to how I treat others when I am insecure, what I say to myself about my skills and abilities, and what exactly I do to fight back against the fixed-mindset that threatens to overwhelm me. Use this to develop some insight into what exactly I am asking from my students, and to expand the techniques I can share with them for dealing with it.
Last summer I floored my downstairs. The summer before that I learned to swim — you know, with an actual recognizable stroke. In both cases, I am proud of what I accomplished. In the process, I was amazed to notice how much concentration it took not to be a jerk to myself and others.
Learn More About Causal Thinking
I find myself being really sad about the ways my students think about causality. On one hand, I think my recent dissections of the topic are a prime example of “misconceptions listening” — looking for the deficit. I’m pretty sure my students have knowledge and intuition about cause that I can’t see, because I’m so focused on noticing what’s going wrong. In other words, my way of noticing students’ misconceptions is itself a misconception. I’d rather be listening to their ideas fully, doing a better job of figuring out what’s generative in their thinking.
What to do about this? If I believe that my students need to engage with their misconceptions and work through them, then that’s probably what I need too. There’s no point in my students squashing their misconceptions in favour of “right answers”; similarly, there’s no point in me squashing my sadness and replacing it with some half-hearted “correct pedagogy.”
Maybe I’m supposed to be whole-heartedly happy to “meet my students where they are,” but if I said I was, I’d be lying. (That phrase has been used so often to dismiss my anger at the educational malpractice my students have endured that I can’t even hear it without bristling). I need to midwife myself through this narrow way of thinking by engaging with it. Like my students, I expect to hold myself accountable to my observations, to good-quality reasoning, to the ontology of learning and thinking, and to whatever data and peer feedback I can get my hands on.
My students’ struggle with causality is the puzzle from which my desire for explanation emerged; it is the source of the perplexity that makes me unwilling to give up. I hope that pursuing it honestly will help me think better about what it’s like when I ask my students to do the same.
Interact with New Teachers
Talking with beginning teachers is better than almost anything else I’ve tried for forcing me to get honest about what I think and what I do. There’s a new teacher in our program, and talking things through with him has been a big help in crystallizing my thoughts (mutually useful, I think). I will continue doing this and documenting it. I also put on a seminar on peer assessment for first-year teachers last summer; it was one of the more challenging lesson plans I’ve ever written. If I have another chance to do this, I will.
Work for Systemic Change
I’m not interested in strictly personal solutions to systemic problems. I won’t have fun, or meet my potential as a teacher, if I limit myself to improving me. I want to help my institution and my community improve, and that means creating conditions and communities that foster change in collective ways. For two years, I tried to do a bit of this via my campus PD committee; for various reasons, that avenue turned out not to lead in the directions I’m interested in going. I’ve had more success pressing for awareness and implementation of the Workplace Violence Prevention regulations that are part of my local jurisdiction’s Occupational Health and Safety Act.
I’m not sure what the next project will be, but I attended an interesting seminar a few months ago about our organization’s plans for change. I was intrigued by the conversations happening about improving our internal communication. I’ve also had some interesting conversations recently with others who want to push past the “corporate diversity” model toward a less ahistorical model of social justice or cultural competence. I’ll continue to explore those to find out which ones have some potential for constructive change.
Design for Breaks
I can’t do this all the time or I won’t stay in the classroom. I know that now. As of the beginning of January, I’ve reclaimed my Saturdays. No work on Saturdays. It makes the rest of my week slightly more stressful, but it’s worth it. For the first few weeks, I spent the entire day alternately reading and napping. Knowing that I have that to look forward to reminds me that the stakes aren’t as high as they sometimes seem.
I’m also planning to go on deferred leave for four months starting next January. After that, I’ve made it a priority to find a way to work half-time. The kind of “intellectually taxing” enrichment that I need, in order for teaching to be satisfying, takes more time than is reasonable on top of a full-time job. I’m not willing to permanently sacrifice my ability to do community volunteer work, spend time with my loved ones, and get regular exercise. That’s more of a medium-term goal, but I’m working a few leads already.
Anyone have any suggestions about what I should do with 4 months of unscheduled time starting January 2014?
The past semester has been a tough slog with my first-year class. I’m slowly figuring out what resources and approaches were missing. Last year, I launched myself headfirst (and underprepared) into inquiry-based learning because most of the class members were overflowing with significant, relevant questions.
This year, the students are barely asking questions at all, and when they do, the questions are not very relevant — they don’t help us move forward toward predicting circuit behaviour, troubleshooting, or any of the other expressed goals we’ve discussed as a class. They’re mostly about electrical safety which, don’t get me wrong, is important, but talking about how people do and don’t get electrocuted has limited value in helping us understand amplifiers. I felt like I juiced those questions as much as I could, but it only led to more questions about house wiring and car chassis.
If I’m serious about inquiry-based learning, I have to develop a set of tools that allow me to adapt to the group. Right now I feel like my approach only works if the group is already fairly skilled at distinguishing between what we have evidence for and what we just feel like we’ve heard before, and at asking significant questions that move toward a specific goal. In other words, I wasn’t teaching them to reason scientifically, I was filtering out those who already knew from those who didn’t. Here are some of the things I need to be more prepared for.
I have never had so much trouble getting students to use their meters correctly. Here we are in second semester, and I still have students confidently using incorrect settings. I’d be happier if they were unsure, or had questions, but no, many are not noticing that they have problems with this. And I don’t mean being confused about whether you should measure 1.5 V on the 20 V or the 2000 mV setting… I mean measuring 0.1 Ohms on the 200 kOhm setting.
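To see why that last one is so far off, here’s a back-of-envelope sketch of display resolution. I’m assuming a typical 3½-digit handheld meter (1999 counts full scale) — that’s my assumption for illustration, not something specific to our lab meters:

```python
# Rough sketch of display resolution on an assumed 3.5-digit multimeter
# (1999 counts full scale). Real meters vary; this is just the arithmetic.

def displayed_value(true_value, full_scale, counts=1999):
    """Quantize a reading to the smallest displayable step on a given range."""
    step = full_scale / counts          # smallest increment the display can show
    return round(true_value / step) * step

# Trying to measure a 0.1-ohm resistance:
print(displayed_value(0.1, 200_000))    # 200 kOhm range: reads 0.0 -- no resolution
print(displayed_value(0.1, 200))        # 200 Ohm range: reads about 0.1
```

On the 200 kOhm range the smallest step is about 100 Ohms, so a 0.1 Ohm part is indistinguishable from a dead short or from 50 Ohms of contact resistance — the reading carries no information at all.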
I switched this year to teaching them about current first, rather than resistance (like I did last year). I’m loath to reconsider because current is the only one that lends itself to causal thinking and sense-making early in the year (try explaining resistance to someone who doesn’t know what current is… and “electric potential,” to someone who doesn’t know anything formal about energy or force or fields, is just hell). Could this be part of why they’re struggling so much to use their meters correctly? Is there something about the “current first” approach that bogs them down with cognitive load at a stage when they just need some repetitive practice? I’m curious to check out the CASTLE curriculum, maybe over the summer, to try to figure some of this out.
I created a circuit-recording template last fall that I thought was such a great idea… it had a checklist at the top to help the students notice if they’d forgotten anything. Guess what? They started measuring without thinking about the meaning of the measurements — measuring as if it was just something to be checked off a list! No observations. No questions. No surprise at unusual or unintuitive numbers. Damn. The checklist is gone and never coming back — next year I’ll make sure we only measure things that the students have found a reason to measure.
Last term, I waited far too long to give the quiz on measurement technique. I knew they weren’t ready, and I kept thinking that if we spent more time practicing measuring (while exploring the questions we had painstakingly eked out), that it would get better. Finally, we were so far behind that I gave the quiz anyway. The entire class failed it (not a catastrophe, given the reassessment policy), and the most common comment when we reviewed the quiz was “why didn’t you tell us this before??” Uh. Right. Quiz early, quiz often.
Guess what the teacher wants
The degree of “teacher-pleasing” being attempted is disheartening. Students are almost always uncomfortable making mistakes, using the word “maybe” in situations where it is genuinely the most accurate way to express the strength of our data, or re-evaluating what they think of as “facts.” But this group is unusual. There’s a high rate of students anxiously making up preposterous answers rather than saying “I don’t know.”
I tend toward a pretty aggressive questioning style — the kind of “what causes that, why does that happen” bluntness I would use with colleagues to bat ideas around. I’ve changed my verbal prompt to “what might cause that?” and “what could possibly be happening” in the hopes that it would help students discern whether they are certain or not, and also help them transition toward communicating the tentativeness of ideas for which we have little evidence. Obviously, I take care to draw out the reasoning and evidence in support of ideas, regardless of whether they’re canonical or not, and conversely make sure we discuss evidence against all of our ideas, including the “right” ones. I try to honour students’ questions by tracking them and letting them choose from among the class’s questions when deciding what to investigate next. But valuing their questions and thinking is clearly not enough.
I gave a test question last semester that asked students to evaluate some “student” reasoning. It used the word “maybe” in a completely appropriate way, and that’s what I heard outraged responses about from half the class. They thought the reasoning was poor (and also reported that it was badly written!) because of it. Again, we practiced explicitly, but sometimes I feel like I’m undermining their faith in “right answer” reasoning without helping them replace it with something better…
On the odd occasion when I ask someone a question and they say “I don’t know,” I make a point of not putting them on the spot, but of gathering info/evidence/ideas from other students for the first student to choose from, or breaking the class into small groups and asking them to discuss. I try to make sure that the person who said “I don’t know” has as few negative consequences as possible. Yet the person who says it inevitably looks crestfallen.
Talking in class
The frequency of students speaking up in class is at an all-time low. I wonder if this has been influenced by my random cold-calling — they figure I’ll call on them eventually so there’s no sense putting their hand up to make a comment or ask a question? The thing is, they don’t ask those questions when I call on them — just answer the question I ask.
At the same time, the frequency of whispered side conversations is at an all-time high, whether the speaker with the floor is me or another student. I think I’m unusually sensitive to this — I find it completely distracting, and can barely maintain my train of thought if students are whispering to each other. Maybe that’s partly my hearing, which is fairly acute — I can actually hear their whole conversation, even if they’re whispering at the back of the room (keep in mind that there are only 17 people and the room is pretty small). So my standard response to this is one warning during class (followed by a quiet, private conversation after class) — if it happens again, they’re leaving the room. Is this part of why they’re afraid to talk out loud — because I crack down on the talking under their breath? I’m open to other ways of responding but out of ideas at the moment.
Even the strongest students are still having trouble explaining causes of physical effects. They know I won’t accept a formula as a cause, but they can’t explain why, and when I ask someone to explain a cause, they will consistently give a formula anyway (figuring that an answer is always better than no answer, I guess). Next approaches: asking them to write down the cause, then discuss it in groups.
As Jason articulates clearly, I think that my students need more help motivating and strengthening their scientific discourse. He summarizes a promising-sounding approach called Guided Reciprocal Questioning as follows:
- Learn about something.
- Provide generic question frames.
- Students generate questions individually.
- Students discuss the questions in their groups.
- Share out.
I do something similar to #1-3, but I’m ready to try #4-5, with appropriate “discussion frames”, to see if I can help the students hold each other accountable to their knowledge. Right now, they barely propose questions or answers, but when they do, the class seems to accept it, even if it contradicts something else we just talked about.
Also, Janet Abercrombie wrote recently in the comments about a Question Formulation Technique that I’d like to look into some more.
Conclusion: It works anyway
The whole experience was kind of heart-breaking. But the conversations with students kept convincing me that I had to do it anyway. I don’t know how many students took the time to say to me, “whoa, it seems like you actually want us to understand this stuff.” The look of astonishment really said it all. The bottom line is, this group is a much better test of the robustness of my methods than last year’s group could be.
Series circuits are one of the foundational concepts in electrical work, and one of the first things students build/think about/get assessed on in their first months at school. My definition of two series components:
- Two components are in series if all the current in one flows into the second, and all the current in the second comes from the first
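The definition above is symmetric: the current leaving the first component must equal the current entering the second, with no branch siphoning any off. A minimal sketch of that check (the function name and example currents are mine, purely illustrative):

```python
# Hypothetical sketch: the series definition as a numeric check.
# Two components are in series only if all the current through one
# flows into the other -- i.e., their currents are equal.

def in_series(i_out_of_first, i_into_second, tol=1e-9):
    """True if all current leaving the first component enters the
    second (equal currents, within a small tolerance)."""
    return abs(i_out_of_first - i_into_second) <= tol

# A resistor passing 2.0 A directly into a lamp drawing 2.0 A: series.
print(in_series(2.0, 2.0))   # True
# A parallel branch diverts 0.5 A before the lamp: not series.
print(in_series(2.0, 1.5))   # False
```

Note that none of the misconceptions below (square shape, straight line, shared power supply) show up in this check at all: the definition is about current, not geometry.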
Things I have heard about series components:
- Components are in series if they’re in a square shape
- Components are in series if all the current in one flows into the second
- Components are in series if they’re both connected to the power supply
- Components are in series if they’re aligned in a straight line
In the first year of the program, we spend a lot of time refining our ideas about which circuits have which behaviours. We refine and revise and throw out ideas. By the end of December we should have something fairly strong.
Last week, I had a second-year student tell me he knew that two components were in series because of reason #3 above. I’m struggling to make sense of this, and the accountability of teaching in a trade school hangs over my head like the razor-edged pendulum in the pit. In May, some of these students will be working on large-scale industrial robots. These things weigh tons, carry blades and torches, and can maim or kill people in an instant. Electronics is not an apprenticeable trade. Grads will not carry tools for a journeyman for three years — they get put right to work. Also, electronics is not a construction trade — it is a repair trade. That means that work is almost always done under pressure of short timelines and lost money — the electronics tech doesn’t get called out until something is broken.
I have two years to make sure they are ready to at least begin their industry-specific training. It’s not good enough for them to sometimes make sense of things — they need to nail these foundational concepts every time in order to use the training the employer provides and make good judgement calls on the job. Please, no comments about how education is about broadening the mind and this student is learning lots of other valuable skills. While that’s true, it’s not currently the point. When that electronics tech does some repairs on the heart-rate monitor keeping tabs on your unborn child, you are not going to be any more interested in the tech’s broad mind than I am.
What does it mean if a student can spend 4 months in DC circuits, not fully integrate the concept of series components, pass the course, and 8 months later still have an unstable concept?
Here are all the ideas I can think of at the moment. Don’t panic — I don’t think these are all equally likely.
- Their experience in DC circuits is not doing enough to help them make sense of this idea
- The assessments in DC circuits are not rigourous enough to catch students who are still unsure about this
- This student is incapable of consistently making sense of this idea, and should not have been accepted into the program in the first place
- It’s normal for students to form, unform, and reform their ideas about new concepts. It’s inevitable, and sometimes students will revert to previous ways of thinking even after the fantastic course and the rigourous assessments.
If it’s #1, I’m not sure what to do. I’ve already given over my courses to sense-making, critical thinking, and inquiry. Do they need more class hours, more time outside class hours, or just different kinds of practice? Maybe the practice problems are too consistent, failing to address students’ misconceptions.
If it’s #2, I’m not sure what to do. I feel pretty confident that I’m assessing their reasoning rather than their regurgitating. More assessments might help — not sure where to get the time. A final exam might help. I can’t see my way clear to passing or failing someone on the strength of a final exam, but I’d at least know a bit more about which concepts are still shaky. I’ve sometimes given a review paper in January on the concepts learned in the previous semester, and worked through multiple drafts — I could start doing that again.
If it’s #3, I’m definitely not sure what to do.
If it’s #4, how do I reconcile this with my sense of personal responsibility to not send them out to get injured or injure someone else? I realize I’ve framed this in a fairly dramatic way, and not every student who’s unsure of what a series circuit is will end up harming someone. It’s much more likely that they’ll end up on the job and start to consolidate their knowledge and clear up their misconceptions. However, it’s also likely that they’ll end up on a job where they suddenly realize that they don’t understand the basic things they’re being asked to do. This bodes poorly for the grad’s confidence and enjoyment of their career, the employer’s willingness to hire future grads, and of course the quality of our biomedical equipment, manufacturing equipment, navigational equipment, power generation instrumentation, … . It also bodes poorly for my ability to believe that I am doing a reasonable job.
Last February, I had a conversation with my first-year students that changed me.
On quizzes, I had been asking questions about what physically caused this or that. The responses had a weird random quality that I couldn’t figure out. On a hunch, I drew a four-column table on the board, like this:
I gave the students 15 minutes to write whatever they could think of.
I collected the answers for “cause” and wrote them all down. Nine out of ten students said that a difference of electrical energy levels causes voltage. This is roughly like saying that car crashes are caused by automobiles colliding.
Me: Hm. Folks, that’s what I would consider a “definition.” Voltage is just a fancy word that means “difference of electrical energy levels” — it’s like saying the same thing twice. Since they’re the same idea, one can’t cause the other — it’s like saying that voltage causes itself.
Student: So what causes voltage — is it current times resistance?
Me: No, formulas don’t cause things to happen. They might tell you some information about cause, and they might not, depending on the formula, but think about it this way. Before Mr. Ohm developed that formula, did voltage not exist? Clearly, nature doesn’t wait around for someone to invent the formula. Things in nature somehow happen whether we calculate them or not. One thing that can cause voltage is the chemical reaction inside a battery.
Other student: Oh! So, that means voltage causes current!
Me: Yes, that’s an example of a physical cause. [Trying not to hyperventilate. Remember, it's FEBRUARY. We theoretically learned this in September.]
Me: So, who thinks they were able to write a definition?
Students: [explode in a storm of expostulation. Excerpts include] “Are you kidding?” “That’s impossible.” “I’d have to write a book!” “That would take forever!”
Me: [mouth agape] What do you mean? Definitions are short little things, like in dictionaries. [Grim realization dawns.] You use dictionaries, right?
Students: [some shake heads, some just give blank looks]
Me: Oh god. Ok. Um. Why do you say it would take forever?
Student: How could I write everything about voltage? I’d have to write for years.
Me: Oh. Ok. A definition isn’t a complete story of everything humanly known about a topic. A definition is… Oh jeez. Now I have to define definition. [racking brain, settling on "necessary and sufficient condition," now needing to find a way to say that without using those words.] Ok, let’s work with this for now: A definition is when you can say, “Voltage means ________; Whenever you have ___________, that means you have voltage.”
Students: [furrowed brows, looking amazed]
Me: So, let’s test that idea from earlier. Does voltage mean a difference in electrical energy levels? [Students nod] Ok, whenever you have a difference in electrical energy levels, does that mean there is voltage? [Students nod] Ok, then that’s a definition.
Third student: So, you flop it back on itself and see if it’s still true?
Me: Yep. ["Flopping it back on itself" is still what I call this process in class.] By the way, the giant pile of things you know about voltage, that could maybe go in the “characteristics” column. That column could go on for a very long time. But cause and definition should be really short, probably a sentence.
Students: [Silent, looking stunned]
Me: I think that’s enough for today. I need to go get drunk.
Ok, I didn’t say that last part.
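The “flop it back on itself” test is really a check that a candidate definition is a biconditional: it has to hold in both directions. A toy sketch of the idea (the scenario names and structure here are mine, not from the class):

```python
# Hedged sketch: "flopping it back on itself" as a biconditional test.
# A candidate defines a term only if, in every scenario, the term and
# the candidate are present or absent *together* -- both directions hold.

def is_definition(scenarios, term, candidate):
    """True if term and candidate always co-occur (a biconditional)."""
    return all(s[term] == s[candidate] for s in scenarios)

# Illustrative scenarios (invented for the example):
scenarios = [
    {"voltage": True,  "energy_difference": True,  "measurable": True},
    {"voltage": False, "energy_difference": False, "measurable": True},
]

# "Voltage means a difference in electrical energy levels" survives
# being flopped back on itself:
print(is_definition(scenarios, "voltage", "energy_difference"))  # True

# "Voltage is something measurable" does not -- lots of measurable
# things aren't voltage, so it's a characteristic, not a definition:
print(is_definition(scenarios, "voltage", "measurable"))         # False
```

The “characteristics” column, by contrast, only needs the forward direction, which is why it can go on for pages while the definition stays one sentence long.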
When I realized that my students had lumped a bunch of not-very-compatible things together under “cause,” other things started to make sense. I’ve often had strange conversations with students about troubleshooting — lots of frustration and misunderstanding on both sides. The fundamental question of troubleshooting is “what could cause that,” so if their concept of cause is fuzzy, the process must seem magical.
I also realized that my students did not consistently distinguish between “what made you think that” and “what made that happen.” Both are questions about cause — one about the cause of our thinking or conclusions, and one about the physical cause of phenomena.
Finally, it made me think about the times when I hear people talk as though things have emotions and free will — especially high-tech products like computers are accused of “having a bad day” or “refusing to work.” Obviously people say things like that as a joke, but it’s got me thinking, how often do my students act as though they actually think that inanimate objects make choices? I need a name for this — it’s not magical thinking because my students are not acting as though “holding your tongue the right way” causes voltage. They are, instead, acting as though voltage causes itself. It seems like an ill-considered or unconscious kind of animism. I don’t want to insult thoughtful and intentional animistic traditions by lumping them in together, but I don’t know what else to call it.
Needless to say, this year I explicitly taught the class what I meant by “physical cause” at the beginning of the year. I added a metacognition unit to the DC Circuits course called “Technical Thinking” (a close relative of the “technical reading” I proposed over a year ago, which I gradually realized I wanted students to do whether they were reading, listening, watching, or brushing their teeth). Coming soon.
This morning, my students are reading about negative feedback and assessing the information provided using our standard rubric, which asks them to summarize and write their questions. They’re finding it difficult to understand, almost too confusing to summarize. I remind them that that’s ok — to summarize what they can, if they can. I also tell them to write questions as they read, not to wait until the end of the passage to write them down.
Especially, I remind them that a common cause of “getting stuck” is waiting until they understand the paragraph before writing down a question. The problem, of course, is that you might not be able to understand the passage until after the question is answered. Waiting for understanding before asking questions is like waiting to be fit before going to the gym.
I have this conversation with one student:
Student: “What I’m afraid of is, if I get partway through the paragraph and write a question, then I get later in the paragraph and write down another question, I’ll get to the end and realize, Oh, that’s what it meant, and I won’t need to ask that question any more.”
Me, joking: “So what happens then? What horrible consequence ensues?”
Student: “I have to kill an eraser!”
Me: “No need to erase it. Just write a note that says, ‘Oh, now I get that… [whatever you just understood].’ Have you ever noticed how often I do that on your quizzes and papers? I write questions as I’m reading, then I cross them out when I get to the end and write a note that says, ‘Never mind, I see that you’ve answered the question down here.’”
Student: [noncommittal shrug, smiling, seems willing to try this]
I think that’s an ok way to get the point across. I sit back down. Then I need to be a smart ass. I go back to chat with the same student. “You know, from our conversation earlier, it sounded like you were saying, ‘I’m afraid that if I ask questions, I’ll get it.’”
My point, of course, is that asking questions, thinking through our questions, and clarifying to ourselves what question we mean to ask can be an important part of sense-making, and can even help us answer our own questions. But that’s not how it comes across to the student. Now he’s been backed into a corner, shown the absurdity of something he just said. He scrambles to defend his statement. “No, what I meant was that if I ask questions while I’m reading, I might get to the end and not understand my… [pause] I can’t put it into words.”
Notes to self
- Students sometimes think they should delay asking questions until after they have understood something. This causes deadlock and frustration. Strategize about this with students.
- Pointing out someone’s misconception, especially in the middle of class, does not usually result in a graceful acknowledgement of “oh, yeah, that doesn’t really make sense, does it?” It usually results in backpedaling and attempts to salvage the idea by re-interpreting, suggesting that I didn’t understand them, or saying “I understand it, I just can’t put it into words.”
- The phrase “I understand it but I just can’t put it into words” is highly correlated with “You just pointed out a misconception to me and now I must save face by avoiding your point at all costs.” Use this clue to improve.
- Dear Mylène, you think you’re too highly evolved to use “elicit-confront-resolve” to address student misconceptions, but you’re mistaken. It’s causing students to avoid their misconceptions instead of facing them. Find a way to do something else.
Today we were brainstorming ideas about electricity, practicing clarifying, and creating questions that start “What causes…”. Some students are anxious about this, seeming to fear that if they ask those questions, they will have to answer them. My goal is for us to draw the boundaries of what we do and don’t know — not to get lost in some metaphysical endless loop. Facing the giant pile of what we don’t know is hard sometimes.
I say “If it seems like we’ll never run out of questions, don’t worry. We don’t have to answer those questions — we’re just keeping track of what we have and haven’t answered. And anyway, if we ran out of questions, wouldn’t that be awful and boring?”
The answer from the back of the room is, “No, that would be great. And then I’d be smart.”
What’s my next move?