
Last week in class, I showed some student examples of authentic, non-canonical thinking.  I asked the class to identify what they saw as good in those examples.  Here’s what they said:

“It’s honest.

It talks about electrons and energy.

It talks about physical cause.

It’s about the real world.

They noticed patterns.

They used analogies and metaphors.

They broke the ideas into parts.

They asked questions, looking for clarification and precision.

They were trying to find the limits of when things were true.

They said they didn’t know.

They proposed a hypothesis.

They said what seemed weird.”

At a glance, it seems people wrote more than usual that day about what they think might be happening in their circuits.  I’ll post more when I read them… but I’m hopeful.

This morning, I’m presenting a workshop on Assessment and Evaluation Survival Skills.  The themes are:

  • How to help students learn
  • While giving fair and accurate grades
  • Without losing your mind.

Stay tuned for an update on the questions and techniques that emerge.

Participant handout (DOC, PDF)

Feedback form template (DOC, PDF)

A year ago last week, I posted my first blog entry.  At that point, I had barely slept in 2 months. I was in my second semester of teaching, was working overtime to keep up with 2 groups of students in 4 preps, and spent most nights reading the archives of some teacher-blogging superheroes.  My jaw was permanently dropped because they were writing about things that I didn’t think were possible in a classroom.

By now, every one of those unreachable-seeming icons of genius pedagogy has commented here at least once, and I’ve met dozens more.  My blogroll has over 100 entries.  I can’t believe the amount of one-on-one coaching I’ve gotten from people all over the continent.  This amount and quality of feedback, modelling, and trenchant questioning was not available to me anywhere else.

Thank you for your support, for your critiques, for your suggestions of papers and books and workshops and videos and apps and sundry other ways to fill those hours between 5PM and 7AM.  To those on whose blogs I have hashed out my angst in unreasonably long comments, extra thanks.  I am finally starting to teach in a way that I respect, and I couldn’t have dreamt it, let alone done it, without you.

If you are reading this and wondering if you should start a blog, I would say go for it.  It doesn’t cost anything to try; if it doesn’t work for you, you don’t have to follow through.  But the return on investment is so high that I bet you’ll be hooked.  If you’re still unsure, write to a fairly new blogger (yours truly, for example) and fire some questions at them.  Chances are you’ll find what you need.  I sure did.

I’d love to get rid of my lab books.  They are the standard, canned variety: the instructions are overly helpful and they ask “known-answer questions.”   But I just couldn’t overhaul them this semester — my free time is between 2AM and 7AM.

I did find 2 quick fixes, though, that made canned labs less bad.

1: Assigning the purpose, not the title.

No, seriously — it made a difference when I stopped telling students to “Do Lab 31” and instead wrote the purpose of the lab on the skills sheet.  The skill is now “predict the results of a low-pass frequency sweep.  Build a circuit to test your predictions.”  Which is basically what Lab 31 is about.  (I use a consistent wording, which probably helps too.  “Predict (something).  Build a circuit to test your predictions.”)
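For concreteness, here’s roughly what that kind of prediction could look like, sketched in Python with made-up component values (the R and C below are hypothetical, not the parts from Lab 31):

```python
import math

# Hypothetical first-order RC low-pass filter -- values invented for illustration.
R = 1_000.0   # resistance, ohms
C = 100e-9    # capacitance, farads
f_c = 1 / (2 * math.pi * R * C)   # predicted cutoff frequency (~1.6 kHz here)

# Predicted gain at a few points of the frequency sweep
for f in [100, 500, f_c, 5_000, 50_000]:
    gain = 1 / math.sqrt(1 + (f / f_c) ** 2)   # |Vout/Vin| for an ideal RC low-pass
    print(f"{f:10.0f} Hz  ->  {20 * math.log10(gain):6.1f} dB")
```

A student who writes down a little table like that before touching the signal generator has something concrete to compare against when the sweep doesn’t behave.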

Some students thumb through the lab book until they find one that suits their purpose.  Some students just make something up.  In both cases, they now know what they’re looking for.

It gives them some choice about the level of guidance they want.  It gives them a backup plan if they get frustrated and don’t know what to do next.  They can use the lab procedure as a recipe, or they can use it for inspiration.  Having that control seems to improve their ability to assess whether they have met the requirement (“test your predictions”).  It also improves their reading comprehension of the lab book (having a purpose makes things make sense… thanks Cris Tovani!).  If that’s all they needed, why didn’t they read the purpose that’s printed in the lab?  I dunno. (Ok, the purpose is often badly written and buries the lead).

Anyway, even those who use the lab procedure word-for-word are now choosing which words to follow.  What I mean is, they evaluate each step in the lab procedure to find out if it’s necessary to meet my requirements.  I say again — they evaluate each step in the lab procedure.

If they decide that they can demonstrate the skill I asked for without doing step 14, then they figure they’ve pulled one over on me.  I never thought I’d be so delighted to see them game the system.

2: Measuring how wrong the theory is, not how right.

“When things are acting funny, measure the amount of funny.” (Bob Pease, National Semiconductor)

Now that’s a lab purpose I can get behind: find the funny, and measure it.

What if the goal of the inquiry was not to find the right answer (which after all is already known) but to find out how wrong the right answer is?  In other words, let’s discover the extent to which the theory fails.  This is several kinds of useful: the result is no longer known, and it gives you a gut feeling for how much your experimental data should reasonably diverge from predictions.  It lets the students evaluate the model, instead of being evaluated by it.  It also raises the question, “what else is going on that we don’t know about”?

Yesterday we had a lab on band-pass filters.  About half of the class discovered the parasitic capacitance of inductors.  They were excited about this.  I can’t help thinking that this happened partly because the goal was to test their predictions — not to match them.  (Also, because I don’t require them to fill in the pre-printed table in the lab report, they increased the frequency until the signal generator topped out — just to see what would happen).
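For anyone curious about where the funny comes from: a real inductor has some winding capacitance in parallel with it, which gives it a self-resonant frequency.  The numbers below are invented for illustration (not my students’ actual parts), but the shape of the surprise is the same:

```python
import math

# Hypothetical inductor with an assumed parasitic (winding) capacitance.
L = 10e-3      # inductance, henries -- invented value
C_p = 20e-12   # parasitic capacitance, farads -- invented value

f_srf = 1 / (2 * math.pi * math.sqrt(L * C_p))   # self-resonant frequency
print(f"self-resonance predicted near {f_srf / 1e3:.0f} kHz")

for f in [1e3, 10e3, 100e3, f_srf / 2, 2 * f_srf]:
    w = 2 * math.pi * f
    z_ideal = w * L                                     # |Z| of an ideal inductor
    z_real = abs((1j * w * L) / (1 - w**2 * L * C_p))   # |Z| with C_p in parallel
    print(f"{f:12.0f} Hz   ideal {z_ideal:10.1f} ohm   with C_p {z_real:10.1f} ohm")
```

Well below self-resonance the two columns agree; near and above it, the “inductor” starts behaving like a capacitor, and the measured band-pass response stops matching the ideal prediction.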

Yes, it’s important that they understand measurement error, and assess their lab technique against some known results.  But my students often interpret “sources of error” as a shameful failure, which, like other shameful failures, should be hidden and/or lied about.  I hope I’m not mangling experimental philosophy by challenging students to develop more sophisticated predictions that take into account the effects of common sources of error.  You test the theory.  Don’t let it test you.  If your data doesn’t come out the way you predicted, that means the prediction is wrong.  Is your model too simple?  Are you measuring something other than what you thought?  In either case, answer those questions.  Stop trying to make reality match theory.  Reality is not wrong.  Reality is real.

“The most exciting phrase to hear in science, the one that heralds new discoveries, is not Eureka! (I found it!) but rather, ‘hmm… that’s funny…'” (Isaac Asimov, probably apocryphal)

No cool new discoveries will be made as long as “funny” means “wrong.”

Today in class I learned that when you have nothing to say and you are left out of the conversation, you feel bored.

The class started normally enough.  I introduced universal motors by not talking about them.

I asked the class how to reverse a DC shunt motor.  Then I asked them what doesn’t reverse the shunt motor.  (No matter which way you wire the DC supply, the motor will only turn one way.)  So, if you can hook up the supply either way and still get the same rotation, could you run it off of AC?  The usual suspects had questions or comments, wanting to test their theories.  For once I remembered to shut up.  I asked everyone to spend 7 minutes discussing with a partner their ideas about what kinds of motors could be run off of AC or DC.
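(For what it’s worth, the standard back-of-envelope argument behind that parenthetical, which is not how I framed it in class, goes like this: torque depends on the product of field flux and armature current, and in a shunt or series machine reversing the supply flips both at once.)

```latex
% Torque in a DC machine: proportional to field flux times armature current
T \propto \Phi \, I_a , \qquad \Phi \propto I_f
% Reversing the supply reverses both currents together:
I_f \to -I_f , \quad I_a \to -I_a
\quad\Rightarrow\quad T \propto (-\Phi)(-I_a) = \Phi \, I_a
```

Same sign of torque either way, which is the property that lets a machine of that type keep pushing in one direction on both halves of an AC cycle.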

They sat in silence for 4 minutes, leafing through the book.  (Answer’s not in the book).

After 4 minutes, I told them the answer wasn’t in the book, and that they would have to develop their inferences based on their own knowledge.  They’re a pretty jaded bunch, so they looked a bit weary.  I asked who had some ideas.  A few hands went up.  I told them that anyone who had some ideas should team up with someone to discuss them.  And that anyone who didn’t have a theory should team up with someone to generate some.  They ruefully sat in pairs and conversation started.

Then it got louder, then it got animated.  I stood back while wheely chairs were spun as people impersonated rotors.  When I heard the conversation veer to a recent hockey controversy, I quietly started making a round of the room.  Conversations got back on track.  I asked groups what their theories were so far, and suggested they consider not only DC motors that could use AC, but also AC motors’ ability to use DC.  No one had considered that, and the conversation got loud again.  At the 7 minute mark, they were just getting warmed up.  I decided to let them continue.  I stood back, consciously deciding to not hover, and realized I wasn’t sure what to do.

When I teach someone how to use a tool, I am adamant about never touching it.  If I need to model how to use that tool, I get my own, but I never remove the tool from a student’s hands.  I have even been known to slap someone’s hand away if they try to remove tools from my students’ hands for any reason less serious than loss of life or limb.  So why on earth do I take the words — tools of ideas and understanding — away from my students?

Today in class I had nothing to say, was ignored and left out of the conversation, and enjoyed a moment of boredom.

Update: the most recent version of my grading policy has its own page, “How I Grade,” on a tab above.

The new assessment and reporting plan is done… for now.  Here’s the status so far.

The Rubric — Pro

If you score some level-3 or level-4 questions, you don’t get credit for them until you’ve finished the level-2 skills.  That doesn’t invalidate the more advanced work you’ve done, and you don’t necessarily have to do it all over again; it’s held in the bank, to be cashed in once the level-2 skills are complete.  This doesn’t penalize those who choose a non-linear path, but it also doesn’t let basic skills slip through the cracks.
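A toy sketch of how that banking rule could work in a gradebook script, with hypothetical skill names and a made-up data layout (this is not how SBG Gradebook actually stores things, just the logic I have in mind):

```python
# Hypothetical gradebook data: each skill has a level and a "demonstrated" flag.
skills = {
    "measure voltage with a meter":      {"level": 2, "demonstrated": False},
    "read a schematic":                  {"level": 2, "demonstrated": True},
    "choose meter vs. scope for a task": {"level": 3, "demonstrated": True},
    "design a measurement procedure":    {"level": 4, "demonstrated": True},
}

def credited(skills):
    """Level-3+ work is banked: it only pays out once every level-2 skill is done."""
    level2_done = all(s["demonstrated"] for s in skills.values() if s["level"] == 2)
    return {
        name: s["demonstrated"] and (s["level"] == 2 or level2_done)
        for name, s in skills.items()
    }

print(credited(skills))
# The level-3 and level-4 skills show up as not-yet-credited,
# because one level-2 skill is still missing.
```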

Choosing Skills — Con

Oh boy, this is definitely ridiculous.  As you can see, there are way too many skills.  It actually got worse after my first draft, peaking in version 0.6 and coming back down in the one linked above.  These guidelines helped me beat it back.  I’m telling myself that the level-2 skills will repeat in each topic, and that it won’t end up being 100 items in my gradebook.  On the other hand, this program has 6 semesters’ worth of material crammed into 4 semesters’ worth of time.  It is like being carpet-bombed with information.  And yet, when our grads get hired, there is always more their employers wish they knew.  The previous grading system didn’t create the problem; this new system will not solve it.  The whole project would be frankly impossible without SBG Gradebook, so bottom-of-my-heart thanks to Shawn Cornally and anyone else involved.

Re-assessment — Pro

Re-assessment can be initiated by me (quizzes) or by the student (by showing me that they’ve done something to improve).  Grades can go down as well as up.  I took to heart the suggestions by many people that one day per week should be chosen for reassessment.  We’re blessed with 3-hour shop periods, which is typically more time than the students need to get a lab exercise done.  So shop period isn’t just for reassessing shop things any more; you can also reassess written things then too.

Synthesis — We’ll see

Some synthesis skills I consider essential, like “determine whether the meter or the scope is the best tool for a given measurement”.  Those are level-3 skills, with their individual parts included as level-2 skills.  That means you have to do them to pass.  It also means I have to explicitly teach students not only how to use a scope and a meter, but how to “determine”.  Seriously, they don’t know.  Sometimes I weep in despair that it’s possible to graduate from high school, maybe even get a job, work for a few years, have a couple of kids, and still not know how to make a decision strategically.  (Or at least, not be able to call on that skill while you are physically located inside a classroom.)  Other days I stop tilting at windmills and start teaching it, helping students recognize situations where they have already done it, and trying to convince them that in-school and everywhere-else are not alternate universes.

Other forms of synthesis are ways of demonstrating excellence but not worth failing someone over; these become level-4 or 5 skills.  It still tells the student where they are strong and where they can improve.  It tells me and their next-semester teachers how much synthesis they’ve done.  That’s all I need.

This directly contradicts my earlier plan to let students “test out” of a skill.  But, because level 2 and level 5 are now different skills, I don’t have to write 5 versions of the question for each skill.  I think that brings the workload (for the students and me) back down to a reasonable level, allowing me to reassess throughout the term.  The quizzes are so cumulative that I don’t think an exam would add anything to the information.

Retention — Too soon to tell

It’s important to me to know how you’re doing today, not last month.  That means I reserve the right to reassess things any time, and your score could very well go down.  This is bound up with the structure of the course: AC Circuits has 6 units, each of which builds directly on the previous one (unlike a science class where, for example, unit 1 might be atoms and unit 2 might be aardvarks).  Con: a missed skill back in unit 1 will mess you up over and over.  Pro: provides lots of practice and opportunities to work the same skill from different angles.  With luck, Unit 5 will give you some insight on Unit 2 and allow you to go back and fix it up if needed.

Feedback — Pro, I think

This will be tough, because there’s not enough time.  The concepts in these courses are complex and take a long time to explain well.  The textbook is a good reference for looking up things you already know but not much good at explaining things you don’t know.  That means I talk a lot in class.  At best, I get the students participating in conversations or activities or musical re-enactments (don’t laugh — “Walk like… an ee-lec-tron” is one of my better lesson plans) but it leaves precious little time for practice problems.  I’ll try to assign a couple of problems per night so we can talk about them in class without necessarily doing them in class.

I’ve also folded extra feedback into this “weekly portfolio” approach I stole from Jason Buell.  Each student has a double-pocket folder for their list of topic skills.  There are a couple of pieces of looseleaf in the brads too.  When they’ve got something they either want feedback on (maybe some especially-troublesome practice problems that we didn’t have time to review in class) or that they want to submit, they can write a note on the looseleaf, slide the documentation into the pocket, and leave it in my mailbox.  I either do or do not agree that it sufficiently demonstrates skills X, Y, and Z, and write them back.  We did a bit of this with a work-record book last semester, and the conversations we had in writing were pretty cool.  I’m looking forward to the “message-board” as our conversation goes back and forth.  I hope to keep the same folders next year, so we can refer back to old conversations.
