On Thursday, two students and I will present a workshop on “Exploring Student-Designed Assessment” at the Pan-Canadian Conference on Universal Design for Learning.  If you’ll be at the conference, please join us!

We hope to take UDL’s “multiple means” to a new level: how much of the assessment strategy can students design themselves?  We’ll explore how Standards-Based Grading can be used to turn over that control, by letting students apply for reassessment when they’re ready, as often as they are ready (up to once per week), and in the format that they choose. For a more detailed exploration of how this connects to UDL philosophy, see my previous post.

Why Should Students Design their Own Assessments?

Tim Bargen, one of the students with whom I’ve co-designed this workshop, values the aspect of UDL that prioritizes offering this flexibility to all students, not just those with a diagnosis or documentation.

“The idea of ‘Tight Goals, Loose Means’ is important because [everyone] has the same freedom. When doing things differently, rather than feeling a need to hide it, it’s easier to discuss the way you are doing things with classmates and share ideas.”

He also values the opportunity to control what he spends more and less time on, and the ability to learn by reassessing skills as often as he chooses. “It allows students with differing prior knowledge to focus on the parts of the content that are difficult for them. It’s simply more efficient to ‘learn the hard way’ [by making mistakes and trying again]. It allows the student to more readily identify the areas they struggle with.”  He also points out that “reassessment” doesn’t just help you fix your mistakes in physics; it can also help you fix your “mistakes” in time management and other life skills.  “‘Mistakes’ could be taken to include those made outside of the classroom, which prevent the student from engaging in or showing up for classes.”

What do we mean by “Student-Designed Assessment,” and how do we do it?

We’ll describe two techniques that combine UDL and Standards-Based Grading, give participants some time to try one of those techniques, and then take questions.  You can follow the structure of the workshop using the handout, in DOC or PDF format.

1. Skill Sheets

Description and Example (screencast of handout pp 2-3)

An example of a skill sheet for an AC Circuits course. See handout and screencast for details.

Skill sheets are a tracking tool that students use to figure out what they’ve completed, what they need to work on, and what’s coming up next.

The magic happens when they are used in the context of Standards-Based Grading.  This means that students can choose for themselves when and how to demonstrate their mastery of each skill.

Description (screencast of excerpts from How I Grade)

2. Format-Independent Rubric

Description and Example (Screencast of handout pp 7-11)

If I’m going to encourage students to choose their own format for demonstrating mastery, I have to be ready for anything.  No one’s written a folk song about electrons yet, but I’m looking forward to that day.  In the meantime, I need a rubric that is as format- and content-independent as possible.

For example, I use a single rubric for anything that students build – whether they submit a report about it, a video about it, or demonstrate it to me in person.  It must include:

  • Predictions of electrical quantities
  • Measurements to test all predictions
  • Comparisons of predictions to measurements
  • Discussion of what happened, what could be causing it, and how it connects to other things the student has learned

This removes the burden of creating a new rubric for every new thing a student decides to do.  In the workshop, we will present example templates and invite participants to design their own.

We look forward to meeting you and exploring these topics together.  See you soon!

Workshop Handout

On Thursday, two students and I will present a workshop on “Exploring Student-Designed Curriculum” at the Pan-Canadian Conference on Universal Design for Learning.  If you’ll be at the conference, please join us!

We hope to take UDL’s “multiple means” to a new level: how much of the curriculum can students design themselves?  Beyond letting students choose how they engage, to what extent can we empower students to choose what they engage with? For a more detailed exploration of how this connects to UDL philosophy, see my previous post.

Why Should Students Design the Curriculum?

Tim Bargen, one of the students with whom I’ve co-designed this workshop, offers a few thoughts.

“I don’t usually have [trouble getting engaged]; usually it’s the opposite, unless I’m depressed.  It’s not that I don’t care; it’s that the lab gave me an idea and now I’m cruising eBay looking for parts for some project, or busy tracking rabbits on Wikipedia.”

That degree of focus has made school itself an obstacle for Bargen, who describes his previous experiences with school as “depressing.”  “When I have trouble getting motivated, it’s that I’m already too far behind” because of time spent on work that can’t be submitted for credit.

“I’ve failed/withdrawn from several university programs, with this downward spiral of decreasing engagement being a major contributor. More recently, I had an instructor who was aware and understanding of this difficulty. I believe that this was at least partially responsible in (somewhat) preventing the downward spiral and decreasing engagement. Obviously, experience, ‘maturity’, medication… all had a part to play here as well, but I still think this had a significant impact. I often have difficulty falling asleep; likely something like Delayed Sleep Phase Disorder. Rather than being awake all night doing unrelated activities,  I often spent that same time interacting with the course material.”

What do we mean by “Student-Designed Curriculum,” and how do we do it?

We’ll describe three main techniques from the point of view of the instructor and the students, give participants some time to try one of those techniques, and then take questions.  You can follow the structure of the workshop using the handout, in DOC or PDF format.

1. Question-Generating Exercises

Description and Example (screencast of handout pp 2-3)

A caterpillar named Earl

Earl the caterpillar is a battery-powered play-dough creation with glowing spots and a spinning tail. He was a fruitful question-generating exercise.

We work through question-generating exercises at the beginning of the year, and throughout the year.  A good exercise is one that

  • can be explored independently by students with rudimentary skills (“low floor”)
  • can incorporate knowledge and experience of students with prior exposure (“high ceiling”)
  • allows students to generate questions about the course topics at their own level of complexity

In our field, a good example is “Squishy Circuits” — where students make working circuits out of play-dough. In the workshop, participants will have a chance to explore Question-Generating Exercises in their own domain.

2. Comprehension Constructor

Description and Example (Screencast of handout pp 6-10)

As the instructor, I assess all student work according to the same criteria, which are format-independent. My criteria are:

  • Do you have two convergent pieces of evidence that back up your point?
  • Are they clear enough to you that you can summarize them?
  • Has at least one of them been reviewed by an expert?
  • Can you connect it to your own experience, in or out of school?
  • Can you share how you visualize or otherwise imagine it?
  • Can you answer the question “how much?”
  • Can you answer the question “what causes it?”
  • Is it coherent with other things you have learned?

Your criteria will be different; the point is to have some, and use them consistently.  Cris Tovani calls this a Comprehension Constructor.  This removes the burden of creating a new rubric for every new thing a student decides to do.  In the workshop, we will present example templates and invite participants to design their own.

3. Question-Tracking Spreadsheet

Description and Example (screencast of handout pp. 13-14)

Any question that comes up, either as part of an assignment, or during class discussion, gets added to a tracking spreadsheet.  When students are ready for a new topic, they choose from the spreadsheet.  Of course, it falls to the instructor to decide whether to offer the entire list to choose from, or to triage the questions according to which topics are required for the course and which are optional. I might also filter them according to whether they lend themselves to experiment or research, and break large questions down into smaller parts.
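
As a sketch of the triage described above, here is a minimal question list with the kind of filtering the spreadsheet supports. The field names (`topic`, `kind`, `required`) and the sample questions are my invention for illustration, not the actual spreadsheet’s columns:

```python
# A hypothetical question bank, tagged the way the post describes:
# by topic, by whether the question needs research or experiment,
# and by whether it maps onto a required course outcome.
questions = [
    {"text": "Why does the bulb dim when I add more play-dough?",
     "topic": "resistance", "kind": "experiment", "required": True},
    {"text": "Who invented the battery?",
     "topic": "history", "kind": "research", "required": False},
    {"text": "What happens if two batteries point opposite ways?",
     "topic": "sources", "kind": "experiment", "required": True},
]

# For a lab period: offer only the testable, required questions first.
lab_list = [q["text"] for q in questions
            if q["required"] and q["kind"] == "experiment"]
print(lab_list)
```

The same filter, swapped to `kind == "research"`, would produce the list for a research assignment.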

The spreadsheet also allows me to keep track of how well various activities do at generating questions, and on what topics.  That helps me design good Question-Generating Exercises for the beginning of the next year.  I will demonstrate the spreadsheet I use, and invite participants to consider how they might adapt it in their work.

We look forward to meeting you and exploring these topics together.  See you soon!

Workshop Handout


Miniature Guide to Critical Thinking

How I Chose my Criteria for Critical Thinking

Discovery Learning and Teaching, David Hammer

My review of Do I Really Have to Teach Reading, by Cris Tovani

I’ve had 2 workshop proposals accepted at the second Pan-Canadian Conference on Universal Design for Learning (UDL). I’m excited to be co-designing and co-presenting with two students who are extremely knowledgeable about disability rights and disability-based accommodations in educational institutions.  I’m going to use this post to brainstorm two workshops:

  • Student-Designed Curriculum
  • Student-Designed Assessment

Comic showing a worker shovelling stairs. He says: "All these other kids are waiting to use the stairs. When I get through shoveling them off, then I will clear the ramp for you." A wheelchair user replies, "But if you shovel the ramp, we can all get in!"

If you’re not familiar with UDL, it emerged in the 90s, inspired by universal design in architecture.  In the architectural world, the goal is to design systems that are as broadly usable as possible, rather than creating “alternatives” to systems that create barriers for some users.  The classic example is about a ramp vs. stairs.  If you build stairs, you’ll need a ramp or an elevator or some other method for wheelchair users to bypass the stairs; but if you build a ramp, both walkers and wheelers can use it (along with those pushing strollers, hauling carts, using crutches, etc.).

Like any educational philosophy, people use it in lots of ways to mean lots of things, some of which contradict each other. One of the main proponents is the National Centre on Universal Design for Learning, which publishes guidelines recommending

  • Multiple means of engagement (i.e. many possible answers to “why am I learning this”)
  • Multiple means of representation (i.e. many possible ways I can access information)
  • Multiple means of action and expression (i.e. many possible ways to show what I know and can do)

I’m simplifying here, and the guidelines have evolved to include more complex ideas about executive function, self-regulation, etc.  You can see a comparison of the three versions of the guidelines as they have evolved. It looks to me like the vocabulary has gotten more complex and the order has changed, but the ideas mostly have not.  I think they intend to shift away from assuming that learners need changing, to assuming that curriculum needs to be more accessible… I’m not totally sold that this infographic does justice to that idea.  But that’s ok.  A lot of things have gotten lumped under this umbrella, and I’m interested in a specific subset: creating learning environments where students have as much control as possible.

UDL Vs. Differentiation

Differentiated Instruction vs. Universal Design for Learning

This, for me, is what distinguishes UDL from a “differentiation” approach. Differentiation often focuses on being responsive to student difficulties caused by inaccessible materials, like a print handout; teachers have to create alternate materials (maybe providing an electronic copy) tailored to that student, and someone (probably the student) has to justify the increased work by submitting a documented diagnosis, or everyone would want it…

UDL focuses on removing barriers in the first place.  The vision that excites me is of choosing the most flexible options that inherently allow students to differentiate for themselves.  To give a slightly trivial example, if I provide electronic copies of everything to everyone, those who want print copies can have them; those who use screen readers can use them; those who use a tablet to magnify the document can do that.  That doesn’t mean the instructor no longer needs to pay attention; students will still experience barriers that we haven’t anticipated or that aren’t in our power to change (in this example, I’ll have to notice who has high-speed internet access at home and who doesn’t; who has a tablet; etc.).  But there should be fewer of them. And it should no longer depend on “proving” that you’re “needy” enough to “deserve” accommodations.  Unlike differentiation, UDL is already there before you show up, before you ask for it.  It becomes a flexibility that benefits everyone.

Standards-based grading is of course a part of that.  I allow students to decide when they will reassess (within a certain window), in what format, and with what specific example.  They are welcome to write a quiz, but just as welcome to submit a paper, a screencast, a blog post, or a video; the subject can be an experiment, some research, an interview, etc., as long as it demonstrates their mastery of the skill in question.

Loose Means, Tight Goals… But How to Choose the Goals?

To realize the promise of UDL, I have to choose the skills with extreme care, and disaggregate them as much as possible.  Because I teach electronics circuits courses, the skills I have chosen are things like “interpret voltmeter measurements”, and “analyze a capacitor circuit”.  I have carefully removed the format from the skill; it’s not “write a lab report about a capacitor circuit.”  Should my students be required to know how to write a lab report?  Maybe.  But it’s not an outcome in my course, so it’s not a requirement in my assessment.

Can Students Decide What we Study?

I have also experimented with ways to allow students to design the curriculum.  I sometimes call this “emergent curriculum”, since it emerges from interests the students identify. But instructors often use that term to mean that the instructor inquires into the students’ learning and interests, then designs the curriculum accordingly.  I’m interested in pushing the locus of control as far toward the students as makes sense.

At the beginning of the year, I run a bunch of activities to find out what students know, what they wonder, and what they want to learn.  We make play-dough circuits, while they record and pass in a log sheet of ideas they had during the activity.  Another activity is a research prompt: “learn something about atoms you didn’t know before.”  I encourage them to record their questions as well as ideas.  All the questions go into a question bank, where I tag them according to student, topic, whether they require research or experimentation, etc.  You can see a sample below; click through to make it bigger.

Spreadsheet sample showing questions students have asked

In the next class, I bring the list of questions; everyone’s assignment is to pick one and research it.  When we head to the shop for our lab period, I bring the same list of questions but filtered for testable questions; everyone’s job is to test one.  Or make up a new one.  I’ll ask them to run it by me, but I’ve never vetoed one; at most, I might insist on special safety precautions, or if the question is going to take all month to test, I might ask them to tackle a small piece of it.

Since the standard they are trying to meet is “interpret voltmeter measurements”, it really doesn’t matter what they measure.  Later standards in the same course are about Ohm’s Law, the voltage divider rule, etc, and it might seem that, at some point, they’d have to stop playing around and turn to Chapter 3.  But that’s the beauty of it. It doesn’t matter what you measure.  Ohm’s law will be there.  That’s why it’s called a law.  In the humanities, maybe you would say “it doesn’t matter what you read, it will have a metaphor in it.” Or “it doesn’t matter what you listen to — it will have chord progressions and cadences.”
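
A quick numeric illustration of “it doesn’t matter what you measure.” The readings below are invented for this sketch, all assumed to come from different student setups probing the same hypothetical 100-ohm resistor:

```python
# Invented (voltage, current) readings from three different circuits,
# all containing the same hypothetical 100-ohm resistor.
measurements = [(0.5, 0.005), (3.0, 0.030), (9.0, 0.090)]  # (volts, amps)

# Whatever the circuit, the ratio V/I for that resistor is the same.
ratios = [v / i for v, i in measurements]
print(ratios)
```

Each ratio comes out at 100 ohms: Ohm’s law shows up in the data no matter which question the student chose to test.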

Obviously, I triage the questions.  That goes back to the skills I’ve chosen; some of them students must master in order to pass, and some of them are optional extras.  I split the questions up accordingly, and ask people to choose from the “required” section before they choose from the “optional” section.  You can also imagine that there’s some direct instruction going on too — specific instructions on how to use a voltmeter, and required safety precautions.  Those become our class “Best Practices.”  After that, they can measure anything they want, as long as they do it “in accordance with best practices.”

Once they’ve generated data, either research or measurement, on their chosen topics, I photocopy a class set of everyone’s results, they break into groups, and see what conclusions they can draw.  That brings up more questions, and off we go again.

In this system, it doesn’t matter what experiments they run, so long as they are about circuits.  I constrain the domain by providing batteries and lightbulbs (DC sources and resistive loads), so that we don’t end up trying to deal with topics from next semester.  Sometimes those topics come up anyway, which is great.  The students can “interpret voltmeter measurements” about those as well.  You wouldn’t believe the extreme examples of heating effects this year’s students discovered!  Never seen anything like it.  And it didn’t matter.

My job does not spiral out of control trying to prepare lessons about every wacky topic they come up with.  My job is to provide a limited set of materials; if a question isn’t testable with those materials, they’ll pick another one.  My job is to photocopy their results; doesn’t matter what the results are about.  My job is to ask questions during their “peer review” sessions, to make sure they notice contradictions.  No matter what they investigate, my job doesn’t change, and the workload doesn’t get any heavier.

Why Present This at a UDL Conference?

I approached two students who are knowledgeable about disability accommodations in educational systems, and they agreed to work with me to design the workshops.  We had a wide-ranging brainstorming session, and these are the topics they thought were most important.

UDL talks about providing multiple ways for students to get motivated.  Rather than “providing” a limited set of motivations designed by me, I’m interested in supporting the infinite range of motivations students already have, based on their pre-existing interests and experience, by letting them build on any topic that relates to the course.

UDL talks about providing multiple ways for students to express their learning.  I’m interested in the infinite variety of ways they will come up with.   Because we have classroom Best Practices about measuring well and thinking well, I don’t have to make up a separate rubric for every different format a student wants to use. The same rubric applies to everything: safety, clarity, precision, causality, coherence with the evidence.


My workshop proposals are below.

  • If you only had 50 minutes to make sense of these ideas, what would you highlight?
  • What do I need to pay attention to so this doesn’t come across as only being useful if you teach science or engineering?
  • How can the format of the workshop itself be informed by UDL?
  • What does this make you wonder about?

Student-Designed Curriculum, With Rigour and Without Instructor Burnout

Can a group of students collaboratively design their own curriculum?  We say yes. One community college instructor and two students, both registered with Disability Services, will present techniques we have worked on together for four semesters. These include activities and record-keeping systems an instructor can use to map student interests and turn significant control of course content over to students.  There will be time for participants to create their own mapping format, or adapt an activity to increase its ability to help students design their path of inquiry.

UDL often focuses on the instructor’s ability to “provide” multiple means of representation, engagement, and expression.  We propose to frame our conversation around students’ power to “determine” those means, and then choose among them.  This vision of UDL puts real control over both format and content of a course into the hands of students.  It means that both instructors and students work explicitly to discover and value students’ pre-existing knowledge and outside-of-class experience, and dovetails with practices of culturally responsive pedagogy.  We will discuss our experiences of working with class groups who are taking on this responsibility, and share techniques that increase the accessibility of this practice for both students and teachers. This includes design of classroom activities, record-keeping systems for large amounts of unstructured student data, and how to do this even in institutions with conventional expectations about course outlines, etc.  Our work is partly informed by David Hammer’s ideas of “Discovery Learning and Discovery Teaching” (Cognition and Instruction, Vol 15, No. 4, 1997).  We will invite participants to explore where and how these techniques could work in their courses.

Relevant UDL guidelines:

  • Promote expectations and beliefs that optimize motivation
  • Heighten salience of goals and objectives
  • Vary demands and resources to optimize challenge
  • Foster collaboration and community
  • Optimize individual choice and autonomy
  • Optimize relevance, value, and authenticity
  • Activate or supply background knowledge
  • Offer ways of customizing the display of information
  • Offer alternatives for auditory information
  • Offer alternatives for visual information
  • Guide appropriate goal-setting
  • Use multiple media for communication
  • Use multiple tools for construction and composition
  • Vary the methods for response and navigation

How Student-Designed Assessment Can Make UDL Easier for Students and Teachers

Standards-based grading is an assessment system focused on self-assessment and strategic improvement. It encourages everyone to make low-stakes mistakes while experimenting with many formats, and to learn from these mistakes about the content and about themselves. As a team of two community college students and one instructor who have worked together for four semesters, we will describe how this system can increase both rigour and accessibility. We will also provide participants with templates to use in experimenting with SBG in their own courses.

Two students and one instructor, all of whom have struggled with and at times left post-secondary institutions, come together to discuss assessment techniques that transform our experience of formal education.  Standards-based grading (SBG) is not one single system; it is a philosophy that can encompass a variety of strategies, including student-controlled due dates, recognition of prior learning, and most importantly student-designed assessments. We describe how we use SBG to create the greatest possible freedom for ourselves and others in our classroom community.  We also discuss when self-advocacy can decrease accessibility, and what to do instead.  We will invite participants to evaluate their course outcomes, experiment with writing new ones using a rubric for SBG and UDL, and test their choices against imagined alternative assessments.

Relevant UDL Guidelines:

  • Facilitate personal coping skills and strategies
  • Develop self-assessment and reflection
  • Increase mastery-oriented feedback
  • Optimize individual choice and autonomy
  • Optimize relevance, value, and authenticity
  • Minimize threats and distractions
  • Offer ways of customizing the display of information
  • Guide appropriate goal-setting
  • Support planning and strategy development
  • Enhance capacity for monitoring progress
  • Build fluencies with graduated levels of support for practice and performance
  • Vary the methods for response and navigation

Early Warning Signs of Fascism

A local media outlet recently wrote

“Why the constant, often blatant lying? For one thing, it functioned as a means of fully dominating subordinates, who would have to cast aside all their integrity to repeat outrageous falsehoods and would then be bound to the leader by shame and complicity. “The great analysts of truth and language in politics” — writes McGill University political philosophy professor Jacob T. Levy — including “George Orwell, Hannah Arendt, Vaclav Havel — can help us recognize this kind of lie for what it is…. Saying something obviously untrue, and making your subordinates repeat it with a straight face in their own voice, is a particularly startling display of power over them. It’s something that was endemic to totalitarianism.”

How often does this happen in our classrooms?  How often do we require students to memorize and repeat things they actually think are nonsense?  

  • “Heavy things fall at the same speed as light things.” (Sure, whatever.)
  • “An object in motion will stay in motion forever unless something stops it.” (That’s ridiculous.  Everyone knows that everything stops eventually.  Even planets’ orbits degrade.)
  • “When you burn propane, water comes out.” (Pul-lease.)
  • The answer to “In January of the year 2000, I was one more than eleven times as old as my son William, while in January of 2009, I was seven more than three times as old as him” is somehow not “Why do you not know the age of your own kid?”
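
For the record, the age puzzle does work out, and a brute-force check makes the arithmetic painless (this sketch is mine, not part of the original problem set):

```python
# Let W be William's age in January 2000 and F the parent's age then.
# Year 2000: F = 11*W + 1
# Year 2009: F + 9 = 3*(W + 9) + 7
solutions = [(w, 11 * w + 1) for w in range(1, 120)
             if (11 * w + 1) + 9 == 3 * (w + 9) + 7]
print(solutions)  # William was 3 and the parent 34 in January 2000
```

Which, of course, does nothing to answer the students’ better question.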

Real conversation I had with a class a few years ago:

Me: what do you think so far about how weight affects the speed that things fall?

Students (intoning): “Everything falls at the same speed.”

Me: So, do you think that’s weird?

Students: No.

Me: But, this book… I can feel the heaviness in my hand.  And this pencil, I can barely feel it at all.  It feels like the book is pulling harder downward on my hand than the pencil is.  Why wouldn’t that affect the speed of the fall?

Student: “It’s not actually pulling harder.  It just feels that way, but that’s weight, not mass.”

Me: (weeps quietly)

Please don’t lecture me about the physics.  I’m aware.  Please also don’t lecture me about the terrible fake-Socratic-teaching I’m doing in that example dialogue.  I’m aware of that too.  I’m just saying that students often perceive these to contradict their lived experience, and research shows that outside of classrooms, even those who said the right things on the test usually go right back to thinking what they thought before.

And no, I’m not comparing the role of teachers to the role of Presidents or Prime Ministers.  I do realize they’re different.

Should I Conclude Any of These Things?

  1. Students’ ability to fail to retain or synthesize things that don’t make sense to them is actually a healthful and critically needed form of resistance.
  2. When teachers complain about students “just memorizing what they need for the test and forgetting it after, without trying to really digest the material,” what we are complaining about is their fascism-prevention mechanism.
  3. Teachers have the opportunity to be the “warm up,” the “opening act” — the small-scale practice ground where young minds practice repeating things they don’t believe, thinking they can safely forget them later.
  4. Teachers have the opportunity to be the “inoculation” — the small-scale practice ground where young minds can practice “honoring their dissatisfaction” in a way that, if they get confident with it, might have a chance at saving their integrity, their souls, and their democracy.

Extension Problem

Applying this train of thought to the conventional ways of doing corporate diversity training is left as an exercise for the reader.


diagram of axon terminal and dendrite

It’s that time of year again — the incoming first-year students worked through How Your Brain Learns and Remembers.  Some student comments I don’t want to lose track of:

“Of course you can grow your intelligence.  How else am I not still at the mental capacity of a newborn?”

“Where do dendrites grow to?  Why?  How do they know where to grow to?”

“I think I can grow my intelligence with lots of practice and a calm open mind!”

“Though I feel intelligence can grow, I feel it can’t grow by much.  Intelligence is your ability to learn, and you can’t just change it on the spot.”

“I think I can grow my intelligence because the older I get the smarter I get and I learn from mistakes.”

“I can definitely become more intelligent or more knowledgeable, otherwise no point in trying to learn. However, everyone is different, so each person could take more time to grow the dendrites and form the proper memories.”

“I think you can grow your intelligence through practice.”

“Do some people’s dendrites build quicker/slower and stronger/weaker than others?  Do some people’s break down quicker or slower than others?”

“You should be keeping up through the week but you probably can do homework only on weekends if you really focus.”

“People as a whole are always able to learn.  That’s what makes us the dominant species. It’s never too late to teach an old dog a new trick.”

Did you know robots can help us develop growth mindset?  It’s true.  Machine learning means that not only can robots learn, they can teach us too.  To see how, check out this post on Byrdseed.  I have no idea why watching videos of robots making mistakes is so funny, but my students and I were all in helpless hysterics after the first minute of this one…

After a quick discussion to refresh our memories about growth mindset and fixed mindset (which I introduced in the fall using this activity), I followed Ian’s suggestion to have the students write letters to the robot.  One from each mindset.  I collated them into two letters (shown below), which I will bring back to the students tomorrow.  All of this feeds into a major activity about writing good quality feedback, and the regular weekly practice of students writing feedback to themselves on their quizzes.

I didn’t show the second minute of the video until after everyone had turned in their letters.  But I like Ian’s suggestion of doing that later in the week and writing two new letters… where the fixed mindset has to take it all back.

Fixed Mindset

Dear robot, try not flipping pancakes.  Just stop, you suck.  Why don’t you find a better robot to do it for you? You are getting worse.  There is no chance for improvement.  Give up, just reprogram yourself, you’ll hurt someone. Perhaps you weren’t meant to flip pancakes. Try something else.  Maybe discus throwing.

Growth Mindset

Dear robot, please keep trying to flip the pancake. At least it left the pan on attempt 20.  Go take a nap and try again tomorrow.  Practise more. Don’t feel bad, I can’t flip pancakes.  Keep working, and think of what can help.  I see that you’re trying different new techniques and that’s making you get closer. Maybe try another approach.  Would having another example help? Is there someone who could give you some constructive feedback? Or maybe have a way to see the pancake, like a motion capture system. That would help you keep track of the pancake as it moves through the air. Keep going, I believe in you!

Siobhan Curious inspired me to organize my thoughts so far about meta-cognition with her post “What Do Students Need to Learn About Learning.” Anyone want to suggest alternatives, additions, or improvements?

Time Management

One thing I’ve tried is to allow students to extend their due dates at will — for any reason or no reason.  The only condition is that they notify me before the end of the business day *before* the due date.  This removes the motivation to inflate or fabricate reasons — since they don’t need one.  It also promotes time management in two ways: one, it means students have to think one day ahead about what’s due.  If they start an assignment the night before it’s due and realize they can’t finish it for some reason, the extension is not available; so they get into the habit of starting things at least two days before the due date.  It’s a small improvement, but I figure it’s the logical first baby step!

The other way it promotes time management is that every student’s due dates end up being different, so they have to start keeping their own calendar — they can’t just ask a classmate, since everyone’s got custom due dates.  I can nag about the usefulness of using a calendar until the cows come home, but this provides a concrete motivation to do it.  This year I realized that my students, most of them of the generation that people complain is “always on their phones”, don’t know how to use their calendar app.  I’m thinking of teaching this explicitly next semester — especially showing them how to keep separate “school” and “personal” calendars so they can be displayed together or individually, and also why it’s useful to track both the dates work is due and the blocks of time when they actually plan to work on it.

Relating Ideas To Promote Retention

My best attempt at this has been to require it on tests and assignments: “give one example of an idea we’ve learned previously that supports this one,” or “give two examples of evidence from the data sheet that support your answer.”  I accept almost any answers here, unless they’re completely unrelated to the topic, and the students’ choices help me understand how they’re thinking.

Organizing Their Notes

Two things I’ve tried are handing out dividers at the beginning of the semester, one per topic… and creating activities that require students to use data from previous weeks or months.  I try to start this immediately at the beginning of the semester, so they get in the habit of keeping things in their binders, instead of tossing them in the bottom of a locker or backpack.  The activities seem to work better than the dividers… although I’d like to be more intentional about helping them “file” assignments and tests in the right section of their binders when they get passed back.  This also (I hope) helps them develop methodical ways of searching through their notes for information, which I think many students are unfamiliar with because they are so used to being able to press Ctrl-F.  Open-notes tests also help motivate this.

I also explicitly teach how and when to use the textbook’s table of contents vs index, and give assignments where they have to look up information in the text (or find a practice problem on a given topic), which is surprisingly hard for my first-year college students!

Dealing With Failure

Interestingly, I have students who have so little experience with failure that they’re not skilled in dealing with it, and others who have experienced failure so consistently that they seem to have given up even trying to deal with it.  It’s hard to help both groups at the same time.  I’m experimenting with two main activities here: the Marshmallow Challenge and How Your Brain Learns and Remembers (based on ideas similar to Carol Dweck’s “growth mindset”).

Absolute Vs Analytical Ways of Knowing

I use the Foundation for Critical Thinking’s “Miniature Guide To Critical Thinking.”  It’s short, I can afford to buy a class set, and it’s surprisingly useful.  I introduce the pieces one at a time, as they become relevant.  See p. 18 for the idea of “multi-system thinking”; it’s their way of pointing out that the distinction between “opinions” and “facts” doesn’t go far enough, because most substantive questions require us to go beyond right and wrong answers into making a well-reasoned judgment call about better and worse answers — which is different from an entirely subjective and personal opinion about preference.  I also appreciate their idea that “critical thinking” means “using criteria”, not just “criticizing.”  And when class discussions get heated or confrontational, nothing helps me keep myself and my students focused better than their “intellectual traits” (p. 16 of the little booklet, also available online here; my struggles, failures, and successes are somewhat documented under Evaluating Thinking).

What the Mind Does While Reading

This is one of my major obsessions.  So far the most useful resources I have found are books by Cris Tovani, especially Do I Really Have to Teach Reading? and I Read It, but I Don’t Get It.  Tovani is a teacher educator who describes herself as having been functionally illiterate for most of her school years.  Both books are full of concrete lesson ideas and handouts that can be photocopied.  I created some handouts based on her exercises that are available for others to download — such as the Pencil Test and the “Think-Aloud.”

Ideas About Ideas

While attempting these things, I’ve gradually learned that many of the concepts and vocabulary items about evaluating ideas are foreign to my students.  Many students don’t know words like “inference”, “definition”, “contradiction” (yes, I’m serious), or my favourite, “begging the question.”  So I’ve tried to weave these into everything we do, especially by using another Tovani-inspired technique — the “Comprehension Constructor.”  The blank handout is below, for anyone who’d like to borrow it or improve it.

To see some examples of the kinds of things students write when they do it, click through:


The first-year students are solving series circuits and explaining what’s happening.  Most are able to connect their answers and thoughts to evidence we’ve gathered this semester.  But most are struggling with the questions about causality.

For each effect they describe mathematically, I ask them to explain what is physically capable of causing that effect. Or, they can choose to explain why the result seems like it can’t be happening.  It doesn’t have to be canonical, but it must be internally consistent, not circular, and supported by our evidence.  They are struggling most with explaining Kirchhoff’s Voltage Law.  This is understandable — I don’t think I could explain it heuristically either.  However, only one student took the opportunity to say why it doesn’t make sense.
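For readers who haven’t seen it, Kirchhoff’s Voltage Law says the signed voltage changes around any closed loop sum to zero; in a single-source series circuit, that reduces to the drops across the components adding up to the supply voltage.  A standard statement (my notation, not taken from the course materials):

```latex
% Kirchhoff's Voltage Law: around any closed loop,
% the signed voltage changes sum to zero.
\sum_{k} V_k = 0
% For a series loop with one source and two resistors,
% carrying the same current I everywhere:
V_{\text{source}} = V_{R_1} + V_{R_2} = I R_1 + I R_2
```

It follows from conservation of energy per unit charge, which is exactly the connection the students quoted below are wrestling with.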

We’ve done lots of practice writing cause statements.  They know what “begging the question” means.  I’ve modelled, and we’ve practised, the importance of saying “I don’t know” when that’s the most accurate thing we can say. Examples of student thinking are below.

I’m tempted to propose a taxonomy of acausal strategies.  Which examples of student thinking do you think fit where?  Would you add or remove categories?   Could you propose some pithy names for them?

  1. It does that because it’s designed to do that
  2. It does that because if it didn’t, this other important thing wouldn’t happen
  3. It does that because there’s a law that says it has to do that
  4. It does that because it does that (begging the question)
  5. It does that just because

Here are the examples of student thinking:

“An electron has to use up all its energy that it gets from the battery.  This is caused because if all of the energy wasn’t used, the circuit wouldn’t give accurate results, or work properly.”

“When electrons pass through a component, that causes them to lose energy.  The electrons would have to be able to flow through the circuit in order to keep the current and battery functioning.”

“An electron has to use up all the energy it gets from the battery.  This is caused because if the voltage from the power source is 5V, the electrons have to use up all of their energy, in this case they use up all of it in the resistor (except for the little energy used in the switch).”

“The electrons always use up exactly the energy they gain in the battery because of conservation of energy.”

“It doesn’t make sense that if there’s only one component in the circuit, it always uses up exactly the battery’s voltage.  A higher resistor should be like a steeper hill — harder for the electrons to get past, and requiring more energy.”

I wrote last month about new approaches I’m using to find out what students think, keep track of who thinks what, and let the curriculum be guided by student curiosity. When Dan Meyer reblogged it recently, an interesting conversation started in the comments on that site.  The question seems to be, “how is this different from common practices?”  It sparked my thinking, so I thought I’d continue here.  If you’re a new reader, welcome.

Just Formative Assessment?

It may be helpful to know that I’m teaching a community college course on the basics of electricity.  The students come in brimming with questions, assumptions, and ideas about how electricity works in their lives — phone chargers, car batteries, electric fences, solar panels.  And all new knowledge gets judged immediately in that court of everyday life. What I’m trying to do better is to discover students’ pre-existing ideas and questions, especially the ones I wouldn’t have anticipated.

I agree that there is a way in which this is nothing new; in a way, it’s the definition of formative assessment.

Many formative assessments inquire into students’ thinking as a T/F question: did they get it, yes or no?  Others ask the question as if it’s multiple choice: are their ideas about motion Aristotelian, Newtonian, or something else? (See Hestenes’ work leading to the Force Concept Inventory).  Some assessments focus on misconceptions: which of these mistaken ways of thinking are causing their problems?  Typically there is some instruction or exercise or activity, and then we try to find out what they got out of it.  Or maybe it’s a pre-assessment, and we use the information to address and correct misconceptions.

I’m trying to shift to essay questions: not “Do they think correctly?” but “What do they think?”  I’m trying to shift it to a different domain: not “what do they think about how this topic was just taught in this class” but “what have they ever thought about this topic, in all the parts of their lives, and how can we weave them together?”  I also hope to ask it for a different reason: not just “which parts of their ideas are correct” but also “which parts of their pre-existing ideas are most likely to lead to insight or perplexity?”

As Dan points out, there is a “part 2”: This isn’t just about shifting what I do (keep a spreadsheet where I record student ideas and questions, tagged by topic and activity they were working on when they asked it).  It’s also about shifting my self-assessment.  The best activities aren’t just the ones that help students solve problems; the best assessments yield the most honest student thinking.
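The spreadsheet described above could be approximated in code.  Here’s a minimal sketch — all names, fields, and example entries are my own invention, not the author’s actual format — that records each student idea tagged by topic and activity, then groups the ideas by topic so they can be pulled up when planning a lesson:

```python
from collections import defaultdict

# Each record is one student idea, tagged with the topic and
# the activity the student was working on when they raised it.
# (Hypothetical entries, paraphrasing ideas quoted in this post.)
rows = [
    {"student": "A", "topic": "charging", "activity": "shake-flashlight demo",
     "idea": "Does shaking transfer electrons like friction does?"},
    {"student": "B", "topic": "batteries", "activity": "battery lab",
     "idea": "Can you charge a battery from a lower-voltage source?"},
    {"student": "A", "topic": "batteries", "activity": "battery lab",
     "idea": "Batteries don't get heavier when charged."},
]

# Group ideas by topic, so that when a topic comes up in class
# the relevant student questions are one lookup away.
by_topic = defaultdict(list)
for row in rows:
    by_topic[row["topic"]].append(row["idea"])

print(by_topic["batteries"])
```

A real version would live in an actual spreadsheet, of course; the point is only that tagging each idea at the moment it’s recorded is what makes the later filtering-by-topic cheap.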

Which of the activities in your curriculum would you rank highest on that scale?

What do you think makes them work?

Pros: Student Honesty and Motivation

This year, I’ve got a better handle not only on who holds which ideas, but also on which ideas are half-digested, applied inconsistently or in a self-contradictory way, and on what the students are curious about.  Some examples of what they’ve been wondering:

The flashlights you shake to charge — do they work like how friction can transfer electrons from a cat’s fur to a glass rod?

What happens if you try to charge a battery but the volts are lower than the battery you’re trying to charge?

Batteries don’t get heavier when you charge them — that is evidence that electrons don’t weigh much.

For example, if I was looking for a way into the superposition theorem, I couldn’t ask for better than this.

Cons: Fear and Conflict

I’ve written extensively about the fear, anger, conflict, and defensiveness that come to the surface when I encourage students to build a practice of constant re-evaluation, rather than certainty.  What are your suggestions for helping students re-evaluate things when they’re sure they already know it? What are your suggestions for helping students notice when common-sense pre-conceptions and new ideas aren’t talking to each other?

Bonus points: what are your suggestions for helping teachers re-evaluate things when we’re sure we already know it?  What about for helping teachers notice when our common sense pre-conceptions and new ideas aren’t talking to each other?

Why Am I Obsessed With This?

This is the fear that keeps me awake at night:

The students in the first example had learned in class not to discuss certain aspects of their own ideas or models. In particular, they had learned not to talk about “What things are like?” …

The students in my second and third examples had learned that their ideas were worthless (and confusing to think about).

The problem with (some) guided inquiry like this is the illusion of learning. Instructors doing these kinds of “check outs” can convince themselves that students are building powerful scientific models, but really students are just learning not to share any ideas that might be wrong, not to have conversations that they aren’t supposed to have, and to hide interesting questions and insights that are outside the bounds of the “guided curriculum”.

At the end of the day, if students are learning to avoid taking intellectual risks around the instructor, that instructor doesn’t stand a chance of helping those students learn.

(Read the whole thing from Brian Frank)

Which kinds of assessments do you think discourage students from taking intellectual risks around the instructor?  My gut feeling is that anything along the lines of “elicit-confront-resolve” is a major contributor, but I hope that having more data to look at will help me confirm this.

Pros: I Get Honest and Motivated Too

To be clear, I’m not suggesting that no one else has ever done this.  It’s common to ask students “how were you thinking it through”, such as when discussing a mistake they made on a test.

I don’t want to just do it, though.  I want to do it better than I did last year.  I want to systematically keep track of student ideas and, together with the students, use those ideas to co-create the curriculum. Even the wrong ideas.  Especially the wrong ideas.  I want them to see what’s good in their well-thought-out, evidence-based “wrong” answers, and see what’s weak about poorly-thought-out, unsubstantiated “right” answers.  I want them to do the same for the ideas of their classmates, especially the ideas they don’t share.

It means that sometimes we go learn about a different topic.  If they’re generating curiosity and insight about parallel circuits, I’m not going to force them to shift to series circuits.  It wastes momentum (not to mention goodwill… or what you might call “engagement” or “motivation”).  They know what the goal of the course is; they’ve paid good money and invested their time in reaching that goal.  We come up with a plan together of what it makes sense to learn about next, so that we move closer to the goal.

Want to help me improve?  Here’s the help I could really use.   If you were one of the people whose first reaction to my original post was “I already know that” — either I already know that to be true, or I already know that to be false… what would have helped you respond with curiosity and perplexity, adding your idea as a valuable one of many?  If that was your response, what made it work?

I’ve been looking for new ways every year to turn over a bit more control to the students, to help them use that control well, and to strike a balance between my responsibility for their safety (in their schoolwork and their future jobs) and my responsibility for their personal and collective self-determination.

One tiny change I made this year is to use more “portfolio-style” assessments.  If you work for the same institution I do, you know that “portfolio” can mean a bewildering variety of things… I’m using it here in the concrete sense used by artists and architects.  So far this semester, that looks like doing in-class exercises where students work on 3-5 examples of the same thing. For example, our first lab about circuits required students to hook up 3 circuits, using batteries, light bulbs, and switches, and draw what they had built.  On the second lab day, I asked them to build the same circuits again, based on their sketches, and add measurements of voltage, current, and resistance.  On the third day, they practised interpreting the results, using sentence prompts.

But the “assignment” wasn’t “hook up a circuit.”  The skills I was assessing were “Interpret ohmmeter results”, “Interpret voltmeter results”, “Document a circuit”, etc.  So I asked them to choose from among the circuits they had worked on, and let me know which one (or two) best showed their abilities.

I haven’t reviewed the submissions yet, but I’m anticipating that they’ll need feedback not only on the skill of interpreting a circuit but also on the skill of self-assessment.

In support of this, I’ve had students evaluate the data gathered by the entire class.  Part of my hope is that seeing each other’s work and noticing what makes it easier or harder to make sense of will help them better assess their own work.  What suggestions do you have for helping students get better at choosing which of their work best demonstrates their skills?