You are currently browsing the category archive for the ‘Professional Development’ category.

On Thursday, two students and I will present a workshop on “Exploring Student-Designed Assessment” at the Pan-Canadian Conference on Universal Design for Learning.  If you’ll be at the conference, please join us!

We hope to take UDL’s “multiple means” to a new level: how much of the assessment strategy can students design themselves?  We’ll explore how Standards-Based Grading can be used to turn over that control, by letting students apply for reassessment when they’re ready, as often as they are ready (up to once per week), and in the format that they choose. For a more detailed exploration of how this connects to UDL philosophy, see my previous post.

Why Should Students Design their Own Assessments?

Tim Bargen, one of the students with whom I’ve co-designed this workshop, values the aspect of UDL that prioritizes offering this flexibility to all students, not just those with a diagnosis or documentation.

“The idea of ‘Tight Goals, Loose Means’ is important because [everyone] has the same freedom. When doing things differently, rather than feeling a need to hide it, it’s easier to discuss the way you are doing things with classmates and share ideas.”

He also values the opportunity to control what he spends more and less time on, and the ability to learn by reassessing skills as often as he chooses. “It allows students with differing prior knowledge to focus on the parts of the content that are difficult for them. It’s simply more efficient to ‘learn the hard way’ [by making mistakes and trying again]. It allows the student to more readily identify the areas they struggle with.”  He also points out that “reassessment” doesn’t just help you fix your mistakes in physics; it can also help you fix your “mistakes” in time management and other life skills.  “‘Mistakes’ could be taken to include those made outside of the classroom, which prevent the student from engaging in or showing up for classes.”

What do we mean by “Student-Designed Assessment,” and how do we do it?

We’ll describe 2 techniques that combine UDL and Standards-Based Grading, give participants some time to try one of those techniques, and then take questions.  You can follow the structure of the workshop using the handout, in DOC or PDF format.

1. Skill Sheets

Description and Example (screencast of handout pp 2-3)

An example of a skill sheet for an AC Circuits course. See handout and screencast for details.

Skill sheets are a tracking tool that students use to figure out what they’ve completed, what they need to work on, and what’s coming up next.

The magic happens when they are used in the context of Standards-Based Grading.  This means that students can choose for themselves when and how to demonstrate their mastery of each skill.

Description (screencast of excerpts from How I Grade)

2. Format-Independent Rubric

Description and Example (Screencast of handout pp 7-11)

If I’m going to encourage students to choose their own format for demonstrating mastery, I have to be ready for anything.  No one’s written a folk song about electrons yet, but I’m looking forward to that day.  In the meantime, I need a rubric that is as format and content independent as possible.

For example, I use a single rubric for anything that students build – whether they submit a report about it, a video about it, or demonstrate it to me in person.  It must include:

  • Predictions of electrical quantities
  • Measurements to test all predictions
  • Comparisons of predictions to measurements
  • Discussion of what happened, what could be causing it, and how it connects to other things the student has learned

This removes the burden of creating a new rubric for every new thing a student decides to do.  In the workshop, we will present example templates and invite participants to design their own.
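As a loose illustration (not the author's actual tool), the format-independence of such a rubric can be seen by treating it as a plain checklist: the same four criteria apply whether the student submits a report, a video, or a live demonstration. The scoring scheme (met / not met) is an assumption for the sake of the sketch.

```python
# Sketch of a format-independent rubric as a checklist.
# Criterion names come from the list above; the met/not-met
# scoring is a hypothetical simplification.
RUBRIC = [
    "Predictions of electrical quantities",
    "Measurements to test all predictions",
    "Comparisons of predictions to measurements",
    "Discussion of causes and connections to other learning",
]

def score(submission_meets):
    """submission_meets maps each criterion to True/False, regardless
    of whether the work was a report, a video, or a demonstration.
    Returns (number of criteria met, list of criteria still missing)."""
    missing = [c for c in RUBRIC if not submission_meets.get(c, False)]
    return len(RUBRIC) - len(missing), missing
```

Because the criteria never mention a format, nothing here changes when a student invents a new way to present their work.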

We look forward to meeting you and exploring these topics together.  See you soon!

Workshop Handout

On Thursday, two students and I will present a workshop on “Exploring Student-Designed Curriculum” at the Pan-Canadian Conference on Universal Design for Learning.  If you’ll be at the conference, please join us!

We hope to take UDL’s “multiple means” to a new level: how much of the curriculum can students design themselves?  Beyond letting students choose how they engage, to what extent can we empower students to choose what they engage with? For a more detailed exploration of how this connects to UDL philosophy, see my previous post.

Why Should Students Design the Curriculum?

Tim Bargen, one of the students with whom I’ve co-designed this workshop, offers a few thoughts.

“I don’t usually have [trouble getting engaged]; usually it’s the opposite, unless I’m depressed.  It’s not that I don’t care; it’s that the lab gave me an idea and now I’m cruising eBay looking for parts for some project, or busy tracking rabbits on Wikipedia.”

That degree of focus has made school itself an obstacle for Bargen, who describes his previous experiences with school as “depressing.”  “When I have trouble getting motivated, it’s that I’m already too far behind” because of time spent on work that can’t be submitted for credit.

“I’ve failed/withdrawn from several university programs, with this downward spiral of decreasing engagement being a major contributor. More recently, I had an instructor who was aware and understanding of this difficulty. I believe that this was at least partially responsible in (somewhat) preventing the downward spiral and decreasing engagement. Obviously, experience, ‘maturity’, medication… all had a part to play here as well, but I still think this had a significant impact. I often have difficulty falling asleep; likely something like Delayed Sleep Phase Disorder. Rather than being awake all night doing unrelated activities,  I often spent that same time interacting with the course material.”

What do we mean by “Student-Designed Curriculum,” and how do we do it?

We’ll describe 3 main techniques from the point of view of the instructor and the students, give participants some time to try one of those techniques, and then take questions.  You can follow the structure of the workshop using the handout, in DOC or PDF format.

1. Question-Generating Exercises

Description and Example (screencast of handout pp 2-3)

A caterpillar named Earl

Earl the caterpillar is a battery-powered play-dough creation with glowing spots and a spinning tail. He was a fruitful question-generating exercise.

We work through question-generating exercises at the beginning of the year, and throughout the year.  A good exercise is one that

  • can be explored independently by students with rudimentary skills (“low floor”)
  • can incorporate knowledge and experience of students with prior exposure (“high ceiling”)
  • allows students to generate questions about the course topics at their own level of complexity

In our field, a good example is “Squishy Circuits,” where students make working circuits out of play-dough. In the workshop, participants will have a chance to explore Question-Generating Exercises in their own domain.

2. Comprehension Constructor

Description and Example (Screencast of handout pp 6-10)

As the instructor, I assess all student work according to the same criteria, which are format-independent. My criteria are:

  • Do you have two convergent pieces of evidence that back up your point?
  • Are they clear enough to you that you can summarize them?
  • Has at least one of them been reviewed by an expert?
  • Can you connect it to your own experience, in or out of school?
  • Can you share how you visualize or otherwise imagine it?
  • Can you answer the question “how much?”
  • Can you answer the question “what causes it?”
  • Is it coherent with other things you have learned?

Your criteria will be different; the point is to have some, and use them consistently; Cris Tovani calls this a Comprehension Constructor.  This removes the burden of creating a new rubric for every new thing a student decides to do.  In the workshop, we will present example templates and invite participants to design their own.

3. Question-Tracking Spreadsheet

Description and Example (screencast of handout pp. 13-14)

Any question that comes up, either as part of an assignment, or during class discussion, gets added to a tracking spreadsheet.  When students are ready for a new topic, they choose from the spreadsheet.  Of course, it falls to the instructor to decide whether to offer the entire list to choose from, or to triage the questions according to which topics are required for the course and which are optional. I might also filter them according to whether they lend themselves to experiment or research, and break large questions down into smaller parts.

The spreadsheet also allows me to keep track of how well various activities do at generating questions, and on what topics.  That helps me design good Question-Generating Exercises for the beginning of the next year.  I will demonstrate the spreadsheet I use, and invite participants to consider how they might adapt it in their work.
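As a minimal sketch of the triage step described above, the question bank can be thought of as a list of tagged records that the instructor filters before handing it back to students. The field names ("topic", "required", "testable") and the sample questions are assumptions for illustration, not the author's actual spreadsheet format.

```python
# Hypothetical question bank, tagged the way the post describes:
# by topic, required vs. optional, and testable vs. research.
questions = [
    {"text": "Why does the bulb dim when I add more play-dough?",
     "topic": "resistance", "required": True, "testable": True},
    {"text": "What is inside a battery?",
     "topic": "sources", "required": True, "testable": False},
    {"text": "Could we power Earl with a lemon?",
     "topic": "sources", "required": False, "testable": True},
]

def triage(bank, required=None, testable=None):
    """Filter the bank, e.g. to offer only required, lab-testable
    questions during a shop period. None means "don't filter"."""
    keep = bank
    if required is not None:
        keep = [q for q in keep if q["required"] == required]
    if testable is not None:
        keep = [q for q in keep if q["testable"] == testable]
    return keep

lab_list = triage(questions, required=True, testable=True)
```

The same structure also supports the record-keeping mentioned above: counting questions per topic shows which activities generate the most questions, and about what.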

We look forward to meeting you and exploring these topics together.  See you soon!

Workshop Handout


Miniature Guide to Critical Thinking

How I Chose my Criteria for Critical Thinking

Discovery Learning and Teaching, David Hammer

My review of Do I Really Have to Teach Reading, by Cris Tovani

I’ve had 2 workshop proposals accepted at the second Pan-Canadian Conference on Universal Design for Learning (UDL). I’m excited to be co-designing and co-presenting with two students who are extremely knowledgeable about disability rights and disability-based accommodations in educational institutions.  I’m going to use this post to brainstorm two workshops:

  • Student-Designed Curriculum
  • Student-Designed Assessment

Comic showing a worker shovelling stairs. He says: "All these other kids are waiting to use the stairs. When I get through shoveling them off, then I will clear the ramp for you." A wheelchair user replies, "But if you shovel the ramp, we can all get in!"

If you’re not familiar with UDL, it emerged in the 90s, inspired by universal design in architecture.  In the architectural world, the goal is to design systems that are as broadly usable as possible, rather than creating “alternatives” to systems that create barriers for some users.  The classic example is about a ramp vs. stairs.  If you build stairs, you’ll need a ramp or an elevator or some other method for wheelchair users to bypass the stairs; but if you build a ramp, both walkers and wheelers can use it (along with those pushing strollers, hauling carts, using crutches, etc.).

Like any educational philosophy, people use it in lots of ways to mean lots of things, some of which contradict each other. One of the main proponents is the National Centre on Universal Design for Learning, which publishes guidelines recommending

  • Multiple means of engagement (i.e. many possible answers to “why am I learning this”)
  • Multiple means of representation (i.e. many possible ways I can access information)
  • Multiple means of action and expression (i.e. many possible ways to show what I know and can do)

I’m simplifying here, and the guidelines have evolved to include more complex ideas about executive function, self-regulation, etc.  You can see a comparison of the three versions of the guidelines as they have evolved. It looks to me like the vocabulary has gotten more complex and the order has changed, but the ideas mostly have not.  I think they intend to shift away from assuming that learners need changing, toward assuming that curriculum needs to be more accessible… I’m not totally sold that this infographic does justice to that idea.  But that’s ok.  A lot of things have gotten lumped under this umbrella, and I’m interested in a specific subset: creating learning environments where students have as much control as possible.

UDL Vs. Differentiation

Differentiated Instruction vs. Universal Design for Learning

This, for me, is what distinguishes UDL from a “differentiation” approach. Differentiation often focuses on being responsive to student difficulties caused by inaccessible materials, like a print handout; teachers have to create alternate materials (maybe providing an electronic copy) tailored to that student, and someone (probably the student) has to justify the increased work by submitting a documented diagnosis, or everyone would want it…

UDL focuses on removing barriers in the first place.  The vision that excites me is of choosing the most flexible options that inherently allow students to differentiate for themselves.  To give a slightly trivial example, if I provide electronic copies of everything to everyone, those who want print copies can have them; those who use screen readers can use them; those who use a tablet to magnify the document can do that.  That doesn’t mean the instructor no longer needs to pay attention; students will still experience barriers that we haven’t anticipated or that aren’t in our power to change (in this example, I’ll have to notice who has high-speed internet access at home and who doesn’t; who has a tablet; etc.).  But there should be fewer of them. And it should no longer depend on “proving” that you’re “needy” enough to “deserve” accommodations.  Unlike differentiation, UDL is already there before you show up, before you ask for it.  It becomes a flexibility that benefits everyone.

Standards-based grading is of course a part of that.  I  allow students to decide when they will reassess (within a certain window), in what format, and with what specific example.  They are welcome to write a quiz, but just as welcome to submit a paper, a screencast, a blog post, or a video; the subject can be an experiment, some research, an interview, etc., as long as it demonstrates their mastery of the skill in question.

Loose Means, Tight Goals… But How to Choose the Goals?

To realize the promise of UDL, I have to choose the skills with extreme care, and disaggregate them as much as possible.  Because I teach electronics circuits courses, the skills I have chosen are things like “interpret voltmeter measurements”, and “analyze a capacitor circuit”.  I have carefully removed the format from the skill; it’s not “write a lab report about a capacitor circuit.”  Should my students be required to know how to write a lab report?  Maybe.  But it’s not an outcome in my course, so it’s not a requirement in my assessment.

Can Students Decide What we Study?

I have also experimented with ways to allow students to design the curriculum.  I sometimes call this “emergent curriculum”, since it emerges from interests the students identify. But instructors often use that term to mean that the instructor inquires into the students’ learning and interests, then designs the curriculum accordingly.  I’m interested in pushing the locus of control as far toward the students as makes sense.

At the beginning of the year, I run a bunch of activities to find out what students know, what they wonder, and what they want to learn.  We make play-dough circuits, while they record and pass in a log sheet of ideas they had during the activity.  Another activity is a research prompt: “learn something about atoms you didn’t know before.”  I encourage them to record their questions as well as ideas.  All the questions go into a question bank, where I tag them according to student, topic, whether they require research or experimentation, etc.  You can see a sample below; click through to make it bigger.

Spreadsheet sample showing questions students have asked

In the next class, I bring the list of questions; everyone’s assignment is to pick one and research it.  When we head to the shop for our lab period, I bring the same list of questions but filtered for testable questions; everyone’s job is to test one.  Or make up a new one.  I’ll ask them to run it by me, but I’ve never vetoed one; at most, I might insist on special safety precautions, or if the question is going to take all month to test, I might ask them to tackle a small piece of it.

Since the standard they are trying to meet is “interpret voltmeter measurements”, it really doesn’t matter what they measure.  Later standards in the same course are about Ohm’s Law, the voltage divider rule, etc, and it might seem that, at some point, they’d have to stop playing around and turn to Chapter 3.  But that’s the beauty of it. It doesn’t matter what you measure.  Ohm’s law will be there.  That’s why it’s called a law.  In the humanities, maybe you would say “it doesn’t matter what you read, it will have a metaphor in it.” Or “it doesn’t matter what you listen to — it will have chord progressions and cadences.”
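To make the "Ohm's law will be there" point concrete, here is a sketch of the prediction-vs-measurement comparison students might do, using the voltage divider rule mentioned above. The component values and the "measured" reading are made up for illustration.

```python
# Illustration: predict a voltage-divider output with the divider
# rule, then compare against a (hypothetical) meter reading.
# No matter which circuit a student builds, this same check applies.
def divider_voltage(v_source, r1, r2):
    """Voltage divider rule: voltage across r2, with r1 and r2 in series."""
    return v_source * r2 / (r1 + r2)

predicted = divider_voltage(v_source=9.0, r1=1000.0, r2=2000.0)  # 6.0 V
measured = 5.9  # hypothetical voltmeter reading
percent_diff = abs(predicted - measured) / predicted * 100
print(f"predicted {predicted:.2f} V, measured {measured:.2f} V, "
      f"difference {percent_diff:.1f}%")
```

The prediction, the measurement, and the comparison are exactly the three ingredients the student chooses a circuit to supply; only the numbers change from one experiment to the next.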

Obviously, I triage the questions.  That goes back to the skills I’ve chosen; some of them students must master in order to pass, and some of them are optional extras.  I split the questions up accordingly, and ask people to choose from the “required” section before they choose from the “optional” section.  You can also imagine that there’s some direct instruction going on too — specific instructions on how to use a voltmeter, and required safety precautions.  Those become our class “Best Practices.”  After that, they can measure anything they want, as long as they do it “in accordance with best practices.”

Once they’ve generated data, either from research or measurement, on their chosen topics, I photocopy a class set of everyone’s results; they break into groups and see what conclusions they can draw.  That brings up more questions, and off we go again.

In this system, it doesn’t matter what experiments they run, so long as they are about circuits.  I constrain the domain by providing batteries and lightbulbs (DC sources and resistive loads), so that we don’t end up trying to deal with topics from next semester.  Sometimes those topics come up anyway, which is great.  The students can “interpret voltmeter measurements” about that as well.  You wouldn’t believe the extreme examples of heating effects this year’s students discovered!  Never seen anything like it.  And it didn’t matter.

My job does not spiral out of control trying to prepare lessons about every wacky topic they come up with.  My job is to provide a limited set of materials; if a question isn’t testable with those materials, they’ll pick another one.  My job is to photocopy their results; doesn’t matter what the results are about.  My job is to ask questions during their “peer review” sessions, to make sure they notice contradictions.  No matter what they investigate, my job doesn’t change, and the workload doesn’t get any heavier.

Why Present This at a UDL Conference?

I approached two students who are knowledgeable about disability accommodations in educational systems, and they agreed to work with me to design the workshops.  We had a wide-ranging brainstorming session, and these are the topics they thought were most important.

UDL talks about providing multiple ways for students to get motivated.  Rather than “providing” a limited set of motivations designed by me, I’m interested in supporting the infinite range of motivations students already have, based on their pre-existing interests and experience, by letting them build on any topic that relates to the course.

UDL talks about providing multiple ways for students to express their learning.  I’m interested in the infinite variety of ways they will come up with.   Because we have classroom Best Practices about measuring well and thinking well, I don’t have to make up a separate rubric for every different format a student wants to use. The same rubric applies to everything: safety, clarity, precision, causality, coherence with the evidence.


My workshop proposals are below.

  • If you only had 50 minutes to make sense of these ideas, what would you highlight?
  • What do I need to pay attention to so this doesn’t come across as only being useful if you teach science or engineering?
  • How can the format of the workshop itself be informed by UDL?
  • What does this make you wonder about?

Student-Designed Curriculum, With Rigour and Without Instructor Burnout

Can a group of students collaboratively design their own curriculum?  We say yes. One community college instructor and two students, both registered with Disability Services, will present techniques we have worked on together for four semesters. These include activities and record-keeping systems an instructor can use to map student interests and turn significant control of course content over to students.  There will be time for participants to create their own mapping format, or adapt an activity to increase its ability to help students design their path of inquiry.

UDL often focuses on the instructor’s ability to “provide” multiple means of representation, engagement, and expression.  We propose to frame our conversation around students’ power to “determine” those means, and then choose among them.  This vision of UDL puts real control over both format and content of a course into the hands of students.  It means that both instructors and students work explicitly to discover and value students’ pre-existing knowledge and outside-of-class experience, and dovetails with practices of culturally responsive pedagogy.  We will discuss our experiences of working with class groups who are taking on this responsibility, and share techniques that increase the accessibility of this practice for both students and teachers. This includes design of classroom activities, record-keeping systems for large amounts of unstructured student data, and how to do this even in institutions with conventional expectations about course outlines, etc.  Our work is partly informed by David Hammer’s ideas of “Discovery Learning and Discovery Teaching” (Cognition and Instruction, Vol 15, No. 4, 1997).  We will invite participants to explore where and how these techniques could work in their courses.

Relevant UDL guidelines:

  • Promote expectations and beliefs that optimize motivation
  • Heighten salience of goals and objectives
  • Vary demands and resources to optimize challenge
  • Foster collaboration and community
  • Optimize individual choice and autonomy
  • Optimize relevance, value, and authenticity
  • Activate or supply background knowledge
  • Offer ways of customizing the display of information
  • Offer alternatives for auditory information
  • Offer alternatives for visual information
  • Guide appropriate goal-setting
  • Use multiple media for communication
  • Use multiple tools for construction and composition
  • Vary the methods for response and navigation

How Student-Designed Assessment Can Make UDL Easier for Students and Teachers

Standards-based grading is an assessment system focused on self-assessment and strategic improvement. It encourages everyone to make low-stakes mistakes while experimenting with many formats, and to learn from these mistakes about the content and about themselves. As a team of two community college students and one instructor who have worked together for four semesters, we will describe how this system can increase both rigour and accessibility. We will also provide participants with templates to use in experimenting with SBG in their own courses.

Two students and one instructor, all of whom have struggled with and at times left post-secondary institutions, come together to discuss assessment techniques that transform our experience of formal education.  Standards-based grading (SBG) is not one single system; it is a philosophy that can encompass a variety of strategies, including student-controlled due dates, recognition of prior learning, and most importantly student-designed assessments. We describe how we use SBG to create the greatest possible freedom for ourselves and others in our classroom community.  We also discuss when self-advocacy can decrease accessibility, and what to do instead.  We will invite participants to evaluate their course outcomes, experiment with writing new ones using a rubric for SBG and UDL, and test their choices against imagined alternative assessments.

Relevant UDL Guidelines:

  • Facilitate personal coping skills and strategies
  • Develop self-assessment and reflection
  • Increase mastery-oriented feedback
  • Optimize individual choice and autonomy
  • Optimize relevance, value, and authenticity
  • Minimize threats and distractions
  • Offer ways of customizing the display of information
  • Guide appropriate goal-setting
  • Support planning and strategy development
  • Enhance capacity for monitoring progress
  • Build fluencies with graduated levels of support for practice and performance
  • Vary the methods for response and navigation

Last week, I presented a 90-minute workshop on Assessment Survival Skills during a week-long course on Assessing and Evaluating.  Nineteen people attended the workshop.  Sixteen were from the School of Trades and Technology (or related fields in other institutions).  There were lively small-group discussions about applying the techniques we discussed.

Main Ideas

  1. Awesome lesson plans can help students learn, but so can merely decent lesson plans given by well-rested, patient teachers
  2. If grading takes too long, try techniques where students correct the mistakes or write feedback to themselves
  3. If they don’t use feedback that you provide, teach students to write feedback, for themselves or each other
  4. If students have trouble working independently in shops/labs, try demonstrating the skill live, creating partially filled note-taking sheets, or using an inspection rubric
  5. If you need more or better activities and assignments quickly, try techniques where students choose, modify, or create questions based on a reference book, test bank, etc.
  6. If students are not fully following instructions, try handing out a completed sample assignment, demonstrating the skill in person, inspection reports, or correction assignments

When I asked for more techniques, the idea of challenging students to create questions that “stump the teacher” or “stump your classmates” came up twice.  Another suggestion was having students get feedback from employers and industry representatives.

Participants’ Questions

At the beginning of the workshop, participants identified these issues as most pressing.

(Photo of participants’ brainstormed list of questions)

Based on that, I focused mostly on helping students do their own corrections/feedback (#3), and how to generate practice problems quickly (#5).  Interestingly, those were the two ideas least likely to receive a value rating of 5/5 on the feedback sheets, but the ones most often reported as “new ideas.”  I think I did the right thing by skipping the techniques for helping students follow instructions (#6), since that was the idea people were most likely to describe as one they “already use regularly.”  Luckily, the techniques I focused on are very similar to the techniques for addressing all the concerns, except for a few very particular techniques for reducing student dependence on the instructor in the shop/lab (#4), which I discussed separately.  I received complete feedback sheets from 18 participants, and 16 of them identified at least one idea as both new and useful, so I’ll take that as a win.  Also, I got invited to Tanzania!

Participants talked a lot about what it’s like to have students who all have different skills, abilities, and levels of experience.  Another hot topic was how to deal with large amounts of fairly dry theory.  We talked a lot about techniques that help students assess their skills and choose what content they need to work on, so that students at all levels can challenge and scaffold themselves.  We also talked about helping students explore and choose what format they want to use to do that, as a way of increasing engagement with otherwise dry material.  I didn’t use the term, but I was curious to find out in what ways Universal Design for Learning might be the answer to questions and frustrations that instructors already have.  If I ever get the chance, as many participants requested, to expand the workshop, I think that’s the natural next step.

Feedback About the Workshop

Overall feedback was mostly positive. Examples (and numbers of respondents reporting something similar):

“Should be a required course”

“I liked the way you polled the class to find out what points to focus on,” “tailored,” “customized” (4)

“Well structured,” “Interactive” (7)

“Should be longer” (11)

“Most useful hour and a half so far” (4)

Feedback About Handout

“If someone tries to take this from me, there’s gonna be a fight!”

Feedback About Me

“Trade related information I can relate to” (4)

“High energy,” “fun,” “engaging,” “interesting” (5)

“You were yourself, didn’t feel scripted,” “Loved your style,” “Passionate” (3)

“That’s the tradesperson coming out!”

Here are the resources I’ll be using for the Peer Assessment Workshop.

Participant Handout

Participants will work through this handout during the workshop.  Includes two practice exercises: one for peer assessment of a hands-on task, and one for peer assessment of something students have written.  Click through to see the buttons to download or zoom.


Feel free to download the Word version if you like.

Workshop Evaluation

This is the evaluation form participants will complete at the end of the workshop.   I really like this style of evaluation; instead of asking participants to rank on a scale of 1-5 how much they “liked” something, it asks whether it’s useful in their work, and whether they knew it already.   This gives me a lot more data about what to include/exclude next time.  The whole layout is cribbed wholesale, with permission, from Will At Work Learning.  He gives a thorough explanation of the decisions behind the design; he calls it a “smile sheet”, because it’s an assessment that “shows its teeth.”

Click through to see the buttons to download or zoom.


Feel free to download the Word version if you like.

Other Stuff

In case they might be useful, here are my detailed presentation notes.

This week, I’ve been working on  Jo Boaler’s MOOC “How To Learn Math.”  It’s presented via videos, forum discussions, and peer assessment; registration is still open, for those who might be interested.

They’re having some technical difficulties with the discussion forum, so I thought I would use this space to open up the questions I’m wondering about.  You don’t need to be taking the course to contribute; all ideas welcome.

Student Readiness for College Math

According to Session 1, math is a major stumbling block in pursuing post-secondary education.  I’m assuming the stats are American; if you have more details about the research that generated them, please let me know!

Percentage of post-secondary students who go to 2-year colleges: 50%

Percentage of 2-year college students who take at least one remedial math course: 70%

Percentage of college remedial math students who pass the course: 10%

My Questions

The rest, apparently, leave college.  The first question we were asked was, what might be causing this?  People hazarded a wide variety of guesses.  I wonder who collected these stats, and what conclusions they drew, if any?

Math Trauma

The next topic we discussed was the unusual degree of math trauma.  Boaler says this:

“When [What’s Math Got To Do With It] came out,  I was [interviewed] on about 40 different radio stations across the US and BBC stations across the UK.  And the presenters, almost all of them, shared with me their own stories of math trauma.”

Boaler goes on to quote Kitty Dunne, reporting on Wisconsin Radio: “Why is math such a scarring experience for so many people? … You don’t hear of… too many kids with scarring English class experience.”  She also describes applications she received for a similar course she taught at Stanford, for which the 70 applicants “all wrote pretty much the same thing: that I used to be great at maths, I used to love maths, until…”

My Questions

The video describes the connection that is often assumed between math and “smartness,” as though being good at English just means you’re good at English, but being good at math means you’re “smart.”  That just raises the question: where does the assumption come from?  Is it connected to ideas from the Renaissance about science, intellectualism, or abstraction?

Stereotype Threat

There was a brief discussion of stereotype threat: the idea that students’ performance declines when they are reminded that they belong to a group stereotyped as being poor at the task at hand.  For example, when demographic questions appear at the top of a standardized math test, the gender gap in scores is much wider than when those questions aren’t asked.  The effect can also be triggered just by the framing of the task: in one interesting example, two groups of white students were given a sports-related task, and the group told it measured “natural athletic ability” performed worse than the group told nothing about what the task measured.

Boaler mentions, “researchers have found the gender and math stereotype to be established in girls as young as five years old.  So they talk about the fact that young girls are put off from engaging in math before they have even had a chance to engage in maths.”

My Questions:

How are pre-school girls picking this stuff up?  It can’t be the school system. And no, it’s not the math-hating Barbie doll (which was discontinued over 20 years ago).  I’m sure there’s the odd parent out there telling their toddlers that girls can’t do math, but I doubt that those kinds of obvious bloopers can account for the ubiquity of the phenomenon.  There are a lot of us actually trying to prevent these ideas from taking hold in our children (sisters/nieces/etc.) and we’re failing.  What are we missing?

July 22 Update: Part of what’s interesting to me about this conversation is that all the comments I’ve heard so far have been in the third person.  No one has yet identified something that they themselves did, accidentally or unknowingly, that discouraged young women from identifying with math.  I’m doing some soul-searching to try to figure out my own contributions.  I haven’t found them, but it seems like this is the kind of thing that we tend to assume is done by other people.  Help and suggestions appreciated — especially in the first person.

Interventions That Worked

Boaler describes two interventions that had a statistically significant effect.  One was in the context of a first-draft essay for which students got specific, critical feedback on how to improve.  Some students also randomly received this line at the end of the feedback: “I am giving you this feedback because I believe in you.”  Teachers did not know which students got the extra sentence.

The students who found the extra sentence in their feedback made more improvements and performed better in that essay.  They also, check this out, “achieved significantly better a year later.”  And to top it all off, “white students improved, but African-American students, they made significant improvements…”  It’s not completely clear, but she seems to be suggesting that the gap narrowed between the average scores of the two groups.

The other intervention was to ask seventh grade students at the beginning of the year to write down their values, including what those values mean to them and why they’re important.  A control group was asked to write about values that other people hold and why they thought others might hold those values.

Apparently, the students who wrote about their own values had, by the end of the year, a 40% smaller racial achievement gap than the control group.

My Questions:

Holy smoke.  This just strikes me as implausible.  A single intervention at the beginning of the year having that kind of effect months later?  I’m not doubting the researchers (nor am I vouching for them; I haven’t read the studies).  But assuming it’s true, what exactly is happening here?

Thanks to all those who participated in the Blended Learning workshop.  Below, you’ll find links to the resources we used in the workshop.  There are also resources for several topics we didn’t have time to explore.  If you have questions, comments, or suggestions, don’t hesitate to let me know, by email or by leaving a comment at the bottom of the page.


Pre-Reading Assignment: Two contrasting views of blended learning.

Cities for Educational Entrepreneurship Trust publishes this website to promote blended learning, including the Rocketship School model.  Watch the video at the top of the page.

Dan Meyer discusses the evolution of the Rocketship model.  Skip the video if you don’t have time — the article speaks for itself.

Blended Learning Basics

This article on Classifying K-12 Blended Learning, sponsored by the Innosight Institute, gives clear definitions of some of the possibilities of what blended learning could mean.

Assessing Blended Learning Techniques

If we change our teaching in the hopes of improving something, how do we check if it worked?  This video about the effectiveness of science videos proposes a few ideas.

Resources on Blogging for Teachers

See the list at left, under “I’m Reading About,” for a list of topics including educational technology, literacy, teaching science and technology, and teaching problem-solving.

Resources on Document Scanning

I’ve written a number of posts about using a phone, tablet, or camera to capture quizzes or assignments, share in-class work on the projector, etc.  See especially The Scanner In My Pocket.

Resources on Flipped Teaching

Does a flipped classroom work better with before-class videos or before-class readings?  What are the pros and cons?  Student Preparation For Class and Khan Academy Is An Indictment of Education should get you started, and lead to lots more resources.

Resources on Mind-Mapping

Maria Andersen uses Mindomo to archive links, store videos, and keep notes about games for learning in every topic from music to astronomy to economics.  I use it for annotating and archiving collections of resources that wouldn’t fit on my computer. Finally, I have an easy way to tag my bookmarks, do parameterized searches, and access them from any online device.

Resources for Reading Comprehension

Here’s the exercise from the workshop, demonstrating the difference between “skimming for the main idea” and “finding the questions.”  I included a handout I use with my students, which you can download and modify.  See also: Helping students notice where they get confused.

Some ideas about using reading instead of videos in “flipped”-style teaching.  Includes examples of the kind of thinking students were doing while reading.

Examples of “reading comprehension constructors” I’ve used in class, asking students to give examples, draw diagrams, ask questions, and the ever-popular “vocabulary bingo”.

You can read about these techniques and more in Cris Tovani’s book Do I Really Have To Teach Reading Comprehension.

Resources for Screencasting

Free software for making screencasts includes Jing (download to your PC) and Screencast-o-matic (cloud-based, no download — works well in classrooms).  Here are some screencasts I created — one to introduce a new topic, one to walk through the solution to a math problem.  Neither of those approaches was very successful — students didn’t absorb or understand the information.  On the other hand, screencasts explaining procedures in software have been a big time-saver.

Resources for SmartBoards

Eric has created some how-to videos for getting the most out of your SmartBoards.  If you’re on the NSCC network, you can access them at S:\KI Staff\Sullivan, Eric.

Resources for Making Educational Videos

Dan Meyer makes beautiful videos and gives them away.  He also shares some secrets: use a tripod.  No, seriously — that’s one of the biggest differences between great and awful.  The other is this: use the video to show phenomena, not explanations.  Get the students hungry, then let them ask for the instructions and info.  Here’s an example where he takes a weak textbook problem and shows you how to make it shine.  He writes about math but I suspect this is widely applicable.

I’m presenting a workshop on using Prezi tomorrow.  The agenda includes

  • What is Prezi, and what are its pros and cons?
  • Best practices, including how and when to zoom, pan, or rotate
  • Evaluating a topic’s structure to determine whether it’s best suited to Prezi, PowerPoint, a text document, or another medium
  • Individual experimentation with Prezi
  • Tips and tricks for efficient use

Some of the resources I’ll use are linked here.  I’ll update the list after the workshop with additional resources, as determined by the conversation and interests of participants.

Workshop Examples

Prezi Tutorials

Information Design in General

  • PRISM scandal cheekily reinterpreted as a visual design problem, including before-and-after slide redesign
  • Dan Meyer explains “Kicking Out the Cliche” in classroom presentations.  “Very little that’s worth saying can be disintegrated into staccato bullet points. If I ever found myself tending towards bullet points in any presentation, I’d start massaging them into an essay-style handout.”  Wash it down with this description of how to create great handouts.
  • Presentation Zen: Simple Ideas on Presentation Design and Delivery, by Garr Reynolds, shows techniques that non-professionals can use to dramatically increase the impact of presentation visuals.  Advocates creating handouts instead of putting text on slides.
  • Garr Reynolds (of Presentation Zen fame) explains how to eliminate anything that is not essential to visually communicating your point.
  • David McCandless’s TED talk on The Beauty of Data Visualization shows dramatic examples of how the visual aspects of information design can change our relationship to information

Information Design in Prezi

Since I’m known to experiment compulsively with Web 2.0 and ed-tech tools, I’ve been asked to present a workshop for the campus PD week on blended learning.  This is an interesting tension for me for a few reasons.

Return on Investment Often Too Low

On one hand, I try to give a fair shake to any promising tool or technique.  On the other hand, most of the software, Web 2.0, or gadgets I’ve tried didn’t make it into my ongoing practice.  Reasons include:

Bigger Gains from Assessment, Critical Thinking, and Quality Feedback

Although screencasting, “flipped classroom” experiments, and peer instruction have been helpful to me, they have not caused the massive gains in effectiveness that I got from skills-based grading, self and peer assessment, incorporating critical thinking throughout my curriculum, or shifting to inquiry-based modelling.  But, I wasn’t asked to present on those topics; I was asked to help people think about blended learning.  Planning for the workshop has been an interesting exercise in clarifying my thinking.

Blended Learning Is…

People seem to mean different things when they say “blended learning.” Some possible meanings:

Face-to-face meetings, in a group where everyone’s doing the same thing, during school hours, in classrooms, blended with

  • Learning at your own pace
  • Learning in another location
  • Learning at other times
  • Learning that does not have to be done in a specific order
  • Using a computer to learn (maybe online, maybe not)
  • Using an internet-based technology to learn
  • Learning that is customized for the student’s level
  • Learning whose pace, location, time, or order is controlled by the student

It’s hard to have a short conversation about this, because there are several independent variables.  Here are the ones I can name:

  • increasing the level of computerization
  • automating the process of providing students with work at their demonstrated level of achievement
  • increasing the data collected about student skills (naturally, computerized assessments offer different data than teacher observation…)
  • increasing the level of student control, but only in some areas (format and speed, not content)

Are We Doomed to Talk Past Each Other?

The thing I’m finding hardest to articulate is the need to disaggregate these variables.  Some advocates seem to assume that computers are the best (or only) way of adapting to student achievement, collecting data, or empowering students.  The conversation also runs afoul of the assumption that more computerization is good, because young people like computers.

Here’s my attempt at an outline for a conversation that can at least put these questions on the table.  I will provide a list of resources for participants to take away — so far, I’m thinking of including some resources on visual design (probably from dy/dan, as well as The Non-Designer’s Design Book and maybe Presentation Zen), as well as some of the posts linked above.  I’ll probably include at least one piece debunking the assumptions about “digital natives”.  Other suggestions?  If you were just starting to think about blended learning, what would you want to know more about?

The workshop is on Thursday — all feedback welcome.

Before the Workshop

  1. Watch this video about blended learning
  2. Read this blog post assessing the effectiveness of blended learning
  3. Use a feedback sheet to write a summary and keep track of questions that arise, and bring a copy with you to the workshop
  4. Use a GoogleDoc to vote on techniques you would like to know more about


  • Brainstorm in groups: What blended learning techniques have you used, if any?  What questions do you have so far?
  • Gather questions on front board

What is Blended Learning?

  • Explain common definitions
  • Ask group for other definitions
  • Explain common reasons for trying it
  • Ask group for other reasons why someone might try it
  • Each participant identifies advantages/goals they are most interested in working toward, and enters them into a worksheet
  • Discuss in small groups and modify/add to list if desired.

Examples of Blended Learning Techniques

Each presenter discusses the techniques they have used.

Participants take a moment at the end of each technique to evaluate whether it would contribute to their identified goals.

How Can We Assess the Effectiveness of Blended Learning?


Each presenter discusses the results they noticed

Your Plans

  • Invite participants to think of something in their teaching that they would like to improve, and consider if any of the tools we’ve discussed can help.
  • Participants explain their plans in small groups, and keep track of questions that come up.
  • Questions added to the class list


Return to any questions that haven’t been answered.


Each presenter passes on any recommendations they have for teachers starting to explore blended learning.  Mine:

  • Learn about visual design
  • Practice learning new software — it’s a skill and you can get better
  • Learn to program — it helps you look at computer programs with a more critical eye
  • Check out the resources included with the day’s worksheet
  • Stick around and experiment with these tools if you would like

This just in from dy/dan: Jo Boaler (Stanford prof, author of What’s Math Got to Do With It and inspiration for Dan Meyer’s “pseudocontext” series) is offering a free online course for “teachers and other helpers of math learners.”  The course is called “How To Learn Math.”

“The course is a short intervention designed to change students’ relationships with math. I have taught this intervention successfully in the past (in classrooms); it caused students to re-engage successfully with math, taking a new approach to the subject and their learning. In the 2013-2014 school year the course will be offered to learners of math but in July of 2013 I will release a version of the course designed for teachers and other helpers of math learners, such as parents…” [emphasis in original]

I’ve been disheartened this year to realize how limited my toolset is for convincing students to broaden their thinking about the meaning of math.  Every year, I tangle with students’ ingrained humiliation in the face of their mistakes and sense of worthlessness with respect to mathematical reasoning. I model, give carefully crafted feedback, and try to create low-stakes ways for them to practice analyzing mistakes, understanding why math in physics gives us only “evidence in support of a model” — not “the right answer”, and noticing the necessity for switching representations.  This is not working nearly as well as it needs to for students to make the progress they need and that I believe they are capable of.

I hope this course will give me some new ideas to think about and try, so I’ve signed up.  I’m especially interested in the ways Boaler is linking these ideas to Carol Dweck’s ideas about “mindset,” and proposing concrete ideas for helping students develop a growth mindset.

Anyone else interested?