Last week, I presented a 90-minute workshop on Assessment Survival Skills during a week-long course on Assessing and Evaluating. Nineteen people attended the workshop. Sixteen were from the School of Trades and Technology (or related fields in other institutions). There were lively small-group discussions about applying the techniques we discussed.
- Awesome lesson plans can help students learn, but so can merely decent lesson plans given by well-rested, patient teachers
- If grading takes too long, try techniques where students correct the mistakes or write feedback to themselves
- If they don’t use feedback that you provide, teach students to write feedback, for themselves or each other
- If students have trouble working independently in shops/labs, try demonstrating the skill live, creating partially filled note-taking sheets, or using an inspection rubric
- If you need more or better activities and assignments quickly, try techniques where students choose, modify, or create questions based on a reference book, test bank, etc.
- If students are not fully following instructions, try handing out a completed sample assignment, demonstrating the skill in person, inspection reports, or correction assignments
When I asked for more techniques, the idea of challenging students to create questions that “stump the teacher” or “stump your classmates” came up twice. Another suggestion was having students get feedback from employers and industry representatives.
At the beginning of the workshop, participants identified these issues as most pressing.
Based on that, I focused mostly on helping students do their own corrections/feedback (#3), and on how to generate practice problems quickly (#5). Interestingly, those were the two ideas least likely to receive a value rating of 5/5 on the feedback sheets — but also the most often reported as “new ideas”. I think I did the right thing by skipping the techniques for helping students follow instructions (#6), since that was the idea people were most likely to describe as one they “already use regularly.” Luckily, the techniques I focused on are very similar to those for addressing the other concerns, except for a few very particular techniques for reducing student dependence on the instructor in the shop/lab (#4), which I discussed separately. I received completed feedback sheets from 18 participants, and 16 of them identified at least one idea as both new and useful, so I’ll take that as a win. Also, I got invited to Tanzania!
Participants talked a lot about what it’s like to have students who all have different skills, abilities, and levels of experience. Another hot topic was how to deal with large amounts of fairly dry theory. We talked a lot about techniques that help students assess their skills and choose what content they need to work on, so that students at all levels can challenge and scaffold themselves. We also talked about helping students explore and choose and what format they want to use to do that, as a way of increasing engagement with otherwise dry material. I didn’t use the term, but I was curious to find out in what ways Universal Design for Learning might be the answer to questions and frustrations that instructors already have. If I ever get the chance, as many participants requested, to expand the workshop, I think that’s the natural next step.
Feedback About the Workshop
Overall feedback was mostly positive. Examples (and numbers of respondents reporting something similar):
“Should be a required course”
“I liked the way you polled the class to find out what points to focus on,” “tailored,” “customized” (4)
“Well structured,” “Interactive” (7)
“Should be longer” (11)
“Most useful hour and a half so far” (4)
Feedback About Handout
“If someone tries to take this from me, there’s gonna be a fight!”
Feedback About Me
“Trade related information I can relate to” (4)
“High energy,” “fun,” “engaging,” “interesting” (5)
“You were yourself, didn’t feel scripted,” “Loved your style,” “Passionate” (3)
“That’s the tradesperson coming out!”
Here are the resources I’ll be using for the Peer Assessment Workshop.
Participants will work through this handout during the workshop. Includes two practice exercises: one for peer assessment of a hands-on task, and one for peer assessment of something students have written. Click through to see the buttons to download or zoom.
Feel free to download the Word version if you like.
This is the evaluation form participants will complete at the end of the workshop. I really like this style of evaluation; instead of asking participants to rank on a scale of 1-5 how much they “liked” something, it asks whether it’s useful in their work, and whether they knew it already. This gives me a lot more data about what to include/exclude next time. The whole layout is cribbed wholesale, with permission, from Will At Work Learning. He gives a thorough explanation of the decisions behind the design; he calls it a “smile sheet”, because it’s an assessment that “shows its teeth.”
In case they might be useful, here are my detailed presentation notes.
This week, I’ve been working on Jo Boaler’s MOOC “How To Learn Math.” It’s presented via videos, forum discussions, and peer assessment; registration is still open, for those who might be interested.
They’re having some technical difficulties with the discussion forum, so I thought I would use this space to open up the questions I’m wondering about. You don’t need to be taking the course to contribute; all ideas welcome.
Student Readiness for College Math
According to Session 1, math is a major stumbling block in pursuing post-secondary education. I’m assuming the stats are American; if you have more details about the research that generated them, please let me know!
Percentage of post-secondary students who go to 2-year colleges: 50%
Percentage of 2-year college students who take at least one remedial math course: 70%
Percentage of college remedial math students who pass the course: 10%
The rest, apparently, leave college. The first question we were asked was, what might be causing this? People hazarded a wide variety of guesses. I wonder who collected these stats, and what conclusions they drew, if any?
The next topic we discussed was the unusual degree of math trauma. Boaler says this:
“When [What’s Math Got To Do With It] came out, I was [interviewed] on about 40 different radio stations across the US and BBC stations across the UK. And the presenters, almost all of them, shared with me their own stories of math trauma.”
Boaler goes on to quote Kitty Dunne, reporting on Wisconsin Radio: “Why is math such a scarring experience for so many people? … You don’t hear of… too many kids with scarring English class experience.” She also describes applications she received for a similar course she taught at Stanford, in which the 70 applicants “all wrote pretty much the same thing: that I used to be great at maths, I used to love maths, until …”.
The video describes the connection that is often assumed between math and “smartness,” as though being good at English just means you’re good at English, but being good at math means you’re “smart.” That just raises the question: where does that assumption come from? Is it connected to ideas from the Renaissance about science, intellectualism, or abstraction?
There was a brief discussion of stereotype threat: the idea that students’ performance declines when they are reminded that they belong to a group stereotyped as poor at the task. For example, when demographic questions appear at the top of a standardized math test, the gender gap in scores is much wider than when those questions aren’t asked. The effect can also be triggered just by the framing of the task: in one interesting example, white students who were told a sports-related task measured “natural athletic ability” performed worse than white students who were not told what the task measured.
Boaler mentions, “researchers have found the gender and math stereotype to be established in girls as young as five years old. So they talk about the fact that young girls are put off from engaging in math before they have even had a chance to engage in maths.”
How are pre-school girls picking this stuff up? It can’t be the school system. And no, it’s not the math-hating Barbie doll (which was discontinued over 20 years ago). I’m sure there’s the odd parent out there telling their toddlers that girls can’t do math, but I doubt that those kinds of obvious bloopers can account for the ubiquity of the phenomenon. There are a lot of us actually trying to prevent these ideas from taking hold in our children (sisters/nieces/etc.) and we’re failing. What are we missing?
July 22 Update: Part of what’s interesting to me about this conversation is that all the comments I’ve heard so far have been in the third person. No one has yet identified something that they themselves did, accidentally or unknowingly, that discouraged young women from identifying with math. I’m doing some soul-searching to try to figure out my own contributions. I haven’t found them, but it seems like this is the kind of thing that we tend to assume is done by other people. Help and suggestions appreciated — especially in the first person.
Interventions That Worked
Boaler describes two interventions that had a statistically significant effect. One was in the context of a first-draft essay for which students got specific, critical feedback on how to improve. Some students also randomly received this line at the end of the feedback: “I am giving you this feedback because I believe in you.” Teachers did not know which students got the extra sentence.
The students who found the extra sentence in their feedback made more improvements and performed better in that essay. They also, check this out, “achieved significantly better a year later.” And to top it all off, “white students improved, but African-American students, they made significant improvements…” It’s not completely clear, but she seems to be suggesting that the gap narrowed between the average scores of the two groups.
The other intervention was to ask seventh grade students at the beginning of the year to write down their values, including what they mean to that student and why they’re important. A control group was asked to write about values that other people had and why they thought others might have those values.
Apparently, the students who wrote about their own values had, by the end of the year, a 40% smaller racial achievement gap than the control group.
Holy smoke. This just strikes me as implausible. A single intervention at the beginning of the year having that kind of effect months later? I’m not doubting the researchers (nor am I vouching for them; I haven’t read the studies). But assuming it’s true, what exactly is happening here?
Thanks to all those who participated in the Blended Learning workshop. Below, you’ll find links to the resources we used in the workshop. There are also resources for several topics we didn’t have time to explore. If you have questions, comments, or suggestions, don’t hesitate to let me know, by email or by leaving a comment at the bottom of the page.
Pre-Reading Assignment: Two contrasting views of blended learning.
Cities for Educational Entrepreneurship Trust publishes this website to promote blended learning, including the Rocketship School model. Watch the video at the top of the page.
Dan Meyer discusses the evolution of the Rocketship model. Skip the video if you don’t have time — the article speaks for itself.
Blended Learning Basics
This article on Classifying K-12 Blended Learning, sponsored by the Innosight Institute, gives clear definitions of several models of what blended learning can mean.
Assessing Blended Learning Techniques
If we change our teaching in the hopes of improving something, how do we check if it worked? This video about the effectiveness of science videos proposes a few ideas.
Resources on Blogging for Teachers
See the list at left, under “I’m Reading About,” for a list of topics including educational technology, literacy, teaching science and technology, and teaching problem-solving.
Resources on Document Scanning
I’ve written a number of posts about using a phone, tablet, or camera to capture quizzes or assignments, share in-class work on the projector, etc. See especially The Scanner In My Pocket.
Resources on Flipped Teaching
Does a flipped classroom work better with before-class videos or before-class readings? What are the pros and cons? Student Preparation For Class and Khan Academy Is An Indictment of Education should get you started, and lead to lots more resources.
Resources on Mind-Mapping
Maria Andersen uses Mindomo to archive links, store videos, and keep notes about games for learning in every topic from music to astronomy to economics. I use it for annotating and archiving collections of resources that wouldn’t fit on my computer. Finally, I have an easy way to tag my bookmarks, do parameterized searches, and access them from any online device.
Resources for Reading Comprehension
Here’s the exercise I demonstrated during the workshop, illustrating the difference between “skimming for the main idea” and “finding the questions.” I included a handout I use with my students, which you can download and modify. See also: helping students notice where they get confused.
Some ideas about using reading instead of videos in “flipped”-style teaching. Includes examples of the kind of thinking students were doing while reading.
Examples of “reading comprehension constructors” I’ve used in class, asking students to give examples, draw diagrams, ask questions, and the ever-popular “vocabulary bingo”.
You can read about these techniques and more in Cris Tovani’s book Do I Really Have To Teach Reading Comprehension.
Resources for Screencasting
Free software for making screencasts includes Jing (download to your PC) and Screencast-o-matic (cloud-based, no download — works well in classrooms). Here are some screencasts I created — one to introduce a new topic, one to walk through the solution to a math problem. Neither of those approaches was very successful — students didn’t absorb or understand the information. On the other hand, screencasts explaining procedures in software have been a big time-saver.
Resources for SmartBoards
Eric has created some how-to videos for getting the most out of your SmartBoards. If you’re on the NSCC network, you can access them at S:\KI Staff\Sullivan, Eric.
Resources for Making Educational Videos
Dan Meyer makes beautiful videos and gives them away. He also shares some secrets: use a tripod. No, seriously — that’s one of the biggest differences between great and awful. The other is this: use the video to show phenomena, not explanations. Get the students hungry, then let them ask for the instructions and info. Here’s an example where he takes a weak textbook problem and shows you how to make it shine. He writes about math but I suspect this is widely applicable.
I’m presenting a workshop on using Prezi tomorrow. The agenda includes
- What is Prezi, and what are its pros and cons?
- Best practices, including how and when to zoom, pan, or rotate
- Evaluating a topic’s structure to determine whether it’s best suited to Prezi, PowerPoint, a text document, or another medium
- Individual experimentation with Prezi
- Tips and tricks for efficient use
Some of the resources I’ll use are linked here. I’ll update the list after the workshop with additional resources, as determined by the conversation and interests of participants.
- A Prezi adaptation of How the Brain Learns and Remembers that we will use to compare PowerPoint, text, and Prezi
- A PowerPoint adaptation of the same thing
- A text adaptation of the same thing (PDF)
- Prezi basics (screencast, 1:40)
- How to use multi-media like a pro (prezi)
- How to import a PowerPoint presentation into Prezi (screencast, 1:00)
- Prezi keyboard shortcuts (article)
Information Design in General
- PRISM scandal cheekily reinterpreted as a visual design problem, including before-and-after slide redesign
- Dan Meyer explains “Kicking Out the Cliche” in classroom presentations. “Very little that’s worth saying can be disintegrated into staccato bullet points. If I ever found myself tending towards bullet points in any presentation, I’d start massaging them into an essay-style handout.” Wash it down with this description of how to create great handouts.
- Presentation Zen: Simple Ideas on Presentation Design and Delivery, by Garr Reynolds, shows techniques that non-professionals can use to dramatically increase the impact of presentation visuals. Advocates creating handouts instead of putting text on slides.
- Garr Reynolds (of Presentation Zen fame) explains how to eliminate anything that is not essential to visually communicating your point.
- David McCandless’s TED talk on The Beauty of Data Visualization shows dramatic examples of how the visual aspects of information design can change our relationship to information
Information Design in Prezi
- Dan Steer explains criteria to help you figure out how to zoom and pan to reinforce the structure of the topic while avoiding sea-sickness (see his comment below for some more advanced tips)
- A selection of Prezis, some of them professionally-designed
- Designer Maria Andersen and illustrator Mat Moore team up to use Prezi as an artistic medium. See, for example, “Levers of Change in Higher Education,” “How Can We Measure Teaching and Learning In Math,” and “Future-Proof Your Education”
- The great people at Prometis show us what not to do
Since I’m known to experiment compulsively with Web 2.0 and ed-tech tools, I’ve been asked to present a workshop for the campus PD week on blended learning. This is an interesting tension for me for a few reasons.
Return on Investment Often Too Low
On one hand, I try to give a fair shake to any promising tool or technique. On the other hand, most of the software, Web 2.0 tools, and gadgets I’ve tried didn’t make it into my ongoing practice. Reasons include:
- Time spent teaching the tool exceeded time gained for learning
- The tool helped students become better consumers, not better makers or troubleshooters
- The tool caused students to become more engaged in watching, instead of more engaged in doing
- The tool increased students’ interaction with the gadget, but decreased their interaction with each other and the physics all around them
Bigger Gains from Assessment, Critical Thinking, and Quality Feedback
Although screencasting, “flipped classroom” experiments, and peer instruction have been helpful to me, they have not caused the massive gains in effectiveness that I got from skills-based grading, self and peer assessment, incorporating critical thinking throughout my curriculum, or shifting to inquiry-based modelling. But, I wasn’t asked to present on those topics; I was asked to help people think about blended learning. Planning for the workshop has been an interesting exercise in clarifying my thinking.
Blended Learning Is…
People seem to mean different things when they say “blended learning.” Some possible meanings:
Face-to-face meetings, in a group where everyone’s doing the same thing, during school hours, in classrooms, blended with
- Learning at your own pace
- Learning in another location
- Learning at other times
- Learning that does not have to be done in a specific order
- Using a computer to learn (maybe online, maybe not)
- Using an internet-based technology to learn
- Learning that is customized for the student’s level
- Learning whose pace, location, time, or order is controlled by the student
It’s hard to have a short conversation about this, because there are several independent variables. Here are the ones I can name:
- increasing the level of computerization
- automating the process of providing students with work at their demonstrated level of achievement
- increasing the data collected about student skills (naturally, computerized assessments offer different data than teacher observation…)
- increasing the level of student control, but only in some areas (format and speed, not content)
Are We Doomed to Talk Past Each Other?
The thing I’m finding hardest to articulate is the need to disaggregate these variables. Some advocates seem to assume that computers are the best (or only) way of adapting to student achievement, collecting data, or empowering students. The conversation also runs afoul of the assumption that more computerization is good, because young people like computers.
Here’s my attempt at an outline for a conversation that can at least put these questions on the table. I will provide a list of resources for participants to take away — so far, I’m thinking of including some resources on visual design (probably from dy/dan, as well as The Non-Designer’s Design Book and maybe Presentation Zen), as well as some of the posts linked above. I’ll probably include at least one piece debunking the assumptions about “digital natives”. Other suggestions? If you were just starting to think about blended learning, what would you want to know more about?
The workshop is on Thursday — all feedback welcome.
Before the Workshop
- Watch this video about blended learning
- Read this blog post assessing the effectiveness of blended learning
- Use a feedback sheet to write a summary and keep track of questions that arise, and bring a copy with you to the workshop
- Use a GoogleDoc to vote on techniques you would like to know more about
- Brainstorm in groups: What blended learning techniques have you used, if any? What questions do you have so far?
- Gather questions on front board
What is Blended Learning?
- Explain common definitions
- Ask group for other definitions
- Explain common reasons for trying it
- Ask group for other reasons why someone might try it
- Each participant identifies advantages/goals they are most interested in working toward, and enters them into a worksheet
- Discuss in small groups and modify/add to list if desired.
Examples of Blended Learning Techniques
Each presenter discusses the techniques they have used.
Participants take a moment at the end of each technique to evaluate whether it would contribute to their identified goals
How Can We Assess the Effectiveness of Blended Learning?
- Show Veritasium’s video on effectiveness of science videos
- Student confidence
- Student enjoyment
- Student performance
Each presenter discusses the results they noticed
- Invite participants to think of something in their teaching that they would like to improve, and consider if any of the tools we’ve discussed can help.
- Participants explain their plans in small groups, and keep track of questions that come up.
- Questions added to the class list
Return to any questions that haven’t been answered.
- Each presenter passes on any recommendations they have for teachers starting to explore blended learning. Mine:
- Learn about visual design
- Practice learning new software — it’s a skill and you can get better
- Learn to program — it helps you look at computer programs with a more critical eye
- Check out the resources included with the day’s worksheet
- Stick around and experiment with these tools if you would like
This just in from dy/dan: Jo Boaler (Stanford prof, author of What’s Math Got to Do With It and inspiration for Dan Meyer’s “pseudocontext” series) is offering a free online course for “teachers and other helpers of math learners.” The course is called “How To Learn Math.”
“The course is a short intervention designed to change students’ relationships with math. I have taught this intervention successfully in the past (in classrooms); it caused students to re-engage successfully with math, taking a new approach to the subject and their learning. In the 2013-2014 school year the course will be offered to learners of math but in July of 2013 I will release a version of the course designed for teachers and other helpers of math learners, such as parents…” [emphasis in original]
I’ve been disheartened this year to realize how limited my toolset is for convincing students to broaden their thinking about the meaning of math. Every year, I tangle with students’ ingrained humiliation in the face of their mistakes and sense of worthlessness with respect to mathematical reasoning. I model, give carefully crafted feedback, and try to create low-stakes ways for them to practice analyzing mistakes, understanding why math in physics gives us only “evidence in support of a model” — not “the right answer”, and noticing the necessity for switching representations. This is not working nearly as well as it needs to for students to make the progress they need and that I believe they are capable of.
I hope this course will give me some new ideas to think about and try, so I’ve signed up. I’m especially interested in the ways Boaler is linking these ideas to Carol Dweck’s ideas about “mindset,” and proposing concrete ideas for helping students develop a growth mindset.
Anyone else interested?
As the year winds down, I’m starting to pull out some specific ideas that I want to work on over the summer/next year. The one that presses on me the most is “readiness.” In other words,
- What is absolutely non-negotiable that my students should be able to do or understand when they graduate?
- How do I make sure they get the greatest opportunity to learn those things?
- How do I make sure no one graduates without those things? And most frustratingly,
- How do I reconcile the student-directedness of inquiry learning with the requirements of my diploma?
Some people might disagree that some of these points are worth worrying about. If you don’t teach in a trade school, these questions may be irrelevant or downright harmful. K-12 education should not be a trade school. Universities do not necessarily need to be trade schools (although arguably, the professional schools like medicine and law really are, and ought to be). However, I DO teach in a trade school, so these are the questions that matter to me.
Training that intends to help you get a job is only one kind of learning, but it is a valid and important kind for those who choose it. It requires as much rigour and critical thinking as anything else, which becomes clear when we consider the faith we place in the electronics technicians who service elevators and aircraft. If my students are inadequately prepared in their basic skills, they (or someone else, or many other people) may be injured or die. Therefore, I will have no truck with the intellectual gentrification that treats “vocational” as a dirty word. Whether students are prepared for their jobs is a question of the highest importance to me.
In that light, my questions about job-readiness have reached the point of obsession. To be a technician is to inquire: to search, to question, to notice inconsistencies, to distinguish between conditions that can and cannot possibly be the cause of a particular fault. But teaching my students to inquire means they must actually inquire, and I can’t force that to happen at a particular speed (although I can cut it short, offer fewer opportunities, etc.). At the same time, I have given my word that if they give me two years of their time, they will have skills X, Y, and Z that are required to be ready for their jobs. I haven’t found the balance yet.
I’ll probably write more about this as I try to figure it out. In the meantime, Grant Wiggins is writing about a recent study that found a dramatic difference between high-school teachers’ assessment of students’ college readiness, and college profs’ assessment of the same thing. Wiggins directs an interesting challenge to teachers: accurately assess whether students are ready for what’s next, by calibrating our judgement against the judgement of “whatever’s next.” In other words, high school teachers should be able to predict what fraction of their students are adequately prepared for college, and that number should agree reasonably well with the number given by college profs who are asked the same question. In my case, I should be able to predict how well prepared my students are for their jobs, and my assessment should match reasonably the judgement of their first employer.
In many ways I’m lucky: we have a Program Advisory Group made up of employer representatives who meet to let us know what they need. My colleagues and I have all worked between 15 and 25 years in our field. I send all my students on 5-week unpaid work terms. During and after the work terms, I meet with the student and the employer, and get a chance to calibrate my judgement. There’s no question that this is a coarse metric; the reviews are influenced by how well the student is suited to the culture of a particular employer, and their level of readiness in the telecom field might be much higher than if they worked on motor controls. Sometimes employers’ expectations are unreasonably high (like expecting electronics techs to also be mechanics). There are some things employers may or may not expect that I am adamant about (for example, that students have the confidence and skill to respond to sexist or racist comments). But overall, it’s a really useful experience.
Still, I continue to wonder about the accuracy of my judgement. I also wonder about how to open this conversation with my colleagues. It seems like something it would be useful to work on together. Or would it? The comments on Wiggins’ post are almost as interesting as the post itself.
It seems relevant that most commenters are responding to the problem of students’ preparedness for college, while Wiggins is writing about a separate problem: teachers’ unfounded level of confidence about students’ preparedness for college.
The question isn’t, “why aren’t students prepared for college.” It’s also not “are college profs’ expectations reasonable.” It’s “why are we so mistaken about what college instructors expect?”
My students, too, often miss this kind of subtle distinction. It seems that our students aren’t the only ones who suffer from difficulty with close reading (especially when stressed and overwhelmed).
Wiggins calls on teachers to be more accurate in our assessment, and to calibrate our assessment of college-readiness against actual college requirements. I think these are fair expectations. Unfortunately, assessment of students’ college-readiness (or job-readiness) is at least partly an assessment of ourselves and our teaching.
A similar problem is reported about college instructors. The study was conducted by the Foundation for Critical Thinking with both education faculty and subject-matter faculty who instruct teacher candidates. They write that many profs are certain that their students are leaving with critical thinking skills, but that most of those same profs could not clearly explain what they meant by critical thinking, or give concrete examples of how they taught it.
Self-assessment is surprisingly intractable; it can be uncomfortable and can elicit self-doubt and anxiety. My students, when I expect them to assess their work against specific criteria, exhibit all the same anger, defensiveness, and desire to change the subject as seen in the comments. Most of them literally can’t do it at first. It takes several drafts and lots of trust that they will not be “punished” for admitting to imperfection. Carol Dweck’s work on “growth mindset” comes to mind here… is our collective fear of admitting that we have room to grow a consequence of “fixed mindset”? If so, what is contributing to it? In that light, the punitive aspects of NCLB (in the US) or similar systemic teacher blaming, isolation, and lack of integrated professional development may in fact be contributing to the mis-assessment reported in the study, simply by creating lots of fear and few “sandboxes” of opportunity for development and low-risk failure. As for the question of whether education schools are providing enough access to those experiences, it’s worth taking a look at David Labaree’s “The Trouble with Ed School.”
One way to increase our resilience during self-assessment is to do it with the support of a trusted community — something many teachers don’t have. For those of us who don’t, let’s brainstorm about how we can get it, or what else might help. Inaccurate self-assessment is understandable but not something we can afford to give up trying to improve.
I’m interested in commenter I Hodge’s point about the survey questions. The reading comprehension question allowed teachers to respond that “about half,” “more than half,” or “all, or nearly all” of their students had an adequate level of reading comprehension. In contrast, the college-readiness question seems to have required a teacher to select whether their students were “well,” “very well,” “poorly,” or “very poorly” prepared. This question has no reasonable answer, even if teachers are only considering the fraction of students who actually do make it to college. I wonder why they posed those two questions so differently?
Last but not least, I was surprised that some people blamed college admissions departments for the admission of underprepared students. Maybe it’s different in the US, but my experience here in Canada is that admission is based on having graduated high school, or having gotten a particular score in certain high school courses. Whether under-prepared students got those scores because teachers under-estimated the level of preparation needed for college, or because of rigid standards, standardized tests, or other systemic problems, I don’t see how colleges can fix that, other than by administering an entrance test. Maybe that’s more common than I know, but neither the school at which I teach nor the well-reputed university that I (briefly) attended had one. Maybe using a high school diploma as the entrance exam for college/university puts conflicting requirements on the K-12 system? I really don’t know the answer to this.
Wiggins recommends regularly bringing together high-school and college faculty to discuss these issues. I know I’d be all for it. There is surely some skill-sharing that could go back and forth, as well as discussions of what would help students succeed in college. Are we ready for this?
Michael Pershan kicked my butt recently with a post about why teachers tend to plateau in skill after their third year, connecting it to Cal Newport’s ideas such as “hard practice” (and, I would argue, “deep work”).
Michael distinguishes between practice and hard practice, and wonders whether blogging belongs on his priority list:
“Hard practice makes you better quickly. Practice lets you, essentially, plateau. …Put it like this: do you feel like you’re a 1st year teacher when you blog? Does your brain hurt? Do you feel as if you’re lost, unsure how to proceed, confused? If not, you’re not engaged in hard practice.”
Ooof. On one hand, it made me face my desire to avoid hard practice; I feel like I’ve spent the last 8 months trying to decrease how much I feel like that. I’ve tried to create classroom procedures that are more reusable and systematic, especially for labs, whiteboarding sessions, class discussions, and model presentations.
It’s a good idea to periodically take a hard look at that avoidance, and decide whether I’m happy with where I stand. In this case, I am. I don’t think the goal is to “feel like a first year teacher” 100% of the time; it’s not sustainable and not generative. But it reminds me that I want to know which activities make me feel like that, and consciously choose some to seek out.
Michael makes this promise to himself:
It’s time to redouble my efforts. I’m halfway through my third year, and this would be a great time for me to ease into a comfortable routine of expanding my repertoire without improving my skills.
I’m going to commit to finding things that are intellectually taxing that are central to my teaching.
It made me think about what my promises are to myself.
Be a Beginner
Do something every summer that I don’t know anything about and document the process. Pay special attention to how I treat others when I am insecure, what I say to myself about my skills and abilities, and what exactly I do to fight back against the fixed-mindset that threatens to overwhelm me. Use this to develop some insight into what exactly I am asking from my students, and to expand the techniques I can share with them for dealing with it.
Last summer I floored my downstairs. The summer before that I learned to swim — you know, with an actual recognizable stroke. In both cases, I am proud of what I accomplished. In the process, I was amazed to notice how much concentration it took not to be a jerk to myself and others.
Learn More About Causal Thinking
I find myself being really sad about the ways my students think about causality. On one hand, I think my recent dissections of the topic are a prime example of “misconceptions listening” — looking for the deficit. I’m pretty sure my students have knowledge and intuition about cause that I can’t see, because I’m so focused on noticing what’s going wrong. In other words, my way of noticing students’ misconceptions is itself a misconception. I’d rather be listening to their ideas fully, doing a better job of figuring out what’s generative in their thinking.
What to do about this? If I believe that my students need to engage with their misconceptions and work through them, then that’s probably what I need too. There’s no point in my students squashing their misconceptions in favour of “right answers”; similarly, there’s no point in me squashing my sadness and replacing it with some half-hearted “correct pedagogy.”
Maybe I’m supposed to be whole-heartedly happy to “meet my students where they are,” but if I said I was, I’d be lying. (That phrase has been used so often to dismiss my anger at the educational malpractice my students have endured that I can’t even hear it without bristling). I need to midwife myself through this narrow way of thinking by engaging with it. Like my students, I expect to hold myself accountable to my observations, to good-quality reasoning, to the ontology of learning and thinking, and to whatever data and peer feedback I can get my hands on.
My students’ struggle with causality is the puzzle from which my desire for explanation emerged; it is the source of the perplexity that makes me unwilling to give up. I hope that pursuing it honestly will help me think better about what it’s like when I ask my students to do the same.
Interact with New Teachers
Talking with beginning teachers is better than almost anything else I’ve tried for forcing me to get honest about what I think and what I do. There’s a new teacher in our program, and talking things through with him has been a big help in crystallizing my thoughts (mutually useful, I think). I will continue doing this and documenting it. I also put on a seminar on peer assessment for first-year teachers last summer; it was one of the more challenging lesson plans I’ve ever written. If I have another chance to do this, I will.
Work for Systemic Change
I’m not interested in strictly personal solutions to systemic problems. I won’t have fun, or meet my potential as a teacher, if I limit myself to improving me. I want to help my institution and my community improve, and that means creating conditions and communities that foster change in collective ways. For two years, I tried to do a bit of this via my campus PD committee; for various reasons, that avenue turned out not to lead in the directions I’m interested in going. I’ve had more success pressing for awareness and implementation of the Workplace Violence Prevention regulations that are part of my local jurisdiction’s Occupational Health and Safety Act.
I’m not sure what the next project will be, but I attended an interesting seminar a few months ago about our organization’s plans for change. I was intrigued by the conversations happening about improving our internal communication. I’ve also had some interesting conversations recently with others who want to push past the “corporate diversity” model toward a less ahistorical model of social justice or cultural competence. I’ll continue to explore those to find out which ones have some potential for constructive change.
Design for Breaks
I can’t do this all the time or I won’t stay in the classroom. I know that now. As of the beginning of January, I’ve reclaimed my Saturdays. No work on Saturdays. It makes the rest of my week slightly more stressful, but it’s worth it. For the first few weeks, I spent the entire day alternately reading and napping. Knowing that I have that to look forward to reminds me that the stakes aren’t as high as they sometimes seem.
I’m also planning to go on deferred leave for four months starting next January. After that, I’ve made it a priority to find a way to work half-time. The kind of “intellectually taxing” enrichment that I need, in order for teaching to be satisfying, takes more time than is reasonable on top of a full-time job. I’m not willing to permanently sacrifice my ability to do community volunteer work, spend time with my loved ones, and get regular exercise. That’s more of a medium-term goal, but I’m working a few leads already.
Anyone have any suggestions about what I should do with 4 months of unscheduled time starting January 2014?
I just received a notice from the American Society for Engineering Education about a free online PD project for faculty who teach introductory engineering science. It’s called Advancing Engineering Education Through Virtual Communities of Practice, and they’ve just extended the application deadline to Feb. 8. Participants can choose from these topics:
- Electric circuits
- Mass & energy balance
I can’t tell if you have to be a member of an engineering department, or if it’s enough to teach one of these topics; I can’t even tell if you have to be American. In any case, I applied. From what I can tell, accepted applicants participate in once-weekly online meetings with facilitators who have experience with “research-based instructional approaches” (though they don’t tell you which ones, except for references to “Outcome-Based Education” — which I think of as an assessment approach, not exactly an instructional approach).
I suppose I should be concerned about the lack of details on the website (even the application deadline on the front page hasn’t been changed to reflect the extension), but I’m chalking it up to this being the prototype run, and anyway, the price is right. The informed consent form makes it clear that this is a research project to explore the viability of the model, which is fine by me. It’ll be worth it if it leads to any of these things:
- Working on instructional changes in a systematic way (rather than the somewhat haphazard and occasionally accidental way I’ve been doing it so far)
- Focusing on the specific ways particular instructional approaches play out in circuits courses, not to mention deepening my content knowledge
- Having a consistent group to work with over the course of 6 months (and two different academic years).
It seems to bring together the advantages of something like the Global Physics Department, with the bonus that every meeting will be about exactly what I teach, and the meeting time will be a part of my scheduled workday.
The email I received from the ASEE contains details that are not available on the website, so I’m including it below.
NSF-funded project to develop engineering faculty virtual communities of practice
Engineering education research has shown that many research-based instructional approaches improve student learning, but these have not diffused widely because faculty members find it difficult to acquire the required knowledge and skills by themselves and then to sustain the on-going implementation efforts without continued encouragement and support.
ASEE with a grant from NSF is organizing several web-based faculty communities that will work to develop the group’s understanding of research-based instructional approaches and then support individual members as they implement self-selected new approaches in their classes. Participants should be open to this new technology-based approach and see themselves as innovators in a new approach to professional development and continuous improvement.
The material below and the project website provide more information about these communities and the application process. Questions should be addressed to Rocio Chavela at email@example.com.
If you are interested in learning about effective teaching approaches and working with experienced mentors and collaborating colleagues as you begin using these in your classroom, you are encouraged to apply to this program. If you know of others that may be interested, please share this message with them.
Please consider applying for this program and encouraging potentially interested colleagues to apply. Applications are due by February 8, 2013.
Additional Details About the Program
Faculty groups, which will effectively become virtual communities of practice (VCP) with 20 to 30 members, will meet weekly at a scheduled time using virtual meeting software during the second half of the Spring 2013 Semester and during the entire Fall 2013 Semester. Each group will be led by two individuals that have implemented research-based approaches for improving student learning, have acquired a reputation for innovation and leadership in their course area, and have completed a series of training sessions to prepare them to lead the virtual communities. Since participants will be expected to begin utilizing some of the new approaches with the help and encouragement of the virtual group, they should be committed to teaching a course in the targeted area during the Fall 2013 Semester.
VCP Topics and Meeting Times
This year’s efforts are focusing on the introductory engineering science courses and the list below shows the course areas along with the co-leaders and the scheduled times for each virtual community:
Co-leaders are Lisa Huettel and Kenneth Connor
Meeting time is Thursday at 1:30 – 3:00 p.m. EST starting March 21, 2013 and running until May 16, 2013
Co-leaders are Brian Self and Edward Berger
Meeting time is Thursday at 1:30 – 3:00 p.m. EST starting April 3, 2013 and running until May 16, 2013
Co-leaders are John Chen and Milo Koretsky
Meeting time is Wednesday at 2:00 – 3:30 p.m. EST starting April 3, 2013 and running until May 23, 2013
Mass and Energy Balance
Co-leaders are Lisa Bullard and Richard Zollars
Meeting time is Thursday at 12:30 – 2:00 p.m. EST starting March 21, 2013 and running until May 16, 2013
Interested individuals should complete the on-line application at https://www.research.net/s/asee-vcp_application_form. The application form asks individuals to describe their experience with introductory engineering science courses, to indicate their involvement in education research and development activities, to summarize any classroom experiences where they have tried something different in their classes, and to discuss their reasons for wanting to participate in the VCP.
The applicant’s Department Head or Dean needs to complete an on-line recommendation form to indicate plans for having the applicant teach the selected courses in the Fall 2013 Semester and to briefly discuss why participating in the VCP will be important to the applicant.
Since the project aims to demonstrate that the VCP approach will benefit relatively inexperienced faculty, applicants do not need a substantial record of involvement in education research and development. For this reason, the applicant’s and the Department Head’s or Dean’s statements about the reasons for participating will be particularly important in selecting participants.
Applications are due by February 8, 2013. The project team will review all applications and select a set of participants that are diverse in their experience, institutional setting, gender, and ethnicity.