I’m thinking about how to make assessments even lower-stakes, especially quizzes. Currently, any quiz can be re-attempted at any point in the semester, with no penalty in marks. To apply for reassessment, a student must correct the original quiz and complete two practice problems. (FYI, mastery can also be demonstrated in any alternate format in lieu of a quiz, but students rarely choose that option.)
The upside of requiring practice problems is eliminating the brute-force approach where students just keep randomly trying quizzes in the hope of eventually showing mastery (this doesn’t work, but it wastes a lot of time). It also introduces some self-assessment into the process. We practise writing good-quality feedback, including trying to figure out what caused the mistake in the first place.
The downside is that the workload in our program is really unreasonable (dear employers of electronics technicians, if you are reading this: most hard-working beginners cannot go from zero to meeting your standards in two years. Please contact me to discuss). So students are really upset about having to do two practice problems. I try to sell it as “customized homework”: since I no longer assign homework practice problems, they are effectively exempting themselves from the “homework” in any area where they have already demonstrated proficiency. The students don’t buy it, though. They put huge pressure on themselves to get things right the first time so they won’t have to do any practice. That, of course, sours our classroom culture and makes it harder for them to think well.
I’m considering a couple of options. One is, when they write a quiz, to ask them whether they are submitting it to be evaluated or just for feedback. Again, it promotes self-assessment: am I ready? Am I confident? Is this what mastery looks and feels like?
If they’re submitting for feedback, I won’t enter it into the gradebook, and they don’t have to submit practise problems when they try it next (but if they didn’t succeed that time, it would be back to practising).
Another option is simply to chuck the practice-problem requirement. I could ask for a corrected quiz and good-quality diagnostic feedback (written by the students to themselves) instead. It would be a shame, since the practice really does benefit them, but I’m wondering if it’s worth it.
All suggestions welcome!
The teacher’s skill sheet was a success (thanks, Dan). Today was our third day with the first-year students, and my first time explaining skills-based grading to an incoming class. Our reassessment period is Thursdays from 2:30–4:30, so in this morning’s shop class I dropped a skill sheet on their benches and we started using it. By the time I started explaining how I grade this afternoon, they already had a skill signed off.
I handed out their skills folders and the first two skill sheets for DC circuits. You should have seen their jaws drop when I explained that they can choose if, when, and how often they reassess. They asked great questions and gave thoughtful answers. We talked about how everyone progresses, the many ways of getting extra help, learning at your own pace, and the infinite ways of demonstrating improvement or proficiency. They wanted to know what is proof of improvement (required when applying for reassessment), and had suggestions (quiz corrections, practice problems, written explanations). They wanted to know what level 5 questions are, where to find some, and how to prevent them from getting too big. Many of them had ideas in mind already and we bounced those around to see if they meet the criteria (at least two skills, and you have to choose the problem-solving approach yourself, so it can’t be the same as something we’ve done in class).
We talked about how and why you can’t get credit for level 4 until you’ve completed level 3. I explained it in terms of employers’ expectations about basic skills. One student explained it back to me in terms of “levelling up your character” in role-playing games. We talked about feedback, from me and from themselves. I gave examples of feedback that does and does not help you improve (“I need to figure out why V and I are different” compared to “I don’t get it.”). We talked about how many points homework is worth (none). My get-to-know-you survey tells me there are a lot of soccer players in the room, so we talked about practices and push-ups. “Do you get points in the league standings for showing up to practice? What about for going to the gym?” I asked. Of course they said no. “So why do it if it’s not worth points?” They got this right away. “It helps you win the game.” “It makes you stronger.”
I enjoyed this conversation:
Student A: “So homework is just for learning.”
Me: “What are you talking about? I thought homework was for sucking up to the teacher.”
Student B: “I thought so too. That’s why I never did it.”
Student C: “I thought homework was for keeping kids in their homes at night.”
Once the questions had died down, I gave them a copy of a skills sheet that looks just like the ones I use to assess them, except that all the skills relate to my teaching. I asked them to sign and date next to any items they had evidence that I had done. I did this so I could find out if they really understood how to use the thing. But it had unexpectedly positive side-effects. From a quick glance, they could tell that I was going to get a “failing” grade. It never occurred to me that they would be upset by this.
They had barely started reading when I started hearing gasps. “You’re failing!” someone called out. “Is our assessment of you going to affect your assessment of us?” someone else half-joked. “Of course I’m not passing yet,” I replied reasonably. “It’s the second day of class. There’s no possible way I could have done 60% of my job by now. That’s how it works: you start at 1, then you move up to 2.” I walked around and peeked over shoulders to make sure they got the mechanics of what to fill in where. I stopped a couple of times to talk to people who seemed to have overly generous assessments. “How have I demonstrated that?” I asked.
We reviewed it together. We got to practise technical reading in tiny, learning-outcome-sized pieces. The highly condensed text on a skill sheet changes meaning if you miss a preposition. Another unexpected side-effect: my students had noticed me doing things that I hadn’t noticed myself. They had evidence to support most of their claims, too. There were a few claims I disagreed with, because I had only demonstrated part of the skill, and I modelled the kind of feedback that my “teacher” could have given me to help me improve.
Overall, they seemed very concerned about my feelings about “failing”; we calculated my current topic score at 0.5/5 and filled in the bar graph on the front of the skill sheet with today’s date. I got a chance to model a growth mindset. I made sure to let them see how proud I am of having achieved a 0.5 in only two days’ work, and mentioned that this is an improvement over two days ago, when I had a zero. The usual running commentary of tongue-in-cheek jibes had a disarmingly earnest, reassuring tone. “I know that you can improve your score the next time you reassess,” one student said. Another student chimed in with “feel free to drop in to my office anytime if you want to get some feedback.”
As I get ready to launch into my second September, I’ve gone over the feedback from last spring. If you’ve been reading since the beginning, you know that last December half of my class was failing and the rest were bored. There was a lot of “why do we have to learn this?” and “is this on the test?”
By the end of this semester, no one failed, and there were some remarkable changes in our classroom culture. One of my colleagues said “when I check labs now, they show me which findings they think are important, instead of waiting for me to tell them what important things they should have found.”
I did some informal evaluations (I stole these questions from Robert Talbert at Casting Out Nines, and they worked well). I started getting feedback that sounded like this:
What do you like/dislike about the grading system?
Like: Keep trying skills until you understand it
I’ve actually grown pretty fond of the skill system. I like that you actually make us demonstrate our knowledge of the individual skills, it actually helps me remember better sometimes, specially when going over quizzes. The only thing I don’t like is that to get a skill checked off, mainly in the shop, it can take a long time.
The grading system works very well although I think using the skills for every aspect of the course is a little too flexible. Using skills for the lab and going back to regular marked assignments. I need more room, I will talk to you later.
Skills for quiz bad idea. I had no ambition to study for test/quizzes. I like the shop skills tho.
I dislike the unstructured feel of it, simply because I do better with the assignments/tests, but I do like the ability to retest on a skill if you don’t get it the first time.
Independent learning project was fantastic and incredibly valuable in the long run.
I really appreciate you trying something new, and already there is a huge improvement. I hope you continue to innovate and improve the system.
I think the skills are very straight forward, they let us know exactly what you’re looking for.
It all encourages independence, which is great, unless you’re unmotivated.
There needs to be more communication.
Without marks to fuel my ego, I lost my drive to excel.
I think it helps focus more on the important stuff, and less on just completing useless lab stuff.
I was able to learn more with a smaller [work] load. This gave me time to play and experiment, by approaching labs in a way that was helpful to me.
Yes. It’s taken the good parts out of the lab book and made them easy to learn.
It certainly kept me on my toes to make sure that I understood what was needed to do the labs and the tests.
Yes. Previously, I would be missing a small piece of the “puzzle,” this way I know what I need to do.
What do you LOVE about this course?
A lot more feedback this semester, understand concepts easier
The learning environment, the flexibility…
I love that I am actually doing well in this course…
The ability to work at your own pace (even though you have to remember not to procrastinate)
Designing my own labs
I feel that education has in general become stagnant, and I was delighted to have a teacher who was willing to try something new. I know this takes courage and a lot of hard work. Having 25% of my mark based on a project I was able to pick and have it graded in a way that suited me was a blast.
All the freedom
The instruction and the easy feeling that one understands what is being taught.
I liked the independent learning project, even if I had been a bit too ambitious in my designs and dreams
What do you HATE about this course?
I wouldn’t say I hate anything really except there’s a lot of work sometimes.
Quizzes! don’t do well on them, if get one part wrong, all wrong
Other students asking questions on things we have already covered in class, then interrupting the instructor when trying to respond
If you could change ONE THING about this course, what would it be?
More level 5 questions on tests. It is necessary to go above and beyond to get 100% on most modules.
Give assignment due dates.
More availability with students during lab time.
Harder deadlines, required milestones for the self-directed project
Level 5 questions: being bonus because sometimes difficult or busy time schedules to get one ready and do research
Include marked assignments somehow
To have a mix of skills and assignments
Points for homework so I’m more motivated to do it
More hands-on and practicing circuits
Any other comments about the course or the teacher?
Keep on getting better, you are doing a service to your students by furthering education.
I really enjoyed the year. I just wish we had the skill program for the first year as well.
I like this semester better than last semester. Keep up the good work!
My students are awesome, and almost as invested in developing me as I am in developing them.
Students really get reassessment. Not a bad place to start when introducing the “sales pitch.”
They want more feedback, and they’re asking for it explicitly. This is fantastic. I require work samples as part of an application for reassessment now, so that should help. I’ll also be experimenting with BlueHarvest.
Reassessment changed the concept of “studying.” I think this is a good thing. I suspect that what they mean by “study” is “do a long series of identical problems until you’ve got the procedure memorized,” and I’m ok with letting go of that. At the same time, I need to spend more time helping them learn to test themselves, so that they’re not relying exclusively on my tests as a way to diagnose and learn.
It made them look hard at who they are, what they want, and why they do what they do. I need to be ready for that. Students probably could use some preparation for it too.
It exposed the squirming, seething reality of the differences between my expectations about teaching and their expectations about learning. Dan Goldner’s got a great idea about how to clarify what the teacher’s job is, and I’m going to try it.
But hands-down the most fascinating thing that happened this past semester was that my students begged for homework. Many interesting conversations ensued (post about this forthcoming). Removing points for homework may have been the single most useful thing I did all year. To be continued.
Some surprising conversations happened during the first week of the semester.

Why Don’t You Just…?
In AC Circuits, I gave the students a list of all the controls on their scope. Mission: find out what they do. They got them all except for the difference between AC and DC coupling (notoriously difficult for beginners to understand). By the end of the class, not only had most people applied the idea to a measurement, but several students proposed alternate ways to do it: “Couldn’t you just adjust the volts/div?” “Couldn’t you just adjust the vertical position?” Understand, too, that this was not coming from the advanced students who’ve had a scope in their basement for years — it was the students who until last week were treating their oscilloscope like an angry, injured wolverine. The following day, a student was demonstrating some oscilloscope skills with lots of confidence. I asked if she still saw the scope as a rabid animal. “No,” she replied, “it’s a fluffy little bunny. It’s only occasionally badly-behaved.”
They talked to each other. They strategized. They struggled. They noticed effects they’d never noticed before and gave them names (AC coupling is now called “the bounce effect”). In the following lab, the students invented the three diode approximations. I was just about to open up some questions about how to analyze diode circuits when a student cut me off to say “why don’t you just” use the knee voltage? And lo, the 2nd diode approximation was born again, for the first time.
Theory: the skills list seemed to help them cut loose and experiment. They spent all afternoon turning knobs just to find out what would happen.
Three students turned in homework (not required) — sometimes the same homework over and over. In each case, it helped me figure out what misconception was holding them back.
One of them redid his practice problem and showed me his new answer, which was correct. He asked if he had to pass it in. I said no. His answer: “Good, I have to go check my skills list and see what this proves.” My goodness, thinking about the meaning of practice problems?
Some students are working through the lab book during shop time. They keep a constant eye on the skills list to figure out what’s important. They could do this just as easily by reading the purpose — that thing printed in bold on the first page of the lab. Reality: they don’t read the “lab purpose.” They do read the skills list. Someday I’ll have a clear theory about why.
Most students are not working through the lab book. They’re picking out one skill at a time and trying to find a shorter, easier way to prove that they can do it. In the process, they are designing experiments. Sometimes they get to the end and realize that so many variables changed, they can’t demonstrate any one thing. They go back and do it again, with controlled variables. Seriously. Several times it has ended up being the same circuit as in the lab book. On two occasions it resulted in burnt resistors. Good conversations resulted.
They’re even tackling “quiz-type” skills in the shop. I give them no guidance on this. The upshot: they create test questions for themselves, either on paper or by finding ways to translate paper skills into hands-on circuit skills. Again, nothing was stopping them from doing that before. In fact, last semester I bribed and bartered and begged them to do just that. This semester I haven’t even had a chance to bring it up.
On Friday, a student asked for the next topic skills list. That’s never happened to me before. He proposed to integrate this unit’s skills with the next unit in order to get a 5. I accepted.
Gaming the System
The “game layer of education” mostly makes me feel like someone’s dragging my teeth across a chalkboard. But this is something I should have learned from casual gaming years ago: make the first few tasks very short. It sucks people in. Note to self: find 1-2 short, simple skills that most students can complete without any help. List them first on the skills sheet.
Our first quiz was this week. We reviewed it right away. Several students were surprised to find out that this “counts.” They asked me when the “real test” was. I asked them what they thought would be a better basis for their grades than their skills.
Here are the heckles from the peanut gallery inside my head. Sometimes the voices sound like the irascible, cranky teacher I’m destined to become; sometimes they sound like students.
Q. In this hare-brained scheme of yours, scores of 4 and 5 come from different problems. You’ve also decided to let scores go down, so you can measure retention. What happens when you give a test with a 4/4 question on it, and a student already has a 5? They get it perfect, and their score goes down?
A. Not-so-great idea: I could make sure all tests have both a 4 question and a 5 question for each skill (lots of work for me, longer tests for students, and it creates another quandary: what do I give the student who nails the 5/5 question and bombs the 4/4?).
Less bad idea: once you have 5/5, you don’t have to assess those skills anymore. Pro: motivates students not to let it sit at a 4. Con: they drop it from memory after getting a 5.
Better idea: Do as above but give a final exam worth 20%. If you bomb skill X on the final, it doesn’t change your “skill score” of 5/5. Decent compromise? Does it give the student the info they need for their next course? Does it give the next teacher the info they need? So far, I think so…
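For concreteness, here’s a minimal sketch of that “better idea.” The function name and the 80/20 weighting are my assumptions (the post only fixes the final exam at 20% and the rule that a locked-in skill score never drops):

```python
# Hypothetical sketch of the grading rule above. The 80/20 split and the
# function name are assumptions for illustration, not an official formula.

def course_grade(skill_scores, final_exam_percent):
    """Skill scores (each out of 5) make up 80% of the course grade;
    the final exam makes up the remaining 20%.

    A 5/5 skill is 'locked in': bombing that skill on the final lowers
    the exam portion of the grade, but never the skill score itself.
    """
    skill_avg = sum(skill_scores) / len(skill_scores)   # out of 5
    skill_portion = (skill_avg / 5) * 80                # out of 80
    exam_portion = (final_exam_percent / 100) * 20      # out of 20
    return skill_portion + exam_portion

# A student with all skills at 5/5 who bombs the final (50%)
# keeps the full skill portion and lands at 90%:
print(course_grade([5, 5, 5, 5], 50))  # 90.0
```

Under this rule, a bad final day can cost at most 20 points, which seems like the compromise the post is after: retention gets measured, but a demonstrated skill is never un-demonstrated.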
Q: You say you want them to do something to prepare for the assessment. Is it good enough to just tell you what they did?
A: No. My students have mad finesse for detecting unenforceable rules. Sometimes I think they flout those rules just to show their disdain for what they see as hypocrisy (even if I see it as “the honour system”). If I’m serious about making preparation necessary for assessment, I need to actually have proof. Should they just show it at the door? I couldn’t possibly even tell what it was. Should I require that they pass in “something”? That entails passing it in early enough for me to write some comments and get it back… not to mention constant vigilance that students who didn’t pass in any prep item don’t sit the test… and now it’s a way to get out of class, and out of tests. You can just retest later, right? And now the process is getting slower when I want it to get more agile.
*sigh* I think I’ve talked myself out of it. Best-case scenario: we have time in class for them to practise all the skills for the test, so I can give feedback that way. Chances of this happening: remote. I seriously need to look into video-recording some presentations and assigning those… Worst-case scenario: they don’t practise, they come for assessment, they do badly, feel stupid, and are too upset to come see me about it. All I can do is keep an eye out and approach them when I feel it’s needed.
Q: You ripped off the idea for those little squares on the skills sheet, and you’re going to have a bad day when someone’s score goes down after they’ve coloured it in with glitter pen!
A: Yeah. Ok, I’ll probably take the squares off. There’s been some interesting talk about the pros and cons of students tracking their score history. In earlier grades I think it’s probably just depressing to see that you’ve attempted something 10 times without progress. But by the time they’re in vocational training, I think it’s time to a) be able to handle that truth about yourself and b) use the results to troubleshoot. (No progress over 3 assessments? How do you prepare, and how could you prepare differently?) It can lead into a conversation about learning styles and their ability to tailor their practice.
So out go the little squares, and instead I’ll have them record prep methods and assessment results in their work-record books. If they’re not up to date, that’s the first thing that we’ll do in a tutoring or reassessment session.
Q. You really covet that feedback sheet from MeTA Musings, don’t you?
A. Yep. So I made one.
Q. You stole the words across the top.
A. It’s true! I confess! I stole everything! *sob*
Um. But I can’t remember from whom. I think it was Sue Van Hattum in a comment she wrote on someone else’s blog. If anyone knows, fill me in, ok?