I've had a few interesting conversations regarding skills students need in order to be successful but that don't actually make it into the gradebook. You know, the homework-organization-study-habits-note-taking type skills, also known as "soft skills." I noticed that Shawn Cornally has "Investigation Standards" he uses in his class, which makes me wonder whether there's a third set of skills that aren't necessarily content driven but aren't "soft" either. I suppose one could argue that investigation standards are appropriate to a science class since they are oftentimes embedded into the content skills. But are there skills embedded into a math classroom that may or may not actually figure into the grade of a student whose teacher uses SBG?
Where do things like applying problem-solving strategies, using multiple representations (the Rule of 4), or showing multiple ways to work a problem fit into the gradebook? These are the golden threads that run through all of our math courses, so why wouldn't we measure them?
Do they deserve their own place in the gradebook? If so, how many of them are there and what are they?
Thursday, April 15, 2010
The provincial math curriculum in B.C. includes a set of seven 'mathematical processes' that are meant to be included throughout the content lessons:
- problem solving
- connections (i.e., making connections between content areas, and to the world outside of the classroom)
- mental math and estimation
In some ways these are great; in other ways they're wasted breath, precisely because there's no expectation that they will be directly assessed. There's no accountability, for either the student or the teacher, for achieving mastery in any of these processes.
Also, they're often short-changed by the very people who are writing these guidelines. I heard a presentation where estimation and mental math were reduced to using estimation skills to check your work on a written problem. That's one *small* example of the kind of real estimation and numeracy we should be teaching, but it does nothing to build numeracy and estimation skills for problems where the numbers aren't already written down in front of you (i.e., the ones where estimation matters the most!).
There's also the huge skill of "bookkeeping" (not losing negative signs from one line to the next), "basic arithmetic," or "using your calculator," etc.
When it comes to my algebra 2 kids, I really do think they understand the concepts behind adding rational functions, but they all miss things by being careless with their work. You always hope that by the time they're juniors or so they are careful about that, but--heck, even I might miss a question or two on an algebra 2 test for one of those reasons.
So, when I test for the concept of "adding rational functions" and a student just leaves off a term or writes 2x * 4x = 6x but seems to have the rest of the concept down, should I really mark them lower on their "adding rational functions" concept grade?
If/when I get around to actually doing SBG in my classroom, I think I'll have to have a category for "MISC" stuff like that. On each quiz you start out with 5 of 5 points in that category, and "little" errors like that bring down that score. Is that how it's being done by other SBG teachers?
Do you think that the teacher should assess those standards, then? Would it add credibility? I think sometimes we bury things in the rubric and students don't ever really know how to fix it.
If I am assessing a particular skill and a student shows they understand the concept and the process but their arithmetic lets them down (i.e., "bookkeeping"), I don't see a problem with saying they're proficient (4/5). I think you may find that the actual implementation of SBG is pretty varied.
Shawn and I are working with some other teachers in our building in sort of a grassroots movement towards a systemic standards-based grading change. We've got a half dozen teachers plus the two of us, which represents close to 10% of our staff--not bad, eh?
David's question about adding "soft skills" to the list of reported standards has come up several times. Initially, I was on the "let's only report out content-specific standards" boat, but I'm starting to lean a different direction. For example, the P.E. teacher might report out a student's responsibility via the number of times he/she dresses out. Unless there's a separate citizenship grade (which we don't have currently), it seems worthwhile to report out responsibility, character, etc. in some sort of fashion.
I know this doesn't *directly* hit on what you wrote about, David, but still seems like it could be part of the discussion. It's a heck of a lot better than hiding this stuff in the midst of the grade in the traditional system - grading pollution at its best.
We've considered something similar at our school. Our parents have access to look at the online gradebook for their kids, so they know what they got on each assignment, etc. So, we considered having places in the gradebook for some of the "softer" skills and scoring them on like a 1-5 scale, but when it comes time to calculate averages for the grading periods, we count that section as 0% (like you would count homework 10%, quizzes 30%, etc.). So, parents and others get feedback whether they're doing the things they're meant to, but it doesn't actually affect their grade in the class. Then, depending on how devious you are, you can tell the kids about it not counting or not. In the end, they see the score for it and even if it doesn't really count, many try to get the 5's anyways.
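To make the zero-weight idea concrete, here's a minimal sketch of that kind of weighted-category gradebook calculation. The category names, weights, and scores are purely illustrative, not taken from any actual gradebook software:

```python
# Sketch of a weighted-category grade calculation where a "soft skills"
# category is recorded and reported but weighted at 0%, so it is visible
# to students and parents without ever moving the course grade.
# All names, weights, and scores below are hypothetical.

def weighted_grade(scores, weights):
    """Average each category's scores (percentages), then combine the
    category averages using the given weights (which should sum to 1.0)."""
    total = 0.0
    for category, weight in weights.items():
        category_scores = scores[category]
        average = sum(category_scores) / len(category_scores)
        total += weight * average
    return total

weights = {"homework": 0.10, "quizzes": 0.30, "tests": 0.60,
           "soft_skills": 0.00}   # reported, but counts for nothing

scores = {"homework": [90, 100], "quizzes": [80, 70],
          "tests": [85], "soft_skills": [40, 60]}  # low, yet harmless

# 0.10*95 + 0.30*75 + 0.60*85 + 0.00*50 = 9.5 + 22.5 + 51.0 = 83.0
print(weighted_grade(scores, weights))  # → 83.0
```

Because the soft-skills weight is zero, even a row of 1s in that column leaves the final grade unchanged, which is exactly the "feedback without grade impact" behavior described above.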
I like that idea... weighting it at zero. I'll throw that around and see what we can work out. Thanks!
I like Dave's comment about reporting "soft" skills so students and parents know how they are performing. The idea of weighting them as 0% is what I am doing for homework this term. I take homework, provide feedback to the students, and report it in my online gradebook so parents see how their children are faring on the homework.
I have even gone as far as to not call it "homework" any more. I call it "practice". I have had the discussion with my students that math is like a sport. The more you practice a free throw or a slap shot, the better you get at it. The reason we do "practice" in math is that we are trying to get better at those math skills. I call my students "Mathletes".
I have had the discussion with my students that where they "get the marks" for the practice is on the summative assessment at the end of the unit. That end-of-unit exam is their demonstration of what they know after having had a considerable amount of time to learn, practice, and get better at those math skills.
I posted on that a couple months ago too, here. Quick sum: use them if you want, but keep them separate. You've got to put as much effort into clearly defining them as you do with your academic standards. Make sure you actually directly TEACH them. It's not sufficient to assess it and assume they'll pick it up along the way.
Sometimes, I think you can specifically embed that into your academic standards: "Solve X kind of problem using Y method and Z method." I think you use a five-point scale, so perhaps a 4 would be "Solve X kind of problem" and the 5 would be "Solve X kind of problem using three different methods."
I think that would be equivalent to how in science I ask them to, say, calculate speed on paper, but also require them to calculate the speed of something in real life.
This is just a long way of saying that if you want to teach it, go for it. I'm going to add the caveat that part of the joy of SBG is that an A in Mr. Cox's class will represent the same amount of learning as an A in Mr. Buell's class. You should definitely discuss this with your department when you get to that point.
Not sure why I didn't think of it before, but I actually did something similar to what it seems like you're asking about. Two years ago I tried including a standard called Science Literacy. For mastery they had to be able to read a science newspaper article and determine the claim and supporting evidence. They also needed to evaluate the validity of the source and the type of evidence that was given.
I actually abandoned it midyear.
The problems: I underestimated the class time it'd take for me to directly teach most of what I wanted. When I got behind, I also realized that while I valued those abilities, I didn't actually make them a part of my standard curriculum. I never asked them to evaluate the evidence for the existence of the atom (sadly), and the standalone aspect of it made it seem too separate from my standard curriculum.
That's definitely one of the reasons I'm such an advocate of "If you're going to assess it, you better teach it." It made me realize that although I value certain things, it doesn't necessarily show up beyond lip service.
I like Calculus Dave's idea about weighting as 0% also. I used to track some behavioral expectations, like turning in homework on time, getting to class on time, making sure there were no candy wrappers in your desk, etc., and mail merge these "scores" onto individual progress reports every few weeks. It often helped students realize the cumulative effect of their actions (and that I was watching) and was a great way to reinforce whatever behavioral expectation I was stuck on at the moment. I imagine you could do this with skills as well.
I like Dave's idea as well. I'm not sure how to measure the behavioral things, though.
My main concern is regarding the separation of skill acquisition (what and how) from skill application (when and why). I'm trying to flesh out what exactly I want to emphasize with my students and then measure it.