Wednesday, April 21, 2010

Taxonomy

In the age of accountability, we have become increasingly skill-based. That's not necessarily a bad thing. But I've noticed that with each benchmark, common formative assessment or state test we give, teachers have the tendency to focus on the product and not on the process (i.e., let's give them the skills and pass 'em down the hall). I encourage inquiry with my students. I allow them to fall down the rabbit holes we encounter. In fact, I've been known to scrap the lesson for the benefit of a good question a student has asked. Most of us do. But I've also noticed that even my grades have become increasingly skill-based, and I've been wondering how to assess the problem solving stuff.

A colleague clarified some of this for me the other day at lunch. We were discussing the idea that once a student has been shown how to work a problem, it becomes an exercise and no longer carries the attributes that allow a student to be a problem solver. He shared that he noticed how his students would ask for an example every time he tried to get them to do some problem solving, and how that just didn't sit right with him. It doesn't sit right because the student is trying to take a problem and apply the teacher's algorithm to it. If the example is close enough, he can plug and chug, get an answer and we'll be none the wiser.
If we can get our kids to be proficient with the skills, show the ability to put them together in a problem solving situation and occasionally surprise the pants off of us with one of those "wow-look-what-you-did" moments, I'd say we'd call it a good year. This continuum can be tricky, though. The more time a student spends wrestling with the skills, the more time we need to be there for support. As a student becomes proficient with the skills, we can let go a little more and become a facilitator. I think that many teachers get frustrated because they hear of these great things that are going on in the classrooms of others and wonder why it won't work in their room. Well, maybe it will. But maybe it's because we've put the kid on a mountain bike before he can ride the tricycle. You can't expect a kid who is buried in skill acquisition to blow the doors off you with his inquiry methods. That isn't to say the student doesn't have the ability to inquire; it may just need some dusting off.

At the end of the year, I'd like my students to be able to duplicate skills, apply skills and create something using the skills.


Duplication aka: Skills

Description: Here's the tool. Here's how you use the tool. Show me you know how to use the tool. Do it again, I don't believe you. One more time, just to be sure. Alright, you are now allowed to use this tool whenever you see fit.

Essential question: Can you use the tool?

How: How these skills are taught may vary. Direct instruction, investigations, the Socratic method...I don't care. At the end of the day, the student is acquiring a skill. The more of the lifting they can do themselves while acquiring these skills, the better, because they are preparing themselves for the next steps of application and creation. Asking questions is going to be the teacher's best friend.

Assessment: Skills tests in the form of multiple choice or free response. Students are free to reassess at any time during the year as their understanding of the skill changes. This is why I don't have a problem with posting study guides and online examples. There are no surprises on my skills tests. Each different skill corresponds to a different assignment in the gradebook. I'm not really too concerned with how they arrive at the answer as the focus is on the product.

We don't give finals in our middle school, but next year I am going to give a summative test every 6 weeks or so that covers all skills to date. For example, the first assessment will cover skills 1-8, the next will cover skills 1-19, and we will end with a cumulative skills test covering skills 1-39.

Application aka: Problem Solving

Description: I want you now to build something. Choose which tools you will use. Make sure your final product looks like *this*.

Essential question: Can you put more than one tool together to do something you haven't done before?

How: Give students a problem that uses skills they already have and walk away--sometimes literally. In this process, I think the students should generate more of the questions, although prodding them along with a question now and then isn't a bad thing. Be careful, though: I have found that sometimes I have to take the physical posture of an audience member, actually sitting down and looking at the floor, in order to cease being the primary resource.

Assessment: This can be in the form of a teacher-created project, short assessments requiring students to demonstrate their thinking, or even an observation during class. I've thought about including higher-level problems on my skills tests and having students who successfully solve them earn a 5, but then I run into the problems of reassessment and context. The skills test will often give a context to the problem that we may not want the student to have. Reassessment here will be different because it isn't about the particular problem but the process by which the student attacks it. I'd argue that the particular skills involved are irrelevant, as long as the students being assessed possess the skills necessary to solve the problem. If this is a written test, it should contain problems the students have never seen before. So I'm thinking about a "Problem Solving" weighted category, or maybe a weighted standard that is dynamic the same way the other standards are.

Creation aka: Projects

Description: Now, what would you like to build? Design it. Plan it. Build it. Reflect on what you built. Did you choose the right tools? What would you do differently next time?

Essential question: Do you understand your set of tools well enough to recognize what types of things you can and can't do?

How: Student generated projects. My 7th graders are working on a relations project where they choose two variables to compare, gather data and investigate the relationship. Final product will include multiple representations of their data and a presentation using the medium of their choice. (I'll blog about it after we finish state testing.) I really like what Shawn Cornally is doing with his physics kids. If anyone can help me figure out how to do this with some precocious middle school math students, write a book and I'll buy it.

Assessment: The project is the assessment.

Creation > Application > Duplication

Success in any of the higher levels validates the lower levels. For example: If a student can demonstrate problem solving ability by writing an equation and solving it, then not only does this affect the problem solving score, but it validates the equation solving skill. If a student can create her own project, then I'd say problem solving looks pretty strong as do any skills that were present in the project.
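To make that rule concrete, here's a rough sketch in Python. The skill names and the 1-5 scale with 4 as "proficient" are my own stand-ins, not anything official; the point is just that success at a higher level flows down and validates the skills underneath it.

```python
# Hypothetical sketch of the validation rule: skill scores run 1-5,
# and using a skill correctly in a problem-solving situation bumps
# that skill's score up to at least "proficient" (a 4 here).
PROFICIENT = 4

def validate_skills(skill_scores, skills_used_correctly):
    """Raise each correctly used skill to at least the proficient level."""
    for skill in skills_used_correctly:
        if skill in skill_scores:
            skill_scores[skill] = max(skill_scores[skill], PROFICIENT)
    return skill_scores

# A student writes an equation and solves it correctly on a problem-solving task:
scores = {"write_equation": 2, "solve_equation": 3, "graph_line": 2}
validate_skills(scores, ["write_equation", "solve_equation"])
# scores is now {"write_equation": 4, "solve_equation": 4, "graph_line": 2}
```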

The Book

The worst part of all this is that we gotta give it a grade. How do we arrive at a final mark? This year, I left my grades uncalculated until the marking period opened, and once grades were submitted, they went back to being uncalculated. I wanted parents and students to focus on the score for each skill and not on the "averages." To arrive at a final mark for the students, we look at our rubric the same way one would view a grade point average.

4.5-5.0 = A
4.0-4.4 = B
3.0-3.9 = C
2.0-2.9 = D
< 2.0 = F

This was simple because there were no weighted categories. But now that I want to focus on problem solving and projects and include summative tests, I need to figure out how this should look in the gradebook.
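For what it's worth, the simple unweighted version is easy to picture as a quick Python sketch using the cutoffs above (the skill names are invented); the part I still need to figure out is how a weighted problem solving category or the summative tests would change this calculation.

```python
def letter_grade(skill_scores):
    """Average the 1-5 skill scores and convert with the rubric cutoffs above.
    Unweighted: every skill counts the same, which is the simple case."""
    avg = sum(skill_scores.values()) / len(skill_scores)
    if avg >= 4.5:
        return "A"
    if avg >= 4.0:
        return "B"
    if avg >= 3.0:
        return "C"
    if avg >= 2.0:
        return "D"
    return "F"

print(letter_grade({"write_equation": 5, "solve_equation": 4, "graph_line": 4}))  # 4.33 -> B
```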

Questions

  • Where does problem solving show up in your gradebook?

  • Should the scores on the summative tests have their own category in the gradebook, or should each skill be treated like its own reassessment?

  • Have I lost my mind?


      14 comments:

      keninwa said...

      I'll take a second just to answer your last question - NO! This is one of the most cogent blog entries I've read in a while. You definitely have the wheels in my head turning...

      David Cox said...

      Never been called cogent before. I'm glad to contribute to the turning of the wheels. If you come up with some push back, hit me up.

      Sarah said...

      I've been struggling with this for quite a long time. How are we deliberately teaching and assessing the process standards? I need to process for a bit and respond more in depth. I am so glad to begin this important conversation.

      Anonymous said...

      I'm in Georgia, YMMV.

      In addition to content standards (student will be able to solve a quadratic equation of the form x^2 + bx + c = 0), we also have process standards (student will represent math in different ways).

      My current push is toward standards-based grading at the exclusion of all other grades. Mix that with this post, and I think I've found a new grade in my book: process standards grades.

      Can you do something similar?

      Sarah said...

      I think you are on the right track with developing a rubric to assess problem solving. It would have to be focused on the process of problem solving and not on the content of the problem. That way your feedback is specific to their growth in problem solving. The next big question is what the rubric would look like. What do good problem solvers do? I completely agree that the problems you use for this must be non-routine, and I would add that they have to have multiple entry points or strategies. I wonder what problems we are already using and how we can tweak them to work.

      I need to think a bit more about the create part...

      Malyn said...

      A well-written post in terms of content and structure.

      This reminded me of an essay I wrote about literacy and its implication in the teaching (and learning) of mathematics. If you do choose to follow this link, be warned that this was a uni (College) essay and so employed academic-speak. Still, it supports your thoughts on the importance of Application and Creation in your Taxonomy.

      It also explores the notion of multiliteracies as it applies in maths in the sense that students can have different functional literacy in different areas of maths, e.g. someone could be good (or at least functional) in Geometry, esp. with technology, but not necessarily so for Stats and Probability.

      Sadly, when we teach maths we often teach what has been abstracted from generations of mathematical knowledge. Giving students the opportunity to apply these skills certainly helps. But letting them abstract from their own experience and apply it to a new or different context provides the deepest learning and a better chance of retention.

      I'm keen to see how your project goes.

      Joan said...

      Gosh, you ask good questions.

      I don't have an answer to it, but a thought.

      How do literacy teachers grade reading comprehension?

      I, too, did some papers in college on literacy, but mine were about metacognition - the ability to know what you know. It turns out metacognition is an important skill for both reading and problem solving in math. So maybe literacy teachers could tell us how they grade/assess metacognition in reading, and there would be some ideas for how to assess problem solving?

      As I said, a thought.

      Ms. A

      David Cox said...

      kalamitykat,
      I think that's exactly what I'd like to do.

      Sarah
      I'll post a suggested rubric soon and would definitely expect push back on that as well. What do you think it should look like?

      Malyn
      Thanks for the link, I'll check it out. I find that the more I encourage good habits of mind, the easier it is for students when we encounter the pure math that may not be so interesting to them.

      Ms. Axthelm
      I'm full of questions but unfortunately not many answers. I think comprehension may be a great example. After all, once a student has been assessed on their comprehension of a particular piece of literature, that passage no longer carries the same characteristics. I see problem solving the same way.

      Sarah said...

      I can't wait to see what you come up with for a rubric. Most of the ones I've seen end up being too specific to the problem/situation they are used for.

      What do good problem solvers do?

      Make meaning of the problem situation (draw picture, make table, act it out, etc)

      Can generalize their meaning of the problem to develop a strategy

      Are flexible with the strategy they choose (can modify or change strategy)

      Have stamina (not looking for the quick trick to solve)

      I'm sure I am leaving something important off of this list. I am aware I left off the idea of communication. It could also be included in a rubric. Could each of these areas be a category on the rubric?

      grace said...

      In my battle against word-problem-anxiety, I taught students to painstakingly annotate every problem they encountered: more than circling numbers and underlining key words, they were expected to brainstorm types of problems that could be represented, algorithms they knew for solving those problems, criteria for determining what type of problem it was, and often, multiple methods of solving. I would model this process orally and in writing, and then expect them to help me fill in the blanks.

      For example, on encountering a word problem about finding a missing angle, students might first draw the picture, realize they were working with a right triangle, and think that they could use similarity, triangle sum theorem, special right triangles, or trig. They'd note which was most appropriate and how they knew it, and then solve. Then they'd use another method to check.

      It was tedious, so I'd give very few of these problems per assignment/assessment, and give credit for good annotation and for the correct answer. As time went on and most students internalized the thought process, I started paying more attention to the answer and less to the work, but struggling students still benefited from making their thinking explicit. It also made it much easier for them to realize when and where they were getting stuck, and built their confidence in their ability to independently solve challenging problems.

      David Cox said...

      Grace
      I've been playing with a rubric that rewards students for problem attack skills, use of the rule of 4, identification and use of skills, and the ability to generalize. I'm not sure whether the generalization and the rule of 4 are redundant, or whether they necessarily apply to all problem solving situations.

      I'm thinking that if a student can identify the skills they have used in a problem and/or project and use them correctly, they should be considered proficient in those skills. So if a student has a 1-4 on a skill, then their use of the skill in a problem solving situation validates that skill.

      I think that you're right in that we definitely need to explicitly teach these things. If we think it's important for them to know/do, then they need to be aware of it.

      I also think that this is why I am leaning toward giving two types of assessments: skill-based and problem solving/application. The problem solving assessments can be one question, but the student needs to be very thorough.

      Anonymous said...

      New at this, but let me know what you think:

      Anonymous said...

      Oops...the link didn't show...just click my username instead.

      Unknown said...

      This is in many ways a mirror of the conversation we are currently having. I'll think about this and see what answers or questions it leads to. After four years, have you been able to answer your questions any further?